Chairman, Quality Online Research Pty Ltd / CEO, Australia Online Research / Chairman, Research Panel.
Brian Fine is the founder and past Chairman of amrinteractive, The ORU, STW Insights, Harris Interactive Australia, J.D. Power Australia/NZ and Jupiter Media Metrix Australia.
He was responsible for managing the firm’s day-to-day business activities, including developing and managing its growing array of studies and reports, as well as providing strategic input on studies in the areas of internet research, advertising effectiveness measurement, brand equity and customer satisfaction research.
Brian is a Fellow of the Australian Market Research Society and is QPMR accredited, an Honorary Life Fellow of the International Marketing Institute of Australia, and a full member of ESOMAR, having served as the Australian representative of ESOMAR for seven years. He was National Chairman of the Market Research Society of Australia for four years and President of AMSRO, and currently sits on AMSRO's committee responsible for Quality Standards.
Brian has been an Adjunct Professor in the Business School of the University of Technology Sydney since 2006 and previously served for over a decade on its advisory panel for the Bachelor of Business degree.
Brian was also jointly responsible for launching the Yankelovich Social Values tracking system in Australia, now known as Australia Scan, as well as partnering with the Reputation Institute to bring corporate reputation tracking to Australia, now branded as Reptrak. In partnership with Media Metrix, he set up one of the first online audience measurement systems in Australia.
As the Australian partner of Harris Interactive, Brian has been building and managing online panels since 1999. In Australia he has tested techniques such as propensity weighting to migrate research studies from CATI to online, and quota controlling and weighting with geoTribes to align online surveys to the Census. He has also worked extensively on calibration to align online surveys with other modes of data collection and with hybrid surveys.
Technology is advancing at an alarming pace, and the need for speed, depth, reduced cost and self-learning abounds. MR is taking longer and longer compared with clients' internal analytics teams. It is not uncommon for larger clients to have over 100 data scientists crunching away on databases and modelling client interactions in Australia alone. Alongside these human analysts is a growing number of AI algorithms that dramatically speed up analytics. There is still some way to go in closing the gap between what MR offers and what transaction analytics offers, but that gap is closing fast!
An effective analytics team today can tell us, instantly and on the fly, which segment of a digital campaign bought at a particular price, along with the demographics, quantity, payment method, next best action and repeat-purchase propensity. No groups, no research, just facts, hard and fast! This alone should be a worrying issue for MR. We have seen technology hit MR's shores in the form of online data collection and advanced analytics such as machine learning. AI, which is essentially machines doing things one would consider smart, now takes centre stage, and as an industry we need to be up to date. This paper uncovers four critical areas of big data and AI that the industry urgently needs to address.
Many online data suppliers, though, do not fully understand data quality. Factors that affect it, such as memory decay, respondents' inability to articulate, sensitivity bias and fraud, go largely undetected. In addition, economic constraints often do not allow for the deletion of sample based on problematic observations, evidenced in response sequencing, for example.
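As a minimal sketch of what a response-sequencing check might look like, the snippet below flags two common problem patterns in online panel data: "straight-liners" who give identical answers across an entire rating grid, and "speeders" who complete the survey implausibly fast. The data layout, column names and thresholds here are illustrative assumptions, not any supplier's actual quality rules.

```python
# Hypothetical quick check for problematic response sequencing in an
# online survey. Flags "straight-liners" (no variation across a rating
# grid) and "speeders" (completion time below a cut-off). The 120-second
# threshold is an illustrative assumption, not an industry standard.

def flag_suspect_respondents(responses, min_seconds=120):
    """responses: list of dicts with 'id', 'grid' (list of grid ratings)
    and 'duration' (completion time in seconds).
    Returns the ids flagged for manual review or possible deletion."""
    flagged = []
    for r in responses:
        straight_lined = len(set(r["grid"])) == 1  # every grid answer identical
        speeding = r["duration"] < min_seconds     # finished too quickly
        if straight_lined or speeding:
            flagged.append(r["id"])
    return flagged

sample = [
    {"id": "r1", "grid": [4, 4, 4, 4, 4], "duration": 95},   # straight-liner and speeder
    {"id": "r2", "grid": [3, 5, 2, 4, 1], "duration": 480},  # looks fine
    {"id": "r3", "grid": [2, 2, 3, 4, 2], "duration": 60},   # speeder
]
print(flag_suspect_respondents(sample))  # ['r1', 'r3']
```

In practice a check like this would sit alongside other indicators (trap questions, duplicate detection, digital fingerprinting) rather than drive deletion on its own.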
This paper presents a solution to the ever-growing problem of data quality and discusses the following:
1. First, we discuss why data quality is of paramount importance and a key responsibility of the market researcher;
2. Second, we highlight key factors and common misconceptions that may affect data quality;
3. Third, using case studies, we illustrate some quick checks for detecting anomalous response patterns that denote poor-quality responses; and
4. Finally, we provide approaches for addressing quality concerns, together with the necessary statistical tools and software packages.