
ESOMAR 37

ESOMAR is the global voice of the data, research and insights community – a truly international association that provides ethical and professional guidance. Its document, 37 Questions to Help Buyers of Online Samples, asks sample providers to disclose their level of consistency, reliability, and commitment to transparency when it comes to data quality.

Company Profile

1

What experience does your company have in providing online samples for market research? How long have you been providing this service? Do you also provide similar services for other uses such as direct marketing? If so, what proportion of your work is for market research?

DatAction is an online panel research provider offering comprehensive market research services across the EU and Central Asia. All our panels are meticulously managed and hosted on our proprietary platform, ensuring complete control and uncompromised reliability.

Our advanced recruitment methods, detailed profiling, and robust sample management technologies provide access to over 1 million respondents from more than 10 countries, including hard-to-reach regions like Armenia, Azerbaijan, Moldova, and Uzbekistan.

Since 2013, DatAction has specialized exclusively in online panel research. Over time, we have completed more than 8,000 projects and achieved over 10 million completed questionnaires.

Our clients include many leading international market research companies, media agencies, and well-known brands.

2

Do you have staff with responsibility for developing and monitoring the performance of the sampling algorithms and related automated functions who also have knowledge in this area? What sort of training in sampling techniques do you provide for your frontline staff?

Yes, our Panels Management team continuously monitors the performance of the company’s automated sampling algorithms. These algorithms are developed using best practices in sampling theory to minimize both source bias and panel usage risks. Members of this team work hand in hand with the Operations team to gain valuable insights and understand sampling needs.

3

What other services do you offer? Do you cover sample-only, or do you offer a broad range of data collection and analysis services?

DatAction offers sample-only services exclusively. We pay close attention to providing the best coverage and representativeness for our panels and to being 'a local trusted pal' to our panelists. All our panels are localized to reflect the specifics of each country: design, language, local incentive withdrawal options, and localized profiling. All of this allows us to be a trusted partner.

Sample Sources & Recruitment

4

Using the broad classifications above, from what sources of online sample do you derive participants?

DatAction has its own proprietary panel community. These carefully developed and maintained panels are the cornerstone of our data collection capabilities. We recruit panelists through various methods, including online media campaigns and social media.

We also have contracts with other local and global suppliers. If our own panel resources are insufficient due to specific requirements or project design, we can utilize our partners' panels. Any such additions are agreed upon with clients in advance.

5

Which of these sources are proprietary or exclusive and what is the percent share of each in the total sample provided to a buyer?

DatAction's proprietary panels serve as our primary source of high-quality data, ensuring consistency and reliability for our clients. Approximately 93% of our cases are conducted using these panels, while the remaining 7% utilize our partners' panels. We apply a rigorous selection process to ensure that the quality of our partners' data meets our standards and provides accurate insights.

6

What recruitment channels are you using for each of the sources you have described? Is the recruitment process 'open to all' or by invitation only? Are you using probabilistic methods? Are you using affiliate networks and referral programs and in what proportions? How does your use of these channels vary by geography?

We strive to utilize a wide range of recruitment sources to ensure the best audience representation. These sources include social networks (both local and country-specific), contextual and search advertising, postings in local communities and media, email campaigns, advertising in messaging apps, digital and mobile advertising, and all other available channels. While our recruitment methods are generally consistent across all geographic markets, the proportions may vary based on specific market characteristics, aiming to cover both local and global media.

Our panels are non-probabilistic and welcome participants from all socio-demographic backgrounds in line with ESOMAR guidelines and national laws. We do not reject registrations based on specific demographic criteria during the registration process.

Additionally, we run a referral program that allows our existing panelists to invite their friends to join our online panels. Users acquired through referrals constitute about 15% of our panelists, varying by country.

7

What form of validation do you use in recruitment to ensure that participants are real, unique, and are who they say they are? Describe this both in terms of the practical steps you take within your own organization and the technologies you are using. Please try to be as specific and quantify as much as you can.

We have four major groups of sample quality checks and balances:

  1. Recruiting/registration check. When someone registers, the panel management system checks whether the registration email is already assigned to an existing account and notifies the applicant if it is. It then checks whether the new email's domain appears on a blacklist of 'disposable' or deceitful domains (services that provide an email address for only minutes or hours, solely for registration purposes); we do not allow registration with temporary emails. To proceed, a confirmation email is sent for double opt-in. Once the confirmation is accepted, the system uses third-party software to create a digital fingerprint: a set of data that identifies the browser setup and device, which is compared against those of all existing panelists. If a similar fingerprint is found, the system flags the new account as 'do not contact' and excludes it from all statistics and future invitation sendouts.

  2. Pre-survey check. DatAction maintains a panelist 'trust rating' across all online panels. Every new panelist starts with a trust rating of 100 points, as we believe all of them to be honest and considerate respondents. If a panelist is disqualified from a survey, their rating is decreased by 10-30 points depending on the disqualification reason (fraudulent answers to open-ended questions, speeding, straight-lining, etc.). An out-of-range rating results in the participant being blocked from our sample.

  3. During-survey check. We use cookies and captcha verification systems to prevent fraudulent attempts.

  4. Post-survey check. DatAction always asks clients to provide the list of accepted and fraudulent IDs, with reasons for deletion. The system sends informational emails to removed respondents explaining the reason for incentive removal and the survey completion guidelines.
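The trust-rating mechanics in the pre-survey check can be sketched as follows. This is a minimal illustration; the per-reason deduction values and the blocking threshold are assumptions chosen from the 10-30 point range above, not DatAction's actual parameters:

```python
# Minimal sketch of a panelist trust-rating check.
# Deduction values and the blocking threshold are illustrative assumptions.

DEDUCTIONS = {
    "fraudulent_open_end": 30,  # fraudulent answers in open-ended questions
    "speeding": 20,
    "straight_lining": 10,
}
BLOCK_THRESHOLD = 50  # assumed cut-off below which a panelist is blocked


class Panelist:
    def __init__(self, panelist_id):
        self.id = panelist_id
        self.rating = 100  # every new panelist starts with full trust
        self.blocked = False

    def record_disqualification(self, reason):
        """Deduct points for a disqualification and block if the rating drops too low."""
        self.rating -= DEDUCTIONS.get(reason, 10)
        if self.rating < BLOCK_THRESHOLD:
            self.blocked = True
```

In production, such a rating would typically also recover with good behavior and feed into invitation prioritization.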

8

What brand (domain) and/or app are you using with proprietary sources? Summarize, by source, the proportion of sample accessing surveys by mobile app, email or other specified means.

About 90% of our panelists access surveys through our web portal and receive invitations by email. These channels play a vital role in engaging panelists. The remaining 10% of our panelists choose to receive survey invitations through widely used messaging apps. The proportion varies by panelist demographics, country, and project, but overall we see 40% of traffic coming from email notifications, 50% from our website, and 10% from messaging apps. We use fully localized domains for online panels in each country; if you are interested in receiving a list of our panel domains, please send an inquiry to rfq@dataction.eu

9

Which model(s) do you offer to deliver sample? Managed service, self-serve, or API integration?

At DatAction, we provide a Managed service, where our project manager executes and manages the project. We also offer API integration for those who wish to have a direct line to our deeply profiled online panels. We continuously explore new opportunities in this area, remaining open to integrating with other platforms.

10

If offering intercepts, or providing access to more than one source, what level of transparency do you offer over the composition of your sample (sample sources, sample providers included in the blend). Do you let buyers control which sources of sample to include in their projects, and if so, how? Do you have any integration mechanisms with third-party sources offered?

Ninety-three percent of all the research we conduct is based on our proprietary panels. Depending on the study design and project specifics, we may supplement with data from our partners. All additional sources are carefully preselected to ensure data quality, and their usage is pre-agreed with the client.

We have a system in place to allocate quotas per partner and control sample distribution. To avoid respondent duplication, we use cookies and unique links.

11

Of the sample sources you have available, how would you describe the suitability of each for different research applications? For example, is there sample suitable for product testing or other recruit/recall situations where the buyer may need to go back again to the same sample? Is the sample suitable for shorter or longer questionnaires? For mobile-only or desktop only questionnaires? Is it suitable to recruit for communities? For online focus groups?

We supply sources for online questionnaires only. Samples can be provided for shorter or longer questionnaires, for mobile or desktop only. We can perform longitudinal studies and provide access to the same respondents or exclude them for consecutive waves of the research. We do not conduct online groups or communities.

Sampling & Project Management

12

Briefly describe your overall process from invitation to survey completion. What steps do you take to achieve a sample that “looks like” the target population? What demographic quota controls, if any, do you recommend?

Once a client commissions a sampling project with us, our first step is to identify the specific target population they wish to survey. We carefully match panelists to the client's target criteria and distribute them according to pre-set quotas to ensure balanced representation of all segments within the target group.

After selecting the appropriate panelists, we send them survey invitations via email and social media notifications. We strategically time and size these invitations, taking into account factors such as response and drop-out rates, to ensure we meet the required number of completed survey responses within the allotted fieldwork period.
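The invitation sizing described above reduces to a simple calculation. A hedged sketch (the rate names and sample values are assumptions for illustration, not DatAction's production model):

```python
import math

def invitations_needed(target_completes, response_rate, incidence_rate, dropout_rate):
    """Estimate how many invitations to send to hit a target number of completes.

    response_rate:  share of invitees expected to start the survey
    incidence_rate: share of starters expected to qualify (screener + quotas)
    dropout_rate:   share of qualified starters expected to abandon mid-survey
    All three rates are illustrative assumptions.
    """
    completes_per_invite = response_rate * incidence_rate * (1 - dropout_rate)
    return math.ceil(target_completes / completes_per_invite)

# e.g. 400 completes at 20% response, 50% incidence, 10% drop-out
invitations_needed(400, 0.20, 0.50, 0.10)  # -> 4445 invitations
```

In practice, the rates would be taken from historical panel statistics for a comparable audience and survey length.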

13

What profiling information do you hold on at least 80% of your panel members plus any intercepts known to you through prior contact? How does this differ by the sources you offer? How often is each of those data points updated? Can you supply these data points as appends to the data set? Do you collect this profiling information directly or is it supplied by a third party?

We collect and store detailed information about our panelists, covering more than 200 parameters to achieve precise targeting. These parameters include:

  • Demographics: Region, gender, age, marital status, household details, income and family finances, education, profession, and occupation.

  • Consumption: Household items owned, shopping behavior, and categories of goods and services consumed.

  • Travel and Leisure: Traveling habits and leisure activities.

  • Health: Health-related details.

Detailed profiling allows us to effectively match panelists with research projects, enhancing the quality and relevance of the data collected. Some parameters are localized based on the country of residence and its specifics, such as broadband and mobile providers, banks, and income strata. This localization helps achieve better precision in targeting and accommodates regional differences in geographically extensive studies.

We can supply some of these data points as appends to the data set on request.

14

What information do you need about a project in order to provide an estimate of feasibility? What, if anything, do you do to give upper or lower boundaries around these estimates?

To provide a feasibility estimate, we require the following minimum set of parameters: number of completes, quota structure, if applicable, and length of interview (LOI). For more precise estimations, especially when targeting more specific audiences, additional information such as qualifying screening criteria and estimated incidence rates provided by the client is helpful.

In cases of complex study designs or tracking studies, we need to be informed of additional parameters, such as exclusivity periods and the number of waves.
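Such a feasibility estimate, including the upper and lower boundaries the question asks about, can be sketched as follows (the symmetric 20% margin and all input values are assumptions for illustration):

```python
import math

def feasibility_bounds(eligible_panelists, response_rate, incidence_rate, margin=0.2):
    """Return (lower, point, upper) estimates of achievable completes.

    The +/-20% margin is an illustrative assumption; real bounds would come
    from historical variance observed on similar projects.
    """
    point = eligible_panelists * response_rate * incidence_rate
    return (math.floor(point * (1 - margin)),
            round(point),
            math.ceil(point * (1 + margin)))

# e.g. 10,000 eligible panelists, 20% response, 50% incidence
feasibility_bounds(10_000, 0.20, 0.50)  # -> (800, 1000, 1200)
```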

15

What do you do if the project proves impossible for you to complete in the field? Do you inform the sample buyer as to who you would use to complete the project? In such circumstances, how do you maintain and certify third party sources/sub-contractors?

We aim to inform the client in advance if the sample is insufficient and there is a risk of not completing fieldwork. This is monitored by the project manager and by a built-in warning system in our platform. In such cases, after aligning the approach with the client, we can:

  • Add our pre-authorized and verified suppliers to obtain a sufficient sample

  • Use the client’s preferred supplier

  • Consider changing some of the qualifying criteria

When working with our partners, we ensure sample quality and mitigate duplication. Based on ISO standards, we apply strict guidelines to select partners. Selection criteria include, but are not limited to, data quality, sample feasibility, pricing, timeliness, reliability, and safety.

16

Do you employ a survey router or any yield management techniques? If yes, please describe how you go about allocating participants to surveys. How are potential participants asked to participate in a study? Please specify how this is done for each of the sources you offer.

We do not use a router. However, for each quota, we create a sample and forecast potential completions. We then activate an automated distribution system to achieve the required number of interviews. Randomization ensures that everyone has an equal chance of receiving the survey, with priority given to those with high ratings. Invitations to take a survey are sent out via email and social media messengers.

We also adhere to tracking limitations by recruiting subsequent waves according to the specified requirements. Additionally, we observe quarantine periods, excluding participants who have recently completed surveys in the same category.

Data quality is our highest priority, so we avoid overloading respondents and only assign them to surveys for which they are perfectly suited.

17

Do you set limits on the amount of time a participant can be in the router before they qualify for a survey?

Instead of using a pre-screener, we simply notify respondents that they qualify for a survey based on their profile. They are then granted direct access to the client's survey link.

18

What information about a project is given to potential participants before they choose whether to take the survey or not? How does this differ by the sources you offer?

We use the same approach whether panelists access the survey through their personal account or a unique link. To ensure panelists are prepared for the survey, we provide them with the following information: length of interview (LOI), reward, type of device required (mobile or desktop), and any applicable technical requirements (e.g., longitudinal study over several days, access to a camera). To maintain the integrity of the responses, we do not provide any information that could be used to pass the screener or give hints about the category or brand.

19

Do you allow participants to choose a survey from a selection of available surveys? If so, what are they told about each survey that helps them to make that choice?

Survey participation can occur through either a general format direct invitation or via the respondent's personal account where only the LOI, reward details, and technical requirements are stated.

20

What ability do you have to increase (or decrease) incentives being offered to potential participants (or sub-groups of participants) during the course of a survey? If so, can this be flagged at the participant level in the dataset?

While we can adjust incentives dynamically during fieldwork and this adjustment is reflected at the individual level, we refrain from using higher incentives to encourage respondent participation in surveys to avoid potential long-term behavioral bias.

21

Do you measure participant satisfaction at the individual project level? If so, can you provide normative data for similar projects (by length, by type, by subject, by target group)?

As part of our regular procedure, participants who successfully complete a survey are invited to provide feedback on their overall experience. They rate their experience on a scale of 1 to 5, which helps us assess their satisfaction with the survey process. Moreover, we offer an open-ended question for participants to share additional comments or feedback if they choose. This information can be shared with the client upon request.

22

Do you provide a debrief report about a project after it has completed? If yes, can you provide an example?

We provide a comprehensive debrief report including incidence rates, respondent completion statuses, quotas, and metrics influencing the project's cost. Additional metrics can be included in the debrief report upon client agreement.

23

How often can the same individual participate in a survey? How does this vary across your sample sources? What is the mean and maximum amount of time a person may have already been taking surveys before they entered this survey? How do you manage this?

By default, each panel member can receive a maximum of three survey invitations per week. Additionally, upon client request, we can implement quarantine or elimination rules that may pertain to specific time frames, survey topics, or individual projects. For instance, in tracking surveys, it is often necessary to eliminate completed interviews from the past three or six months, a capability we fully support as needed.
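The weekly cap and category quarantine described above can be sketched as a simple eligibility check. The 90-day default mirrors the three-month exclusion mentioned; the function and parameter names are assumptions, not DatAction's actual system:

```python
from datetime import date, timedelta

MAX_INVITES_PER_WEEK = 3  # default cap stated above

def can_invite(invite_dates, last_category_complete=None,
               category_quarantine_days=90, today=None):
    """Return True if a panelist may receive another invitation.

    invite_dates:           dates of invitations already sent to this panelist
    last_category_complete: date of their last complete in the same category
    """
    today = today or date.today()
    week_ago = today - timedelta(days=7)
    if sum(d > week_ago for d in invite_dates) >= MAX_INVITES_PER_WEEK:
        return False  # weekly invitation cap reached
    if last_category_complete is not None:
        if (today - last_category_complete).days < category_quarantine_days:
            return False  # still quarantined for this survey category
    return True
```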

24

What data do you maintain on individual participants such as recent participation history, date(s) of entry, source/channel etc? Are you able to supply buyers with a project analysis of such individual level data? Are you able to append such data points to your participant records?

At DatAction, we maintain extensive records of panelists' interactions with our platform. This includes details such as the registration campaign they joined through, their registration date, a complete history of their participation in various projects, information on gift redemptions, the accumulation and redemption of points, and the corresponding transaction dates. It's important to note that any data appended to participant responses when delivering the datafile strictly adheres to our privacy policy and excludes any personally identifiable information (PII).

We can share certain data with clients, but each case is discussed individually. We ensure that any shared information complies with GDPR guidelines and is not protected by non-disclosure agreements (NDAs).

25

Please describe your procedures for confirmation of participant identity at the project level. Please describe these procedures as they are implemented at the point of entry to a survey or router.

At Dataction, we implement three key controls to verify the authenticity and identity of panelists, ensuring high-quality data collection:

  1. Duplicate Control: We use a fingerprinting system and unique cookies to identify and reject duplicate responses, guaranteeing that each panelist provides unique data.

  2. Captcha: Panelists must complete a captcha verification to distinguish human participants from bots, ensuring genuine responses.

  3. Sex/Age Verification: We include questions about sex and age in all surveys and compare responses with recruitment data to validate demographic information.

These measures work together to authenticate panelists, prevent duplicate entries, and verify essential demographics, ensuring reliable and credible data for our clients.
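Duplicate control via fingerprinting (step 1) can be sketched as below. The attribute set is a simplified assumption; production fingerprinting (here, via third-party software) combines many more browser and device signals:

```python
import hashlib

def device_fingerprint(user_agent, screen, timezone, language):
    """Hash a set of browser/device attributes into a stable fingerprint.
    The four attributes here are an illustrative subset."""
    raw = "|".join([user_agent, screen, timezone, language])
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()

seen_fingerprints = set()

def check_registration(fingerprint):
    """Accept the first registration per fingerprint; flag later matches."""
    if fingerprint in seen_fingerprints:
        return "do_not_contact"  # likely duplicate account
    seen_fingerprints.add(fingerprint)
    return "accepted"
```

Real systems also use fuzzy matching, since two devices rarely produce byte-identical attribute sets; an exact hash match is the simplest possible variant.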

26

How do you manage source consistency and blend at the project level? With regard to trackers, how do you ensure that the nature and composition of sample sources remain the same over time? Do you have reports on blends and sources that can be provided to buyers? Can source be appended to the participant data records?

We can monitor each source within our system, managing partner contributions and allocating and controlling sample structure.

We can share this data with clients on request upon project completion.

27

Please describe your participant/member quality tracking, along with any health metrics you maintain on members/participants, and how those metrics are used to invite, track, quarantine, and block people from entering the platform, router, or a survey. What processes do you have in place to compare profiled and known data to in-survey responses?

We have implemented a panelist ranking system based on their behavior within the panel. Each panelist starts with 100 points upon joining. The system monitors performance, deducting points for non-compliant behavior such as low-quality answers, neglecting open-ended questions, or inconsistencies with the speed checker. Panelists with the highest rankings are given priority when assembling survey samples.

Additionally, we promote secure methods like S2S (Server-to-Server) and Hashing to ensure end-to-end security.

28

For work where you program, host, and deliver the survey data, what processes do you have in place to reduce or eliminate undesired in-survey behaviors, such as (a) random responding, (b) illogical or inconsistent responding, (c) overuse of item nonresponse (e.g., "Don't Know"), (d) inaccurate or inconsistent responding, (e) incomplete responding, or (f) too rapid survey completion?

To maintain high-quality data, we implement several standard quality checks in our survey templates to ensure participant engagement and accuracy:

  1. Automated Checks: Detect illogical, inconsistent, and non-responses.

  2. Trap Questions: Verify attentiveness by including questions with only one correct answer, screening out incorrect responses.

  3. Minimum Survey Length: Set a time threshold for survey completion; responses below this threshold are excluded.

  4. Age and Gender Validation: Cross-check participant-provided data with our records for consistency.

  5. Mandatory Questions: Require responses to all questions for comprehensive data.

Additionally, we offer customizable checks based on client-designed questionnaires:

  1. Real-Time Open-Ended Checks: Ensure quality open-ended responses.

  2. Logic Checks: Identify inconsistencies between related questions.

  3. Grid Quality Checks: Detect suspicious patterns in grid-style questions.

These checks ensure data integrity and provide high-quality insights for our clients' research projects. As a result, clients receive clean, ready-to-use data.
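Two of the automated checks above, straight-lining in grids and speeding, can be sketched as follows (the thresholds are illustrative assumptions, not DatAction's actual parameters):

```python
def is_straight_liner(grid_answers, max_identical_share=0.9):
    """Flag a grid response where (nearly) every row carries the same value."""
    most_common = max(grid_answers.count(v) for v in set(grid_answers))
    return most_common / len(grid_answers) >= max_identical_share

def is_speeder(duration_seconds, median_duration_seconds, min_share=0.4):
    """Flag a completion much faster than the median for this survey."""
    return duration_seconds < median_duration_seconds * min_share

is_straight_liner([3, 3, 3, 3, 3, 3])   # -> True (identical grid rows)
is_straight_liner([1, 4, 2, 5, 3, 2])   # -> False
is_speeder(150, 600)                     # -> True (150 s vs 10-min median)
```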

Policies & Compliance

29

Please provide the link to your participant privacy notice (sometimes referred to as a privacy policy) as well as a summary of the key concepts it addresses. (Note: If your company uses different privacy notices for different products or services, please provide an example relevant to the products or services covered in your response to this question).

Our participant privacy notice can be provided upon request. It is in line with GDPR and local legislation.

30

How do you comply with key data protection laws and regulations that apply in the various jurisdictions in which you operate? How do you address requirements regarding consent or other legal bases for the processing personal data? How do you address requirements for data breach response, cross-border transfer, and data retention? Have you appointed a data protection officer?

  1. We adhere to the EU General Data Protection Regulation (GDPR), one of the most stringent legal standards for data protection, as well as to Estonian national regulations.

  2. We use consent as the legal basis, and while obtaining consent we ensure that the process is compliant with GDPR. We store proof of received consent.

  3. For cross-border transfers, data retention, and data-breach response, we use internal policies developed and maintained in compliance with GDPR.

  4. We have appointed a Data Protection Officer (DPO).

31

How can participants provide, manage and revise consent for the processing of their personal data? What support channels do you provide for participants? In your response, please address the sample sources you wholly own, as well as those owned by other parties to whom you provide access.

All policies are available to participants on the panel's website. Upon joining, participants confirm that they have read the policies and consent to the collection, storage, and processing of their data. Panelists can withdraw their consent at any time through their personal account on the website or by contacting the online helpdesk for assistance.

32

How do you track and comply with other applicable laws and regulations, such as those that might impact the incentives paid to participants?

We adhere to the laws of the country where our services are provided and where payments are made to data subjects. Any changes to relevant laws and regulations are promptly followed and implemented by our legal and accounting departments.

33

What is your approach to collecting and processing the personal data of children and young people? Do you adhere to standards and guidelines provided by ESOMAR or GRBN member associations? How do you comply with applicable data protection laws and regulations?

While we are not currently certified by an external agency, we confirm that all our internal policies are in line with GDPR and with the legislation of the countries where samples are gathered.

34

Do you implement "data protection by design" (sometimes referred to as "privacy by design") in your systems and processes? If so, please describe how.

Yes, we implement "data protection by design" as a core principle. We have implemented and maintain a data-protection policy that fully adheres to GDPR and ESOMAR guidelines and is compliant with the legislation of each panelist's country of residence.

35

What are the key elements of your information security compliance program? Please specify the framework(s) or auditing procedure(s) you comply with or certify to. Does your program include an asset-based risk assessment and internal audit process?

We adhere to ISO 27001 standards, conducting regular audits, risk assessments, and stress tests of our internal information systems.

36

Do you certify to or comply with a quality framework such as ISO 20252?

We are not ISO 20252 certified, but we hereby confirm that we adhere to the framework and its quality standards.

Metrics

37

Which of the following are you able to provide to buyers, in aggregate and by country and source? Please include a link or attach a file of a sample report for each of the metrics you use.

  1. Average qualifying or completion rate, trended by month
  2. Percent of paid completes rejected per month/project, trended by month
  3. Percent of members/accounts removed/quarantined, trended by month
  4. Percent of paid completes from 0-3 months tenure, trended by month
  5. Percent of paid completes from smartphones, trended by month
  6. Percent of paid completes from owned/branded member relationships versus intercept participants, trended by month
  7. Average number of dispositions (survey attempts, screenouts, and completes) per member, trended by month (potentially by cohort)
  8. Average number of paid completes per member, trended by month (potentially by cohort)
  9. Active unique participants in the last 30 days
  10. Active unique 18-24 male participants in the last 30 days
  11. Maximum feasibility in a specific country with nat rep quotas, seven days in field, 100% incidence, 10-minute interview
  12. Percent of quotas that reached full quota at time of delivery, trended by month

We can provide all of the metrics listed above to our clients upon request.