
Scientists, government policy makers, brands, anthropologists and behavioural psychologists are all using Big Data to guide their decision making. Sports predictions, election forecasts, personalized advertising and streamlined traffic flow are a few examples of how data-driven insight affects daily life.

Smartphones and wearables are expanding the field of personal analytics, a fast-growing market for the health industry that potentially holds the digital key to a range of health care products. The capability to predict epidemics, cure disease, improve quality of life and avoid preventable deaths is a mere selection of potential uses in the medical sector. Marr (2016) indicates that as the population increases and technology becomes more readily available, “models of treatment delivery are rapidly changing, and many of the decisions behind those changes are being driven by data.”

Businesses are turning to analytics to guide them in creating better personal experiences for their customers. In their essay Big Data for All: Privacy and User Control in the Age of Analytics, Tene and Polonetsky (2013) note that analysing data “advances scientific research, transforming scientific methods from hypothesis-driven to data-driven discovery.” Brands no longer rely on gut feeling and intuition to create products for their customers; data is now the driving force behind decision making and product creation (Carr, 2016).

The rapid growth and scale of data captured both online and offline by organizations and social media platforms allow businesses to analyze patterns and understand the habits of their customers. By harnessing this technology to gain detailed insight into customer behaviour, the implications also become apparent. This essay explores the implementation of Big Data insights for personalized user experiences: its positive capabilities and the challenges it poses to individuality and privacy. Can data-driven apps and services still allow customers to make their own decisions as more and more businesses gear towards the personalized method?

Personalization Nation

Working silently in the background, Google’s search results, Amazon’s recommendations and Facebook’s ‘People You May Know’ suggestions use analytics to personalize experiences for billions of users. Opportunities and avenues are being created for advertisers and businesses to direct their attention to the specific demographics most likely to be interested in their products.

The emerging wearables market has been able to document and capture streams of user data such as diet, distance walked, calories consumed and sleep patterns. Unlike traditional methods of motivating lifestyle changes, such as public announcements and national health policies, personally focused intervention has been shown to be the most effective. Doherty et al. (2013) write that medical monitoring technologies “have the unique potential to continuously and objectively monitor certain health-related behaviours and to analyse the information they collect to determine and immediately suggest alternative behaviours to the wearer.” This form of personalization has given birth to a new term, “the quantified self”. The latest movement in the wearables market, the ‘quantified selfers’, as Levy (2013) explains, “monitor their own biophysical, behavioral, and environmental markers in efforts to measure progress toward health and other goals”.

Netflix, the data-driven, subscription-based streaming service, can map its customers’ experience from the moment they sign up, across all platforms. This data is then analyzed not only to recommend new programs but also to generate new scripted programming. In an article discussing how Netflix produced the hit show House of Cards using insight, Sweney (2014) writes that “Netflix crunches its subscriber base viewing data to identify fans of specific genres and then looks at TV formulas that it already knows are likely to appeal to them.” Far removed from the traditional TV network model, multi-million dollar investments are no longer left to chance.

Data is driving not only the customer experience but also creative decisions. Carr (2016) notes that, stripped of the “mysterious alchemy of finding a hit, Netflix seems to be making it look easy, or at least making it a product of logic and algorithms as opposed to tradition and instinct.”

Challenges in Personalization

Large supermarket chains have used data for marketing purposes via store reward cards since 1995. Duhigg (2012) states that supermarkets have been gathering not only customer buying habits but also their “demographic information including age, marital status, dependents, house location, distance to the store, estimated salary, number of credit cards and the sites visited”. After data is purchased, supermarket analytics teams sift through the reams of information to understand buying behaviour and predict future buying patterns.

The major US retailer Target used records from baby registries and mothers’ past buying histories to identify a new target audience. By analysing purchase data on items such as unscented lotion and vitamins, Target identified pregnant customers and sent them personalized advertisements for baby products in the first months of pregnancy. This could cause emotional stress if the pregnancy is not successful, or confusion as to how a large retailer obtained pregnancy information that family and friends might not yet be privy to. Personalized attention in this form is a great departure from algorithmic movie and book recommendations.

With a field of data to harvest from, it is easy to see how demographics that are not, in Lerman’s (2013) words, “electronically harvestable” can be left out. The needs and voices of those outside the harvest can be underrepresented in Lerman’s “Big Data revolution” and denied government-proposed services, creating new forms of inequality and subservience.

Without a digital footprint to capture, aggregate and analyse, businesses can start to undervalue the needs and preferences of certain populations of customers. Lerman (2013) describes the resulting inequality in certain communities: “stores may not open in their neighborhoods, denying them not just shopping options, but also employment opportunities; certain promotions may not be offered to them; new products may not be designed to meet their needs, or priced to meet their budgets.” The challenge for designers and analysts is to find patterns that benefit not only those who contributed to the data but also the populations left out of the collection.

Creating Homogeneous Humans

Computers see humans as numbers rather than individuals: lists of numerically coded facts churned out day after day and housed in server farms across the world. Richards and King (2013) write that big data “seeks to identify, but it also threatens identity.” Records of our browsing history, buying history and social media history are only a glimpse of our personality and tastes. The question is not whether we wanted a product; it is whether we still need it, or need it all the time. If historical data is being used to calculate behaviour, what suggests that the need exists now rather than then, or for whom?

Behavioural marketing is driving businesses towards the data-driven personalization model, but as Bollier (2010) suggests, “consumers have far less knowledge of what is going on, and have far less ability to respond”. Bollier (2010) also points to the “‘my TiVo thinks I’m gay’ phenomenon”: if a user inputs certain data, the algorithms may define their identity based solely on that input. This concept aligns with what Zaslow (2016) terms “techno-profilers”, where data is so personalized and so thoroughly implemented that the idea of a single transaction ceases to be relevant.

Zaslow (2016) interviews a gay man who purchased a popular gay TV series on Amazon and was inundated with gay-related calendar and book suggestions. Having then purchased a baby book for a pregnant friend, Amazon deemed him “a pregnant gay man”. The customer proceeded to trick the data set into setting new guidelines of personalization by inundating it with additional data; Zaslow writes how the gentleman “searched for other stuff — on politics, computers — so it would stop throwing baby books” at him. He now jokes that his profile suggests a man who has abandoned his baby and is preparing for a career in politics (Zaslow, 2016).
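The profiling and profile-dilution dynamic in Zaslow’s anecdote can be caricatured in a few lines of code. The sketch below is purely illustrative: the category names, keywords and keyword-matching approach are assumptions for demonstration, not how Amazon’s recommender actually works.

```python
from collections import Counter

# Hypothetical mapping of purchase keywords to profile categories.
# These labels are invented for illustration only.
CATEGORY_KEYWORDS = {
    "parenting": {"baby", "pregnancy", "crib"},
    "politics": {"politics", "election", "campaign"},
    "computing": {"computer", "laptop", "keyboard"},
}

def infer_profile(purchases):
    """Count category hits across a purchase history and return the
    categories ranked by frequency - a crude caricature of profiling."""
    counts = Counter()
    for item in purchases:
        words = set(item.lower().split())
        for category, keywords in CATEGORY_KEYWORDS.items():
            if words & keywords:
                counts[category] += 1
    return counts.most_common()

# A single gift purchase dominates the inferred profile...
history = ["baby names book"]
print(infer_profile(history))  # [('parenting', 1)]

# ...until the shopper floods the history with other signals,
# as the customer in Zaslow's anecdote did.
history += ["politics weekly", "computer keyboard", "election atlas"]
print(infer_profile(history))  # 'politics' now outranks 'parenting'
```

The toy model also shows why such profiles are brittle: a handful of unrelated purchases is enough to flip the dominant label, with no notion of intent, gifts or context.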

Even if users were given a visual representation of everything they have enjoyed, disliked or tried, how can computers understand that these are merely aspects of our lives, not what defines us as humans?

In the future, will the need to keep our interests anonymous, for fear of being profiled as a certain “type”, still allow us to engage freely with technology? Will we all conform to become homogeneous humans, sharing the same opinions and interests we perceive our computer profiles suggest we should, or can future algorithms, with advances in AI, make those predictions more accurate?

Brands and businesses could potentially enter a cycle of introducing new products based on predictive statistics, discarding the products we might enjoy. If analysis of data alone drives the creation of new TV shows or new items at the local supermarket, does the need for innovation cease to exist? Questioning Netflix’s tactic of creating prefabricated content for its viewers, Leonard (2016) asks, “isn’t the inevitable result of this that the creative impulse gets channeled into a pre-built canal?”


Technology is constantly evolving, and as a 2015 report presents, “Smartphones have become the hub of our daily lives and are now in the pockets of two thirds (66%) of UK adults”. Smartphone users spend around two hours a day on their devices, which equates to two hours of data being generated and analyzed each day. As the Internet of Things era of connected appliances, security systems and applications encroaches into our homes and offices, will users be reduced to passive data points?

From a user’s perspective, Big Data personalization has the potential to become a ‘new best friend’: recommending new shows, introducing us to our husbands and wives, driving our cars, finding us new jobs or suggesting alternate driving routes. These so-called ‘friends’ can at times also betray us by divulging our secrets, gossiping to others behind our backs and using a variety of influencing tactics to alter our habits and behaviours. Matters are made worse when such a ‘friend’ enters our lives without notice or warning and outstays its welcome.

Insight implementation, although helpful in many areas of daily life, does not have an opt-out button or a privacy setting; data is being collected every day, all day. Tene and Polonetsky (2013) suggest that “As more information regarding individuals’ health, financials, location, electricity use, and online activity percolates, concerns arise regarding profiling, tracking, discrimination, exclusion, government surveillance, and loss of control.”

It is the responsibility of those tasked with implementation to extract the meaningful from the volume and variety of information given to them. Designers, data analysts and companies need to harvest the data and apply it in the correct context. In his report, Bollier (2010) quotes Kim Taipale of the Center for Advanced Studies in Science and Technology: “is personalization something that is done to you or for you?” Computers do not yet have the capacity to control our behaviour; they are merely tools for businesses, governments and personal use, either enhancing the human experience or turning us into consuming, data-seeding drones that circle the digital landscape, mindlessly following the orders of what the data suggests we do.


References

Bollier, D. (2010). The Promise and Peril of Big Data. Washington, DC: Aspen Institute, Communications and Society Program.

Carr, D. (2016). Giving Viewers What They Want. [online] The New York Times. Available at: [Accessed 24 Feb. 2013].

Doherty, A., Williamson, W., Hillsdon, M., Hodges, S., Foster, C. and Kelly, P. (2013). Influencing health-related behaviour with wearable cameras: strategies & ethical considerations. Proceedings of the 4th International SenseCam & Pervasive Imaging Conference, [online] pp.60–67. Available at: [Accessed 21 Feb. 2016].

Duhigg, C. (2012). How Companies Learn Your Secrets. [online] Available at: [Accessed 20 Feb. 2016].

Leonard, A. (2016). How Netflix is turning viewers into puppets. [online] Salon. Available at: [Accessed 26 Feb. 2016].

Lerman, J. (2013). Big Data and Its Exclusions. SSRN Electronic Journal.

Levy, K. (2013). RELATIONAL BIG DATA. 1st ed. [ebook] Future of Privacy Forum, pp.76–80. Available at: [Accessed 25 Feb. 2016].

Marr, B. (2016). How Big Data Is Changing Healthcare. [online] Available at: [Accessed 21 Apr. 2015].

(2015). The UK is now a smartphone society. [online] Available at: [Accessed 25 Feb. 2016].

Richards, N. and King, J. (2013). Three Paradoxes of Big Data. Stanford Law Review, [online] 66(41). Available at: [Accessed 25 Feb. 2016].

Sharma, A. (2016). ANALYSIS OF BIG DATA. International Journal of Computer Science and Mobile Computing, 3(9), pp.56–68.

Sweney, M. (2014). Netflix gathers detailed viewer data to guide its search for the next hit. [online] the Guardian. Available at: [Accessed 22 Feb. 2016].

Tene, O. and Polonetsky, J. (2013). Big Data for All: Privacy and User Control in the Age of Analytics. Nw. J. Tech. & Intell. Prop., 11(5), p.239. [online] Available at: [Accessed 21 Feb. 2016].

Zaslow, J. (2016). If TiVo Thinks You Are Gay, Here’s How to Set It Straight. [online] WSJ. Available at: [Accessed 26 Feb. 2016].