Guest Post: The Digital Transformation of Retail

By ShiSh Shridhar, WW Director of Business Intelligence – Retail Sector, Microsoft

You go shopping; let’s say it’s a national hardware store because you have a painting project you’ve decided to tackle this weekend. You have done your research online, chosen the paint and now you are at the store to pick up your supplies and get started. But, when you reach the section with the paintbrushes, you realize you’re not exactly sure what you need. You stand there for a moment trying to figure it out, and then you start looking around, hoping a sales associate will appear. And one does! She’s smiling, and she’s an expert in paints and yes, she can direct you to the brush you need — and reminds you to pick up some blue tape.


Shopping miracle?

No, shopping future. This kind of positive customer experience is one of the many ways that artificial intelligence (AI), sophisticated data gathering, and the cloud are being used to empower employees, bring consumers into stores, and shorten the path to purchase. These advancements help brick and mortar retailers compete with online retailers in today’s world. It’s the digital transformation of retail, and it’s happening now in ways big and small.

AI + Data = Retail Revolution

This transformation is driven by data, which today can come from any number of sources. In this example (a product called Floor Sense from Microsoft partner company Mindtree), the data is collected from security cameras already in place throughout the store. The cameras capture footage of how people move through space, where they stop and what they do as they shop. The video feed is then analyzed using AI that has been trained to understand how a customer acts when he or she needs help. When that behavior is recognized, a sales associate with the right expertise is sent to talk to the customer and help the customer make a decision.

But a store’s proprietary data is only the tip of the iceberg. Today, there are millions of data points that are either publicly available or easy to purchase from companies like Experian and Acxiom. Retailers can combine that demographic data with their existing CRM data to model behavior and build micro-segmentations of their customer base. Insights from that narrow analysis allow retailers to personalize, predict and incentivize in ways that are far more accurate than ever before.

Putting data insights into the hands of employees

Already, that kind of analysis has helped make online shopping more productive with relevant, timely offers. The next step for retailers is to learn how to make data-driven insights useful to store employees, as in the hardware store example, so they can enhance the customer’s in-store experience. The data could come from a customer’s interactions with the retailer’s app, chat bots, social media, in-store beacons or Wi-Fi, all of which, when compiled, allows a retailer to make extremely accurate inferences about a given customer’s behavior.

Managed well, those insights help a store employee serve a customer better. Managed poorly, personalized targeting in-store has the potential to spook customers. To handle it well, retailers must do two things: First, any in-store tracking should be done through a consumer opt-in, with transparency about how the retailer will use the information. Second, the customer deserves a good value exchange; it must be clear to her how she is benefitting from sharing her information with the retailer, and how her information contributes to delivering her a frictionless shopping experience.

Using a customer’s digital exhaust to everyone’s benefit

As consumers explore purchasing options and develop their preferences using search tools, social media, apps, and in-store visits with a device in hand, they leave behind a digital exhaust. Today, advances in AI, data aggregation, and the cloud allow retailers to collect that digital exhaust to generate a style profile of prospective customers, which can then be used to introduce those customers to other products they might like. In this so-called phygital world — where the physical and digital overlap — retailers can combine data from multiple places to make inferences that will help them sharpen their marketing approach. The techniques are at hand — now, it’s up to creative retailers to find innovative ways to use those insights to inspire their customers, and shorten the path to purchase.

This article was originally posted on Independent Retailer.

Marketing Attribution Preparedness

As a digital marketer, you have almost endless data at your fingertips. If impactful marketing attribution is your objective, you don’t need it all – just the right data: User Identifiers, Timestamps, Media Data, and Channel Data. Gather (and govern!) these four types of data to allow your attribution efforts to tell your story effectively.
– Alison Latimer Lohse, Co-Founder and Chief Strategy Officer, Conversion Logic

On August 1st, I had the pleasure of hosting Alison Latimer Lohse at eSage Group’s monthly L.A. Marketing Analytics Group event in Santa Monica, CA. Alison took us through the full continuum of marketing attribution:

Why Do Attribution: The marketing ecosystem comprises billions of possible permutations of target-audience touchpoints across the available marketing channels and tactics. Following the customer journey across this vast ecosystem to the point of conversion is challenging.

Attribution Determining Causality: Marketing attribution is, in essence, the work of determining causality. Conversion Logic focuses on advanced attribution: holistically measuring touchpoints across all media stimuli in order to give credit where credit is due.

Marketing Channels: On average, more than 90% of marketers said they currently use three or more channels (six on average), and a significant 39% said they will use more channels within two years. The number of available channels is on the rise. The good news: the trend is toward portability and accountability.

Marketer’s Questions to Consider:

  • Credit where credit is due
  • Cross-channel impact
  • Identify waste and opportunities for scale
  • Find customers more efficiently
  • Simulate investment strategy

Available Methodologies:

  • Last Touch: 100% credit to the last touchpoint leading to conversion
  • Rules/Heuristics: Rules used to allocate credit to each channel for conversion based on sequence, weighting etc.
  • Vendor Reporting: Publisher provides metrics on how media performed
  • Statistical Model: Statistical approach to allocate credit to each channel / touch-point (linear or logistic regression; see the sketch after this list)
  • Machine Learning Predictive Ensemble: Ensemble methods use multiple algorithms to obtain better predictive performance.
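
To make the last two methodologies concrete, here is a minimal sketch of fractional attribution, using scikit-learn’s logistic regression on hypothetical per-journey channel exposure data. It illustrates the general idea only and is not Conversion Logic’s model.

```python
# Illustrative only -- not Conversion Logic's methodology.
# Each row is one user journey: binary flags for channel exposure, plus whether it converted.
import pandas as pd
from sklearn.linear_model import LogisticRegression

journeys = pd.DataFrame({
    "paid_search": [1, 0, 1, 1, 0, 1, 0, 0],
    "display":     [0, 1, 1, 0, 1, 1, 0, 1],
    "email":       [1, 1, 0, 0, 0, 1, 1, 0],
    "converted":   [1, 0, 1, 0, 0, 1, 0, 0],
})

X = journeys.drop(columns="converted")
y = journeys["converted"]

model = LogisticRegression().fit(X, y)

# Use each channel's positive coefficient share as a crude fractional-credit weight.
weights = pd.Series(model.coef_[0], index=X.columns).clip(lower=0)
credit = weights / weights.sum()
print(credit.round(2))  # fractional credit per channel
```

An ensemble approach would train several such models (trees, regressions, and so on) and blend their outputs to stabilize the credit estimates.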

Approaches to Marketing Measurement:

  • Media Mix Modeling (MMM)
  • Cross-Channel Modeling
  • User Level Attribution

Assess Attribution Readiness:

  • Identity: Know your objectives and align across the business
  • Alignment: Executive Buy In, Cross functional Champions, Ownership
  • Organize: Be conscious of how your mar-tech systems work together, make sure you can track all conversion data, create consistent taxonomy and data structure

Alison was only scratching the surface of the innovations available to, and the successes possible from, a well-thought-out marketing attribution plan. One of my key takeaways was the part of her talk that covered Attribution Readiness and the need for businesses to take an honest look at their available data around marketing channels to assess whether they are ready to launch into attribution. The important assessment tests discussed make up a checklist of items to be aware of. I think they are important enough to leave you with as parting food for thought. Enjoy!

Attribution Evaluation Checklist

 

Methodology
– Digital Fractional Attribution Methodology
– Cross-Channel Modelling (Online & Offline)
– Modelling Transparency and Validation

Data
– Data Alignment & Reliability
– On-boarding and ETL Support

Partner
– Customer Service Level
– Implementation Time
– Partnership Cost

Product
– Side-by-Side Comparison of Last Click & Attributed Value
– Path to Conversion Analysis
– Detailed Optimization Insights & Recommendations
– Mobile Insights
– Customizable, Granular and Actionable Front-End

The Great Data Migration – Part One

I don’t care who doubts our ability to succeed, as long as it’s nobody on my team.
– Kobe Bryant, Los Angeles Lakers Guard

Prepare For Takeoff

Everyone, these days, is jettisoning on-premises storage and sending their data to the cloud. The reasons are varied, but generally come down to two factors: time and cost. Cloud storage, from any of the major providers like Amazon or Microsoft, can cost less than $0.02 per GB per month. Compare that to Apple’s revolutionary magnetic hard drive that debuted in 1981. It had 5MB of storage and cost $3,500, which is over $700,000 per GB. Ok, there was no monthly fee, but I digress. 😉 Time is usually how long it takes to get a new server, file share, or document repository installed in your corporate headquarters vs simply storing new data in Amazon S3 or Microsoft Azure. Or perhaps it’s the amount of IT resources needed to keep depreciating and outclassed data centers up and running.

There are many advantages to cloud storage, which won’t be rehashed here. If you need a refresher (or convincing), this site may come in handy: http://www.enterprisestorageforum.com/storage-services/cloud-storage-vs.-on-premise-11-reasons-to-choose-the-public-cloud-1.html

For the moment, let’s assume that you have decided to move your data to the cloud. This article will help you decide where to move it, how best to do so, and an ideal way to keep it updated and fresh.

 Where To Go?

In today’s cloud landscape, there are two players: Amazon and Microsoft. There are others, such as Google, but Amazon Web Services (AWS) and Microsoft Azure hold the keys. In addition to storage, they both offer services such as Virtual Machines, Caching, Load Balancing, REST interfaces, Web hosting and more, which can handle your other applications, should you need to migrate them to the cloud in the future. There are pros and cons to each, but both will handle your data securely, provide timely and cost effective access, and transparently maintain ready to use backups in case of unforeseen events. Let’s break them both down:

AWS S3 (Simple Storage Service) is, as the name states, pretty simple. It has a free tier with 5GB of data and then breaks down into three categories: Standard, Infrequent Access (IA), and Glacier. If you just need to stash old data in the cloud and have no idea how it will be used in the future, use IA or Glacier for extremely cheap storage. Glacier is only $0.004 per GB vs Standard at $0.023 per GB per month (US West Region). The trade-off with Glacier and IA is that it takes a little longer to get at the data you want to use, anywhere from a few minutes to several hours. Data can be moved up and down between the Standard, IA, and Glacier tiers so, for instance, those old application logs that no one was using can quickly be made available for reporting when needed.
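
Tiering data down automatically can be expressed as a lifecycle rule. The sketch below uses boto3 (the AWS SDK for Python); the bucket name and prefix are hypothetical.

```python
# Sketch: automatically tier old objects down to IA and then Glacier (boto3; names are hypothetical).
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="my-archive-bucket",
    LifecycleConfiguration={
        "Rules": [{
            "ID": "tier-down-old-logs",
            "Filter": {"Prefix": "app-logs/"},
            "Status": "Enabled",
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},  # infrequent access after a month
                {"Days": 90, "StorageClass": "GLACIER"},      # deep archive after a quarter
            ],
        }]
    },
)
```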

Standard storage is what most people use for quick access to data. For the first 50TB per month, the price is $0.023 per GB per month (US West Region). Anything can be stored here, such as images, binary files, text data, etc. AWS S3 uses “buckets” to contain data, and each bucket can hold an unlimited number of objects. Each object within a bucket is limited to 5TB in size. For a breakdown on AWS S3 pricing, go here: https://aws.amazon.com/s3/pricing/.

We’ll discuss how to migrate data to S3 a bit later. For now, know that access to your S3 data is through the AWS web console and a variety of APIs, such as the AWS S3 API and the s3:// or s3n:// file protocols. AWS S3 is secure by default, with only the bucket / object creators having initial access to the data. Permission is granted via IAM roles, secret / access keys, and other methods that are out of scope for today. A good primer for S3, including security, can be found at the S3 FAQ: http://aws.amazon.com/s3/faqs/.
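
As a taste of what programmatic access looks like, here is a minimal upload sketch using boto3. It assumes your AWS credentials are already configured, and the bucket and key names are hypothetical.

```python
# Sketch: push a local file into an S3 bucket (boto3; bucket and paths are hypothetical).
import boto3

s3 = boto3.client("s3")
s3.upload_file(
    Filename="exports/sales_2017.csv",
    Bucket="my-archive-bucket",
    Key="sales/2017/sales_2017.csv",
    ExtraArgs={"StorageClass": "STANDARD_IA"},  # omit for Standard storage
)

# List what landed under the prefix
for obj in s3.list_objects_v2(Bucket="my-archive-bucket", Prefix="sales/")["Contents"]:
    print(obj["Key"], obj["Size"])
```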

Azure Storage has a lot more options than AWS S3. This can be confusing at first, but it also offers the most in terms of flexibility and redundancy for your data. There is a small free tier, and as a counterpart to AWS Glacier, Azure Storage offers “Azure Cool Blob Storage” for your archival, compliance, or other “don’t use but can’t throw away” data. Prices are usually less than $0.01 per GB per month in some regions.

Unlike S3, Azure Storage comes in several flavors of redundancy, so one can choose how many backups of their data exist and how easily they are accessed. If you have easily replaceable data, say from a 3rd party API or data source, then choose the cheaper LRS (Locally Redundant Storage) option which will keep local copies of your data in a single Azure data center. Need a durable, always available, “a crater can hit my data center yet I’m still OK” option? Then RA-GRS (Read-Access Geographically Redundant Storage) is the preferred option. This will ensure that copies of your data are also maintained at a second data center hundreds of miles away, yet always available for easy access. Middle ground hot and cool options exist as well. For a breakdown of Azure Storage pricing, please visit here: http://azure.microsoft.com/en-us/pricing/details/storage/blobs/.

Note: AWS S3 is functionally equivalent to Azure Storage GRS (Geographically Redundant Storage), so use this option when comparing prices.

Azure Storage uses “containers”, instead of buckets like S3, and containers can contain blobs, tables, or queues (discussed below). There is no limit to object size, yet each container can “only” hold 500TB. However, there can be multiple containers per storage account. Access to data is through the Azure Portal, Azure Storage Explorer, PowerShell, the Azure CLI (command line interface), and other APIs and file protocols.
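
The equivalent blob upload through the Python SDK (azure-storage-blob) might look like the following sketch; the connection string, container, and file names are placeholders.

```python
# Sketch: upload a local file into an Azure Storage container (azure-storage-blob; names are hypothetical).
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<storage-account-connection-string>")
container = service.get_container_client("sales-data")
container.create_container()  # raises ResourceExistsError if it already exists; wrap in try/except in real code

with open("exports/sales_2017.csv", "rb") as data:
    container.upload_blob(name="2017/sales_2017.csv", data=data, overwrite=True)
```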

Aside from regular blobs, there are a couple different types of data that can be stored in containers: tables and queues. Think of both as convenient layers laid over the raw data, to ease read and write access for different applications and scenarios.

An Azure Storage Table (AST) is essentially a NoSQL key-value store. If you’re not sure what that is, then you likely don’t need it. 🙂 NoSQL data stores support massive scalability, and the dataset and server sharding that is normally necessary (and a headache) for this is handled for you. AST, like other NoSQL data stores, supports a flexible schema model which allows one to keep customer data, application logs, web logs, and more – all with different schemas – in the same table. Learn more here: http://azure.microsoft.com/en-us/services/storage/tables/.
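
A small sketch of that flexible schema, using the azure-data-tables Python package with hypothetical names:

```python
# Sketch: rows with different attributes living in the same table (azure-data-tables; names are hypothetical).
from azure.data.tables import TableServiceClient

tables = TableServiceClient.from_connection_string("<storage-account-connection-string>")
table = tables.create_table_if_not_exists("AppLogs")

# Entities in the same table can carry different attributes -- no fixed schema required.
table.create_entity({"PartitionKey": "weblog", "RowKey": "2017-08-01-0001",
                     "url": "/checkout", "status": 500})
table.create_entity({"PartitionKey": "customer", "RowKey": "C-1001",
                     "name": "Jane Doe", "lifetime_value": 450.0})
```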

Azure Storage Queue (ASQ) provides cloud-based messaging between application components. Having a central messaging queue is critical for different applications and parts of applications that are often decoupled and need to scale independently of one another. This would likely only be needed if you have applications which currently store message data on-premises but need to be migrated to the cloud. The size of each message is limited to 64K, but there can be an almost unlimited number of messages (up to the container limit). Learn more here: https://azure.microsoft.com/en-us/services/storage/queues/.
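
Here is a producer/consumer exchange sketched with the azure-storage-queue Python package; the queue name and payload are hypothetical.

```python
# Sketch: decoupled components exchanging messages through an Azure Storage queue (names are hypothetical).
from azure.storage.queue import QueueClient

queue = QueueClient.from_connection_string("<storage-account-connection-string>", queue_name="resize-jobs")
# queue.create_queue() first if the queue does not exist yet

queue.send_message('{"image": "catalog/12345.jpg", "size": "thumbnail"}')  # producer side

for msg in queue.receive_messages():   # consumer side
    print("processing", msg.content)
    queue.delete_message(msg)          # remove once handled
```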

Another option, unique to Azure in general, is the ability to link your corporate Windows Active Directory or Office 365 Active Directory with the Azure Active Directory (Azure AD). This feature, called Azure AD Connect, allows SSO (single sign-on) between on-prem and cloud-based applications and services. It is easy, for example, to quickly set up permissions and roles that manage access to essential services and storage across your organization.

For a rundown on security and encryption options, please visit: http://docs.microsoft.com/en-us/azure/storage/storage-security-guide.

This is great for raw storage, but now what about my DATABASE?

Almost every organization has relational data. While this can be extracted and placed into raw storage, it’s often easier to just lift it entirely into the cloud and go from there. Both Amazon and Azure platforms support numerous relational database hosting options, from Azure’s SQL Database to AWS’s Relational Database Service and many options in between. We’ll look at some of them here:

AWS Relational Database Service (RDS) allows easily deploying and scaling relational databases in the cloud. It frees one from the hassle of managing servers, patching, clustering, and other IT-heavy tasks. It supports six different flavors of database: Amazon Aurora, PostgreSQL, MySQL, MariaDB, Oracle, and Microsoft SQL Server. One unique option is the ability to offload read traffic to one or more “Read Replicas” and thus increase availability and performance of your primary database instance. Database security differs depending upon your database flavor (some support encryption at rest, etc.), but the RDS instance itself can be secured by being deployed within an organization’s AWS VPC (Virtual Private Cloud). In my opinion, AWS as a service has a simpler approach to security than Azure, because more AWS services can be set up behind the VPC, which acts as a gateway to sensitive data and applications. Learn more about AWS RDS here: http://aws.amazon.com/rds/.
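
Provisioning an instance and a read replica can be scripted as well. Below is a hedged boto3 sketch with hypothetical identifiers; in practice you would also set the VPC, subnet group, and backup options.

```python
# Sketch: stand up a managed PostgreSQL instance and a read replica (boto3; identifiers are hypothetical).
import boto3

rds = boto3.client("rds")

rds.create_db_instance(
    DBInstanceIdentifier="marketing-db",
    Engine="postgres",
    DBInstanceClass="db.m4.large",
    AllocatedStorage=100,                 # GB
    MasterUsername="admin_user",
    MasterUserPassword="<strong-password>",
)

# Offload reporting traffic from the primary:
rds.create_db_instance_read_replica(
    DBInstanceIdentifier="marketing-db-replica",
    SourceDBInstanceIdentifier="marketing-db",
)
```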

AWS Redshift is Amazon’s data warehouse in the sky. Essentially a supersized PostgreSQL, it provides scalable, cost-effective SQL-based storage, with queries that can run both on S3 (via Redshift Spectrum) and on Redshift itself. It stores data in a columnar fashion, giving fast query times over massive amounts of data. It might be overkill if your dataset is small, but if you have petabytes (or exabytes) of structured data that need analyzing quickly, Redshift can likely handle it. Start with AWS Redshift’s home page here: http://aws.amazon.com/redshift.
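
A common migration pattern is to land files in S3 and bulk-load them with Redshift’s COPY command. The sketch below drives that from Python with psycopg2; the cluster endpoint, table, bucket, and IAM role are all hypothetical.

```python
# Sketch: bulk-load files already sitting in S3 into a Redshift table via COPY
# (psycopg2 over the cluster endpoint; all names are hypothetical).
import psycopg2

conn = psycopg2.connect(
    host="analytics.abc123xyz.us-west-2.redshift.amazonaws.com",
    port=5439, dbname="warehouse", user="etl_user", password="<password>")

with conn, conn.cursor() as cur:
    cur.execute("""
        COPY sales.orders
        FROM 's3://my-archive-bucket/sales/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-s3-read'
        FORMAT AS CSV
        IGNOREHEADER 1;
    """)
```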

AWS Athena is a new service which attempts to blend the raw and relational data worlds together. Simply point it at an S3 bucket, define a schema, write your SQL query, and go. You only pay for the queries run on the raw data, and the schema definition can be reused with other queries, modified for another run, or simply tossed away when finished. Athena can also turn around and store the results back into AWS S3 or pass them to another workflow. By not having a permanent relational layer, data workflows and ETLs have fewer steps and fewer points of failure. Learn more about AWS Athena here: http://aws.amazon.com/athena/.
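
Kicking off an Athena query from code is a one-call affair with boto3. In this sketch the database, table, and result bucket are hypothetical, and the table’s schema is assumed to have been defined already.

```python
# Sketch: run a SQL query directly over raw files in S3 with Athena (boto3; names are hypothetical).
import boto3

athena = boto3.client("athena")

resp = athena.start_query_execution(
    QueryString="SELECT channel, COUNT(*) AS visits FROM web_logs GROUP BY channel",
    QueryExecutionContext={"Database": "marketing_raw"},
    ResultConfiguration={"OutputLocation": "s3://my-archive-bucket/athena-results/"},
)
print("query id:", resp["QueryExecutionId"])  # poll get_query_execution / get_query_results for output
```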

Azure SQL Database is as simple as it gets: a SQL Server database as a service. No Virtual Machines or licenses to manage, no patching, lots of redundancy, and fast performance. One can pay per database or in “elastic pool database units” (EPDUs), which spread resources across many databases. Azure SQL Database is meant for small to medium size databases (up to 50GB) that can operate independently from one another, as in a multi-tenant application or reporting solution. If you have a lot of SQL data to migrate, it is a good idea to break it up and store it along date or business lines, or make the jump to Azure SQL Data Warehouse, a larger service meant for enterprise workloads (see below). Connections are made through a standard ODBC / JDBC connection string, with the URI as the service endpoint for your database.
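
For instance, a Python client might connect with pyodbc like this; the server, database, credentials, and table are hypothetical.

```python
# Sketch: query an Azure SQL Database over a standard ODBC connection string (pyodbc; names are hypothetical).
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=tcp:mycompany.database.windows.net,1433;"
    "DATABASE=marketing;UID=report_user;PWD=<password>;"
    "Encrypt=yes;TrustServerCertificate=no;"
)

cursor = conn.cursor()
cursor.execute(
    "SELECT TOP 5 campaign, SUM(spend) AS total_spend "
    "FROM ad_spend GROUP BY campaign ORDER BY total_spend DESC"
)
for row in cursor.fetchall():
    print(row.campaign, row.total_spend)
```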

Keep in mind that this is not the same as full SQL Server. Since there is no real “server” involved, most system stored procedures (DBCC, etc.) and distributed transactions won’t work, and SQL Server peripheral services, such as SQL Server Reporting Services (SSRS) and SQL Server Integration Services (SSIS), are not included. These voids can, however, be filled by other services in the Azure stack, or by running a full copy of SQL Server in an Azure Virtual Machine (see below). Although a look at Azure Analytics is out of scope here, you should know that Azure supports an entire range of analytical services which can consume data from Azure SQL databases. Learn more about Azure SQL Database here: http://azure.microsoft.com/en-us/services/sql-database.

Azure SQL Data Warehouse (ASDW) is for enterprise-grade data warehouse deployments. Like AWS Redshift, one can scale compute and storage independently and prioritize analytical workloads or long-term storage. It also lets you pause the compute side entirely, turning ASDW into an archival data store when analytics aren’t needed. It also leverages Microsoft’s PolyBase service, which allows queries to execute against other data sources, such as Azure Data Lake, Azure HDInsight, or even another on-prem SQL Server data warehouse. Unlike Azure SQL Database, Azure SQL Data Warehouse stores data in a columnar format, for maximum performance and scalability. Learn more about the Azure SQL Data Warehouse here – http://azure.microsoft.com/en-us/services/sql-data-warehouse.

 Rolling Your Own – AWS and Azure Virtual Machines

Of course, if you need advanced SQL Server features, such as SQL Server Analysis Services (SSAS), or want to run a completely different database type, you can always spin up a Virtual Machine (VM) and install the relational database software there. Many VM images from both AWS and Azure come with SQL Server, Oracle, or other software preinstalled, so all you need is your licensing information. Also, some images include the cost of the database software, effectively renting the license to users for a monthly fee. This can be useful, for example, if you would like to try out the features of SQL Server Enterprise before making a full purchase. Virtual Machines are also useful when ETLs and data workflows need to also be migrated to the cloud, as the VM can simply host the software required to run it.

NOTE: When you go the VM route, you are usually responsible for hardware provisioning/formatting, software patches, service upgrades, and maintaining secure access (through firewall rules, etc) to your system. A good pro/con for evaluating Azure SQL Database vs SQL Server on Azure VMs can be found here: http://docs.microsoft.com/en-us/azure/sql-database/sql-database-paas-vs-sql-server-iaas.

Now, how do I get it up there?

Alright, you’ve chosen your cloud platform, you know what data to move… or do you? How do you prioritize what goes and what stays? Stay tuned for The Great Migration – Part II, where I’ll cover next steps in how to lift your data into the cloud.

Hope this helps and happy migrating! Feel free to email me at jasonc@esagegroup.com with any additional questions!

Sincerely,  J’son 

 

Disrupting Hollywood Paradigms with Analytics

At the April 4th L.A. Marketing Analytics Group, Matt Marolda, Chief Analytics Officer at Legendary Entertainment, presented on Disrupting Hollywood Paradigms with Analytics. We were treated to some very interesting and thought-provoking innovations that Matt and his team at Legendary Entertainment are working on to uncover powerful insights about mainstream box office audiences in the US and worldwide. This insightful information helps inform marketing tactics and overall production investment strategies.

digital film banner

The monthly L.A. Marketing Analytics Group event is hosted by eSage Group.

Matt and his Applied Analytics team are tasked with informing several key components of the movie making and marketing process.

The following is a list of the key discussion points and learnings from Matt’s presentation.

  • Informing Creative: Evaluate movie concepts, cast, themes and fan base long before a single dollar is spent on movie production.
  • Transforming Marketing: Use analytics to identify, understand, reach, and persuade individuals to watch a particular movie.
  • Understanding People, Content, Social and Conversation: Create a virtuous feedback loop where these four inputs are integrated into the overall marketing process to provide a continuously improving understanding of your audience over time.
  • Identify Varying Degrees of Persuadable People: Identify three clusters of persuadable personas with varying degrees of predictability with regards to convincing them to attend a particular movie.
  • Innovative Experimentation that Yields Big Wins: Movies that aren’t positively received in one part of the world may perform well in other regions. Use experimentation to test market acceptance of a particular movie in other regions before making further marketing investments.

Read on to hear more about how Matt and his team are shaking apart the old Hollywood Paradigms and creating a truly data-driven movie making environment at Legendary.


Full Article


On April 4th, at the monthly L.A. Marketing Analytics Group, which is hosted by eSage Group, we had the pleasure of hosting Matt Marolda, Chief Analytics Officer at Legendary Entertainment. At the event, Matt presented on Disrupting Hollywood Paradigms with Analytics, and we were treated to some very interesting and thought-provoking innovations that Matt and his team are working on at Legendary to uncover insights about the mainstream box office audience in the US and worldwide.

Hollywood has many marketing paradigms that have been entrenched for decades; some are good, others beg to be disrupted. One stale marketing tenet that is being reevaluated and shifted to something more useful is how Hollywood segments the market using the often-cited four quadrants: male, female, over 25, and under 25; totaling 4 groups of 80 million people in the US, which is probably one of the crudest ways to look at the marketplace. Legendary is changing this old practice by looking at ways to see an audience in a much more granular way, micro-segmenting this model into 80 million groups of 4, give or take – which seems daunting. But if you consider the top-down leadership effort at Legendary to drive a culture of data-driven marketing, it becomes more realistic. Two key efforts at Legendary that Matt and his team have been driving include Informing Creative and Transforming Marketing through applied analytics.

Informing Creative: All movie studios must decide which creative movie concept, cast, and theme should be green-lighted, and which fan base should be targeted, long before a single dollar is spent on making the movie. For much of the entertainment industry’s history, scripts have been written and submitted, discussed and debated, picked up and dropped. This process includes top-level executives deciding whether the movie is going to be viable, profitable, and on brand for the studio.

Many of those decisions in the past were made by gut instinct, and then hope, prayer, chanting and the burning of incense were involved (at least in my version of this). With data collection and analytics being injected into the creative decision process, decision makers are shifting to a more data-driven approach when deciding to make a film. Present-day studio brass at Legendary now utilize analytics (by way of Matt’s Applied Analytics team) to evaluate movie concepts and the cast, screen fans ahead of time, test the theme, and even look at sequel viability in order to inform the creative process and uncover the potential ROI in a very predictive way. This includes looking deeply at People, Content, Social and Conversation, which I’ll talk more about in a bit. Legendary Entertainment CEO and visionary Thomas Tull sought a better way when he launched the company in 2000. As of 2017 Tull is no longer CEO, but his decision to hire Matt Marolda in 2013 to take Moneyball to Hollywood remains firmly intact.

Transforming Marketing: Legendary shifted to an internal analytics team tasked with altering the marketing process from an older model sadly referred to as “spray and pray,” where marketing dollars are spent in volume to reach all people with the hope that they will show up at the box office. The paradigm shift is toward a more targeted approach that seeks to identify, understand, reach, and persuade individuals. Beyond that goal is the effort to reduce marketing spend by 20% through the strategic targeting of specific audience segments. That reduction in marketing spend directly affects the bottom line of movies on a global scale. Legendary is doing this by maintaining an analytics team comprised of three groups: the Quantitative Group, the Development Group and the Delivery Group.

It’s the Quantitative team’s job to tap tons of data and apply it to the movie making and marketing processes directly using analytics. The Development team is tasked with integrating complex data feeds rapidly, turning the models of the Quant team into software, building high-speed querying to allow iteration and refinement, and launching media feeds through APIs. Finally, it is the Delivery team’s job to execute media buys on all digital platforms; for Legendary, media agencies were unable to deliver at the speed and scale required to be competitive in this fast-moving effort to turn insights into marketing action targeting the segments uncovered earlier in the process.

Successful marketing analytics teams everywhere strive to experiment, learn, “fail” fast, and reiterate the ongoing process in an unending feedback loop. This agile approach to data driven marketing continues to prove that trying things that might have failed in the past, or those that weren’t ever tried, can still yield big wins. This “panning for gold” approach either yields an answer confirming that one consumer messaging approach or another did not work, or in other cases it is uncovering novel and successful messaging that would not have been conceived of otherwise. Therein lies the paradigm shift from overspending and gut instinct marketing on a grand scale, to the highly targeted and strategic approach that not only saves marketing dollars, but imparts analytics that allow organizations to make their current marketing spend far more productive.

People, Content, Social & Conversation: Looking at how competitive and successful movie companies like Legendary Entertainment approach data collection, and the insights that are surfaced, is a great example of how to stay competitive in any marketplace. This innovative approach to analytics is apparent in Matt’s Applied Analytics team, which uses a virtuous cycle where four specific areas of data are integrated in unique ways. They collect and analyze information on People, Social, Content, and Conversation.

 

  • People Information: Includes 1.5 billion email addresses, 200 million households with PII (Personally Identifiable information) and hundreds of other attributes per person sourced from a multitude of available data. The Applied Analytics team at Legendary has curated this data set internally to enable them to identify, reach, understand, and persuade people.
  • Content Information: Includes box office data by individual theater for all movies from 2007, all US-based advertising since 2007, and metadata by second for movies from the ’80s to the present day that identifies actors in each scene, topic of conversation, and tone & tenor of background music. These pieces of information are all crucial for comp’ing and modeling in the property evaluation process – which determines properties to buy, which movies to make, and which actors will be in them.
  • Social Media Information: Includes feeds from 500 million Twitter profiles and billions of tweets, 100 million Facebook profiles, all of Reddit, all of Wikipedia, plus thousands of News Sites and Blogs are scoured, collected and stored for insights.
  • Conversation Information: Includes social interactions by geography, plus extensive analysis on text and images using traditional and innovative techniques to help inform movie makers and marketers. This data helps Legendary understand current trends, what people are saying about particular movies along with their associated sentiment.

Using tools to analyze People, Content, Social and Conversation has helped Legendary build audience profiles, create hundreds of microsegments, identify key persuadable points and produce detailed, actionable insights. Acting on this data by matching profiles to key aspects of a movie surfaced from movie metadata informs Legendary on which movie to promote and to which persuadable audience. Changes of intent are also measured, and creative actionable insights emerge at a rate of over 50 per movie, per week. This analytics ecosystem, which integrates data, analytics, and campaigns, along with deep API integration, allows for execution at scale, sophisticated reporting and continual optimization.

Varying Degrees of Persuadable People: When trying to identify persuadable sets of personas, Legendary looks at three clusters of people who hold varying degrees of predictability with respect to the ability to convince them to attend a particular movie.

First, there are folks that have been engaging with a property since childhood. Movies like Godzilla, King Kong and Warcraft have a historically long-standing culture of followers. Legendary seeks to turn those people into active evangelists. Marketing to them does not require a high-dollar spend because of their inherent affinity for the subject matter. Simply advertising key points like hints at trailers to be dropped and the dates of movie release are all that this group needs. The marketing effort is just to “stoke the flame” for this group.

Another cluster of people is identified as the “your mom” segment.  With this segment Legendary understands that there are just certain movies that they will not attend.  So there is no need to spend marketing dollars on this group when marketing a certain genre of movies.  Legendary will save these marketing dollars for other groups that are more apt to engage.

The third identifiable cluster is defined as the “people in the middle.” This group, when served the right creative at the right time, will change their mind and their opinion of the film. In May of 2014, Legendary ran some testing on the Godzilla release. They found, unexpectedly, that women ages 24 to 36 were persuadable. This was a key insight; Legendary didn’t think this segment would be a viable target when they were whiteboarding their marketing effort. They then used analytics to design the movie trailer around the insights uncovered about this new segment. Those insights showed them that they needed to emphasize the conspiracy theory in the Godzilla plot rather than the monster destroying the city. They also knew to emphasize Bryan Cranston, who was fresh off Breaking Bad. The trailer they created resonated with women, and they subsequently launched an extensive marketing campaign around this knowledge, while continuing to further optimize their marketing for these segments.

Uncovering surprising and unexplainable segments like this has proven to be extremely valuable. Establishing a culture of experimentation and building feedback loops into your strategy allows for these kinds of powerful insights. Once a new segment is identified, predictive models are built to identify who is likely to engage with particular content. With propensity scores, Legendary can zero in on targets as they get closer to the release, whereas other studios tend to get more panicked and market to a broader audience. Legendary can go narrower, targeting only the specific people they think will respond to a particular marketing piece. When thinking of the world on a CPM basis, this is a better way to go. Plus, this can all be done at scale. The same process works for blockbuster films or much smaller projects, with both receiving lift. With this lather, rinse and repeat methodology, these efforts all continue to inform and allow for continual optimization. Data is collected on an ongoing basis and appended back into the database, so the system continuously learns which approaches work and which don’t. The system is continuously getting smarter and providing ever greater ROI over time.
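
To illustrate the propensity-score idea in the abstract, here is a toy sketch with made-up features; it is not Legendary’s model, just the general pattern of scoring an audience and keeping the most persuadable targets.

```python
# Illustrative sketch only -- not Legendary's actual models. A propensity score here is simply the
# predicted probability that a person in the "middle" segment engages with a piece of creative.
import pandas as pd
from sklearn.linear_model import LogisticRegression

people = pd.DataFrame({
    "watched_trailer":   [1, 0, 1, 1, 0, 0, 1, 0],
    "follows_franchise": [0, 0, 1, 0, 0, 1, 1, 0],
    "age_25_34":         [1, 1, 0, 1, 0, 0, 1, 1],
    "engaged":           [1, 0, 1, 1, 0, 1, 1, 0],   # past response to similar creative
})

model = LogisticRegression().fit(people.drop(columns="engaged"), people["engaged"])

# Score a fresh audience and keep only the most persuadable targets as release day approaches.
audience = people.drop(columns="engaged")
scores = pd.Series(model.predict_proba(audience)[:, 1], name="propensity")
targets = scores[scores > 0.6].sort_values(ascending=False)
print(targets)
```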

A Third Paradigm Shift: A third and final example of how Legendary is disrupting the old Hollywood paradigms can be seen in their effort around the release of Warcraft. The movie was tested in China, with data being collected on 8 different versions of the movie from hundreds of test subjects using biometric, Fitbit-like devices to track heart rate, blood pressure, oxygen levels, and more. iPad-like devices were also used to capture facial expressions during the viewings. These experiments provided information on how viewers engaged with the movie in several ways.

This data ultimately revealed that a movie that was “panned” by the critics and dubbed awful in the United States would in fact resonate well with the Chinese audience.  The movie was launched globally and was a box office success in China grossing over $220 million.  This was a record in this geo and the movie ended up with total global revenue of over $433 million.

In Closing:  The analytics concepts being harnessed by the team at Legendary are innovative and cutting edge and might just prove to Hollywood once and for all that analytics are here to stay and not just a passing fad as some have said (as they were packing up their belongings and seeking a new line of work).  Those of us who seek deeper knowledge of our customers and processes in business know that if you aren’t competing with analytics, you’re not competition.

Many of the efforts at Legendary are reminiscent of the work we do at eSage Group. For large entertainment companies and several other verticals, we too help our customers obtain deeper insights through data integration and analytics. If your marketing organization is looking to migrate from the Spray and Pray digital marketing approach to a modern and efficient Marketing BI/Analytics infrastructure, we at eSage Group can assist in that process in a myriad of ways.  We help our clients integrate data from a multitude of various sales, marketing, partner and social channels, no matter the format of the data or the system it’s sourced from.  We design and build Marketing Analytics Data Warehouses, Data Lakes, Data Marts and connect newly integrated data to all your favorite reporting and analytics tools.  We are equally comfortable with on-prem and cloud based traditional data warehouse technologies such as Microsoft SQL Server, as well as most of the modern Big Data/Hadoop based platforms.

Our goal at eSage Group is to assure that marketing teams are obtaining useful and highly actionable insights from all internal and externally collected sales and marketing data. For more information on eSage Group and how we help our clients, please visit our website or email me directly.

Variety Big Data Summit 2016 and the Ubiquity of Data in all Things

Takeaways from the Variety Big Data Summit 2016

By Rob Lawrence – Southern California Business Development Manager, eSage Group


Attending the Variety Big Data Summit for three years in a row now has always been a rewarding and insightful endeavor for me. My understanding of the data landscape in entertainment, and really business in general, has grown exponentially over the years I’ve worked in business development for a data integration and business intelligence consultancy. What I’ve come to realize over the years is that I’ve witnessed the rise of a concept that really had already existed for a very long time: Big Data. I’ve seen the widespread adoption and use of Big Data (or at least the hope to be using it someday, in some cases). Reflecting on the things I learned at this last Summit earlier this month, it occurred to me that the ubiquity of Big Data is here, finally! Or has it always been here? OK, let’s explore what I mean.

Taking a look at the topics discussed at the day-long event is reminiscent of any collegiate marketing or business major’s syllabus for a given semester: “Explore data’s growing impact on media and entertainment spanning content creation, audience measurement, monetization, marketing, asset management and more.” Subtract the overall look at entertainment specifically, and what you have here is a study in data (information) for the purpose of creating, measuring, monetizing, marketing, and managing your business. Whatever business you are in, you either do these things better than your competition, or you fail. The deeper-dive panel discussions at Summit were a good reminder that data is challenging the way business has been done in the past. We looked at important topics such as Reaching a Global Audience, Engaging Relevant Audiences, People-Based Marketing, Sports & Data, Privacy & Security, The New Video Economy, Data Science, and Data & Creativity. All of these topics have a profound effect on the entertainment and media business, and most should be equally important to non-entertainment verticals. Data is ubiquitous and we’ve been collecting it for years. The sense I have is that businesses are just now beginning to realize the benefits of what we’ve been talking about since the inception of the Big Data movement, and going forward this journey will never really end.

One of the panel discussions at Summit struck me as a very important commentary on the benefits of using data to drive demand, engagement, and understanding of the customer journey: The Rise of People-Based Marketing. This topic is new only in the sense that we are beginning to bring all of the pieces of the puzzle together in marketing, and some new pieces of that puzzle have emerged. I liked what Sean Moran, Head of Marketing & Partner Solutions, at Viacom had to say regarding this: “It’s cliché to say that we’re in a more evolved state in marketing than ever in our history. We’ve talked about it for a long time, but before now it was never actualized. Such rapid exchange and pace of what data can be captured during a behavior revolution. It’s all coming together. We’re connecting in a targeted way in which the ability to get targeting and predictability at scale in TV and extending it to other platforms, so you can predict the consumer’s behavior by fusing data together through viewing, geo-location as well as purchase intent. Emotional connection plays into this as well. The excellence of data can only take you so far, you have to understand the emotional connection of consumers. Some are jumping on that, others see data as a way of sourcing up the currency that’s been happening since 50 years ago.” So, in theory, the ability to collect data and the ever increasing data points that are collectible are truly giving companies the opportunity to tap into the consumers’ emotional state at the most critical points along the customer journey. We might predict what the consumer is feeling before they realize what they are feeling. However, and to counterpoint this concept, I also liked what Andrew Appel, President and CEO at IRI had to say with regard to the fragmentation of all these data points: “Yet nobody really has the capability to do that (use the data with a level of sophistication), so while the fragmentation is exploding, the ability to get all those different data sets into one place (shopper data, media data, consumer data, context data, purchase intent, television viewing habits, exposure to advertising, etc.) at an individual level to effectively score each and every consumer in real time, on what will drive their behavior change, what you should use to target them, and ultimately measure whether or not it changed their behavior, is ever more difficult.” While I agree with Andrew that it is ever more difficult, I also believe companies are getting savvier at doing these things, and thus the journey to marketing analytics nirvana continues.

It wouldn’t be right for me to finish my post without talking about Mobile: mobile in the sense that everyone is connected, everywhere they go, regardless of the device at the crux of the matter. This represents a very intriguing opportunity for businesses, and I suspect it will only continue to grow ever more intriguing as we get better at things like attribution, segmentation and personalization, and as technologies like VR begin to take hold.

On that, I agree with Eric Smith, Industry Manager, US Entertainment, at Facebook, who said: “Facebook has embraced mobile for consumers. For consumers it is known that 1 out of 5 Mobile minutes spent is either on Facebook or Instagram. Facebook has a very personal connection with people as it creates 1.8 billion personalized experiences for people every month given their active user base. Mobile is one of the key doorways to personalization. These are big numbers, but that is how you get to scale. The scale of personalization. A couple of things Facebook has worked hard at are: a strategy that they’ve built for their Partners called Test Learn and Act, where we understand who was interested in their content or the game. A great example is with Ubisoft on a game they are launching in the Tom Clancy series. We understood with them that there are three really important segments within their user base (why people want to purchase and play their games): there’s a group that’s really interested in the technology and strategy and gadgets and gizmos, there’s a group that is really into the adrenaline and competition, and there’s a group that’s just really fascinated by the open world play where they can go and explore a universe that is unlike the one we live in in real-time. So Facebook worked with Ubisoft to understand these audiences and then to build a creative doorway: a 5-second bit of content that they laid on top of the trailer for their game that really appealed to those segments. We showed a lot of the tech, gadgets and gizmos to that audience, and so forth. We saw a 63% increase in Purchase Intent amongst those audiences by properly segmenting and then speaking to them in the right creative language – we’ve applied that to movies as well. Sony (where Eric used to work) is now a client of Facebook’s. Working together on the Sony film Money Monster, using this same strategy, they found that men 25 to 34 were an audience they hadn’t expected to be interested, but very much were. So Facebook worked with Sony again to help create some creative that spoke to that audience, targeting clusters that would reach those men, 25 to 34. As a result, they found in the exit polls there was a strong demographic of them that actually went to the film. This is the Intersection of the Content and Mobile Marketing Personalization experience.”

Well said, Eric. None of which could be done without the proper collection of data, not just from mobile, but from all of the other myriad places we are able to pull from, and those are only growing exponentially with new platforms, technologies, and ways to consume content. Each new form is representative of a whole new world of available data points around the customer journey: The Ubiquity of Data in All Things!

In conclusion, I’d like to say that the Variety Big Data Summit 2016 was a relevant and successful event, as it has been in past years. I’d recommend anyone working in Entertainment, or in a business that is striving to be more Data Driven, to attend. I’d also say that the attendees at this event each year prove to be highly engaged, many of them thought leaders in their own right, and thus the networking is a blast. From a “Big Data” perspective, it is clear to me that the use of data for all things in business has a very broad surface, which has only been slightly scratched. It’s an exciting time for marketers, content creators, and businesses who have honed a skill around experimenting with, and utilizing, data in new and innovative ways. Now, can we get back to referring to the practice of collecting and harnessing data as something other than “Big Data”? Perhaps we just go back to calling it regular old, good old-fashioned DATA.

The Future of Enterprise Analytics

Over the last couple weeks since the 2016 Hadoop Summit in San Jose, eSage Group has been discussing the future of big data and enterprise analytics.  Quick note – Data is data and data is produced by everything, thus big data is really no longer an important term.

eSage Group is specifically focused on the tidal wave of sales and marketing data that is being collected across all channels, to name a few:

  • Websites – Cross multiple sites, Clicks, Pathing, Unstructured web logs, Blogs
  • SEO –  Search Engine, Keywords, Placement, URL Structure, Website Optimization
  • Digital Advertising – Format, Placement, Size, Network
  • Social
    • Facebook – Multiple pages, Format (Video, Picture, GIF), Likes (now with emojis), Comments, Shares, Events, Promoted, Platform (mobile, tablet, PC) and now Facebook Live
    • Instagram – Picture vs Video, Follows, Likes, Comments, Reposts (via 3rd Party apps), LiketoKnow.it, Hashtags, Platform
    • Twitter – Likes, RT, Quoted RT, Promoted, Hashtags, Platform
    • SnapChat – Follows, Unique views, Story completions, Screenshots.  SnapChat to say the least is still the wild west as to what brands can do to engage and ultimately drive behavior.

Then we have Off-Line (Print, TV, Events, etc.), Partners, and 3rd Party Data. Don’t get me started on International Data.

Tired yet?


While sales and marketing organizations see the value of analytics, they are hindered by what is accessible from the agencies they work with and by the difficulty of accessing internal siloed data stored across functions within the marketing organization – this includes central corporate marketing, divisional/product groups, field marketing, product planning, market research and operations.

Marketers are hindered by limited access to the data and by the simple issue of not knowing what data is being collected. Wherever the data lies, it is often controlled by a few select people who service the marketers and don’t necessarily know the value of the data they have collected. Self-service and exploration are not yet possible.

Layer on top of this the fact that agile marketing campaigns require real-time data (or at least close to real time) and accurate attribution/predictive analytics.

So, you can see there are a lot of challenges that face a marketing team, let alone the deployment of an enterprise analytics platform that can service the whole organization.

Now that I have outlined the business challenges, let’s look at what technologies were mentioned at the 2016 Hadoop Summit that are being developed to solve some of these issues.

  • Cloud, cloud, cloud – lots of data can be sent up, then actively used or sent to cold storage on or off prem. All the big guys have the “best” cloud platform
  • Security – divisional and function roles, organization position, workflow
  • Self-Service tools – ease of data exploration, visualization, costs
  • Machine Learning and other predictive tools
  • Spark (see the sketch after this list)
  • Better technical tools to work with Hadoop, other analytics tools and data stores
  • And much more!  
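
As a flavor of where tools like Spark fit, here is a small PySpark sketch that pulls two siloed marketing feeds out of cloud storage and joins them into one view. The paths and column names are hypothetical, and the cluster is assumed to be configured for s3a access.

```python
# Sketch: joining siloed marketing data with Spark (PySpark; paths and column names are hypothetical).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("marketing-analytics").getOrCreate()

web = spark.read.json("s3a://marketing-lake/web-logs/2016/")        # unstructured web logs
social = spark.read.csv("s3a://marketing-lake/social/facebook/",    # agency exports
                        header=True, inferSchema=True)

# One joined view across silos, ready for self-service exploration or ML.
joined = web.join(social, on="campaign_id", how="left")
joined.groupBy("channel").count().show()
```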

Next post, we will focus on the technical challenges and tools that the eSage Group team is excited about.

Cheers! Tina

 

 

 

Get Marketing Insights Fast Without a Data Hostage Crisis

The landscape for marketing analytics solutions is more cluttered than ever, with multiple options and approaches for marketing departments to consider. One option that we are seeing more and more of is a seductive offering that promises a simple, fast, nearly turnkey approach to getting analysis and insight from your growing stacks of data. The offer is this: a vendor will import your data to their systems, do analysis on it with their in-house experts, and come back to you with insights that will help you run your business better.

No doubt, this is an attractive offer if you are like many marketing organizations, struggling to get internal resources to help consolidate data and do the analysis required to get you the insights you need. Business Intelligence resources are hard to find in your company, and the data holders in IT are backlogged and short-staffed. You need insights now to help engage and sell to your customers, and you are done waiting on internal resources, so why not go this route? While this is likely a quick, tactical solution that will get you answers in the near term, there are several major drawbacks to it as a longer-term strategy.

Market leading organizations know that their data is a significant asset that, when used well, can help them better understand and engage their customers, anticipate customer needs, cross sell, upsell, and stay ahead of the competition. As part of making data a core competency, your organization has to do the hard work to intimately know its data, its strengths, its shortcomings, and understand what it can tell you about your business.  That intimate understanding of data only comes from digging in, “doing the homework,” investing in the infrastructure and skillsets to excel at business intelligence inside the organization.  Organizations that have this kind of understanding of their data are continually improving the quality of data in their organization and building the kind of sustainable internal BI capability that actually adds significantly to the value and sustainability of the company.  C-suite, take note!

If you outsource that knowledge, you may get the answers you seek fast, but you do not get the sustainable, growing capability in-house that becomes a core differentiator for your company and helps you lead the market. I’m amazed when I hear this, but it is a very common practice. What if your vendor company goes out of business, gets acquired, changes business models, or you decide to change vendors? Your vendor is holding your data hostage. What are you left with then? All the money you spent bought you yesterday’s insights, but you have no investment or capability towards the future. Your team has none of the knowledge or infrastructure to sustain and continue to grow the flow of business intelligence that is critical to serving your customers and staying ahead of your competition. You are back to zero.

Fair enough, you say, but damn it, I still need insights now and I can’t wait any longer. Tactical and non-sustainable is better than nothing, right? Well, consider that it doesn’t have to be an “all or nothing” approach. There is a way to get fast and sustainable. You can start with a partner who gets you to the critical insights you need now, but does it on your systems, building out infrastructure you own (be it in the cloud on your behalf or on premises), and helping mentor your team members along the way. You may spend a little more along the way to do this, but in this approach you are investing, not just paying a monthly fee with no incremental addition of value to your company. Very quickly you will be way ahead.

If the vendor you pick, in this case, goes out of business, moves on, or you decide to part ways, there may be some short term pain, but you own the assets, data, and business logic they built and you have team members who have been working directly with the technology and data, “doing the homework”, and can keep you moving forward.  Nobody has your data held hostage.

The right choice for a vendor should:

  • Have deep experience utilizing the cloud to get you up and running fast, with limited need for hardware purchase and support. The cloud is great, but make sure it is your cloud, not someone else’s.
  • Work with you to understand your unique needs, data, internal team skills and challenges, and creates a roadmap to Business Intelligence ROI internally.
  • Provide all the senior BI talent you need now to get answers fast, but also help you grow that skill in house, with training, new employee interviewing and ongoing mentoring.  They need to have a demonstrated understanding that knowledge transition to your team is part of the deliverable and be committed to providing it.

Pick a partner who can help you avoid having your data taken hostage, while getting you the insights and ROI you need fast!

Written by Duane Bedard, eSage Group President and Co-Founder