The Great Data Migration – Part One

I don’t care who doubts our ability to succeed, as long as it’s nobody on my team.
– Kobe Bryant, Los Angeles Lakers Guard

Prepare For Takeoff

Everyone, these days, is jettisoning on-premises storage and sending their data to the cloud. The reasons are varied, but generally come down to two factors: time and cost. Cloud storage, from any of the major providers like Amazon or Microsoft, can cost less than $0.02 per GB per month. Compare that to Apple’s revolutionary magnetic hard drive that debuted in 1981: it had 5MB of storage and cost $3,500, which works out to over $700,000 per GB. OK, there was no monthly fee, but I digress. 😉 Time is usually how long it takes to get a new server, file share, or document repository installed in your corporate headquarters vs. simply storing new data in Amazon S3 or Microsoft Azure. Or perhaps it’s the amount of IT resources needed to keep depreciating and outclassed data centers up and running.
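The per-GB comparison above is simple arithmetic; here’s the math as a quick sketch (both figures come from the paragraph above):

```python
# Back-of-the-envelope comparison of the storage prices quoted above.

def price_per_gb(total_price_usd, capacity_mb):
    """Cost per GB for a one-time hardware purchase (1 GB = 1024 MB)."""
    return total_price_usd / (capacity_mb / 1024)

# Apple's 1981 drive: 5 MB for $3,500
drive_1981 = price_per_gb(3500, 5)   # well over $700,000 per GB

CLOUD_RATE = 0.02                    # rough modern $/GB-month cloud rate
print(f"1981 drive: ${drive_1981:,.0f} per GB (one-time)")
print(f"Cloud:      ${CLOUD_RATE} per GB per month")
```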

There are many advantages to cloud storage, which won’t be rehashed here. If you need a refresher (or convincing), this site may come in handy:

For the moment, let’s assume that you have decided to move your data to the cloud. This article will help you decide where to move it, how best to do so, and an ideal way to keep it updated and fresh.

Where To Go?

In today’s cloud landscape, there are two dominant players: Amazon and Microsoft. There are others, such as Google, but Amazon Web Services (AWS) and Microsoft Azure hold the keys. In addition to storage, both offer services such as virtual machines, caching, load balancing, REST interfaces, web hosting and more, which can handle your other applications should you need to migrate them to the cloud in the future. There are pros and cons to each, but both will handle your data securely, provide timely and cost-effective access, and transparently maintain ready-to-use backups in case of unforeseen events. Let’s break them both down:

AWS S3 (Simple Storage Service) is, as the name states, pretty simple. It has a free tier with 5GB of data and then breaks down into three classes: Standard, Infrequent Access (IA), and Glacier. If you just need to stash old data in the cloud and have no idea how it will be used in the future, use IA or Glacier for extremely cheap storage. Glacier is only $0.004 per GB vs. Standard at $0.023 per GB per month (US West Region). The trade-off with Glacier and IA is that it takes a little longer to get at the data you want to use, anywhere from a few minutes to several hours. Data can be moved between the Standard, IA, and Glacier tiers so, for instance, those old application logs that no one was using can quickly be made available for reporting when needed.

Standard storage is what most people use for quick access to data. For the first 50TB per month, the price is $0.023 per GB per month (US West Region). Anything can be stored here, such as images, binary files, text data, etc. AWS S3 uses “buckets” to contain data, and each bucket can hold an unlimited number of objects. Each object within a bucket is limited to 5TB in size. For a breakdown of AWS S3 pricing, go here:

We’ll discuss how to migrate data to S3 a bit later. For now, know that access to your S3 data is through the AWS web console and a variety of APIs, such as the AWS S3 API, the s3:// or s3n:// file protocols, etc. AWS S3 is secure by default, with only the bucket / object creators having initial access to the data. Permission is granted via IAM roles, secret / access keys, and other methods that are out of scope for today. A good primer for S3, including security, can be found in the S3 FAQ:
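As a sketch of how those tiers get used in practice, here’s a hypothetical helper that picks a storage class by access recency and uploads with boto3 (the bucket and key names are made up; credentials are assumed to come from the environment or an IAM role):

```python
# Sketch: choosing an S3 storage class by access pattern, then uploading
# with boto3. Bucket and key names below are hypothetical.

def choose_storage_class(days_since_last_access):
    """Map access recency to the S3 tiers discussed above."""
    if days_since_last_access < 30:
        return "STANDARD"       # frequently used data
    elif days_since_last_access < 180:
        return "STANDARD_IA"    # infrequent access, cheaper per GB
    return "GLACIER"            # archival; retrieval takes minutes to hours

def upload(path, bucket, key, days_since_last_access):
    import boto3                # credentials from the environment / IAM role
    s3 = boto3.client("s3")
    s3.upload_file(
        path, bucket, key,
        ExtraArgs={"StorageClass": choose_storage_class(days_since_last_access)},
    )

# Example (requires AWS credentials and a real bucket):
#   upload("app.log", "my-archive-bucket", "logs/2017/app.log", 365)
```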

Azure Storage has a lot more options than AWS S3. This can be confusing at first, but it also offers the most in terms of flexibility and redundancy for your data. There is a small free tier, and as a counterpart to AWS Glacier, Azure Storage offers “Azure Cool Blob Storage” for your archival, compliance, or other “don’t use but can’t throw away” data. Prices can be less than $0.01 per GB per month in some regions.

Unlike S3, Azure Storage comes in several flavors of redundancy, so one can choose how many backups of their data exist and how easily they are accessed. If you have easily replaceable data, say from a third-party API or data source, then choose the cheaper LRS (Locally Redundant Storage) option, which keeps copies of your data within a single Azure data center. Need a durable, always-available, “a crater can hit my data center yet I’m still OK” option? Then RA-GRS (Read-Access Geographically Redundant Storage) is the preferred option. This ensures that copies of your data are also maintained at a second data center hundreds of miles away, yet always available for easy access. Middle-ground options, along with hot and cool access tiers, exist as well. For a breakdown of Azure Storage pricing, please visit here:
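As a sketch, the redundancy choice above maps directly to the SKU name you pass when creating a storage account (Azure uses a “Standard_&lt;replication&gt;” naming convention); the helper and account names here are hypothetical:

```python
# Sketch: mapping the redundancy options above to Azure storage account SKUs.

def choose_sku(data_is_replaceable, need_read_access_during_outage=False):
    """Pick a replication SKU for a new storage account."""
    if data_is_replaceable:
        return "Standard_LRS"    # copies kept within a single data center
    if need_read_access_during_outage:
        return "Standard_RAGRS"  # geo-replicated; secondary region is readable
    return "Standard_GRS"        # geo-replicated; secondary used for failover

# The chosen SKU is then passed at account creation, e.g. with the Azure CLI:
#   az storage account create -n myaccount -g mygroup --sku Standard_RAGRS
```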

Note: AWS S3 is functionally equivalent to Azure Storage GRS (Geographically Redundant Storage), so use this option when comparing prices.

Azure Storage uses “containers” instead of buckets like S3; containers hold blobs, while a storage account can also hold tables and queues (discussed below). There is no limit to object size, yet each container can “only” hold 500TB. However, there can be multiple containers per storage account. Access to data is through the Azure Portal, Azure Storage Explorer, PowerShell, the Azure CLI (command-line interface), and other APIs and file protocols.

Aside from regular blobs, there are a couple different types of data that can be stored in containers: tables and queues. Think of both as convenient layers laid over the raw data, to ease read and write access for different applications and scenarios.

Azure Storage Tables (AST) are essentially a NoSQL key-value store. If you’re not sure what that is, then you likely don’t need it. 🙂 NoSQL data stores support massive scalability, and the dataset and server sharding that is normally necessary (and a headache) for this is handled for you. AST, like other NoSQL datasets, supports a flexible schema model, which allows one to keep customer data, application logs, web logs, and more – all with different schemas – in the same table. Learn more here:
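Every AST entity does need two keys, PartitionKey and RowKey, which together determine lookup speed and sort order. Here’s a hypothetical sketch of a key scheme for the application-log scenario above (field names and the partitioning choice are assumptions, not the only way to do it):

```python
# Sketch: a PartitionKey / RowKey scheme for keeping application logs in an
# Azure Storage table. Fields beyond the two keys are free-form, which is
# the flexible-schema behavior described above.

from datetime import datetime, timezone

def log_entity(app_name, message, level="INFO", when=None):
    """Build an entity dict ready for the table service's insert APIs."""
    when = when or datetime.now(timezone.utc)
    return {
        # Partition by app and day: "all logs for app X on day Y" becomes a
        # fast single-partition query, and partitions stay a manageable size.
        "PartitionKey": f"{app_name}-{when:%Y%m%d}",
        # Inverted seconds-resolution timestamp so newer rows sort first
        # (row keys sort lexicographically within a partition).
        "RowKey": f"{10**11 - int(when.timestamp()):011d}",
        "Level": level,
        "Message": message,
    }
```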

Azure Storage Queues (ASQ) provide cloud-based messaging between application components. Having a central messaging queue is critical for applications and parts of applications that are decoupled and need to scale independently of one another. This would likely only be needed if you have applications which currently store message data on-premises but need to be migrated to the cloud. The size of each message is limited to 64KB, but there can be an almost unlimited number of messages (up to the container limit). Learn more here:
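That 64KB limit is worth enforcing before you enqueue; a common workaround for larger payloads is to store them as a blob and enqueue only a pointer (the “claim check” pattern). A minimal sketch, with hypothetical payloads:

```python
# Sketch: guarding the 64 KB queue message limit mentioned above before
# enqueueing anything.

import json

MAX_QUEUE_MESSAGE_BYTES = 64 * 1024

def to_queue_message(payload: dict) -> str:
    """Serialize a payload, refusing anything over the queue's size limit."""
    body = json.dumps(payload)
    if len(body.encode("utf-8")) > MAX_QUEUE_MESSAGE_BYTES:
        raise ValueError("payload exceeds 64 KB; store it as a blob "
                         "and enqueue a reference instead")
    return body
```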

Another option unique to Azure is the ability to link your corporate Windows Active Directory or Office 365 Active Directory with Azure Active Directory (Azure AD). This feature, called Azure AD Connect, allows SSO (single sign-on) between on-prem and cloud-based applications and services. It is easy, for example, to quickly set up permissions and roles that manage access to essential services and storage across your organization.

For a rundown on security and encryption options, please visit:

This is great for raw storage, but now what about my DATABASE?

Almost every organization has relational data. While this can be extracted and placed into raw storage, it’s often easier to just lift it entirely into the cloud and go from there. Both Amazon and Azure platforms support numerous relational database hosting options, from Azure’s SQL Database to AWS’s Relational Database Service and many options in between. We’ll look at some of them here:

AWS Relational Database Service (RDS) allows easy deployment and scaling of relational databases in the cloud. It frees one from the hassle of managing servers, patching, clustering, and other IT-heavy tasks. It supports six different flavors of database: Amazon Aurora, PostgreSQL, MySQL, MariaDB, Oracle, and Microsoft SQL Server. One unique option is the ability to offload read traffic to one or more “Read Replicas” and thus increase the availability and performance of your primary database instance. Database security differs depending upon your database flavor (some support encryption at rest, etc.), but RDS itself can be secured by being deployed within an organization’s AWS VPC (Virtual Private Cloud). In my opinion, AWS has a simpler approach to security than Azure, because more AWS services can be set up behind the VPC, which acts as a gateway to sensitive data and applications. Learn more about AWS RDS here:

AWS Redshift is Amazon’s data warehouse in the sky. Essentially a supersized PostgreSQL, it provides scalable, cost-effective SQL-based storage, with queries that can run against both S3 (via Redshift Spectrum) and Redshift itself. It stores data in a columnar fashion, giving fast query times over massive amounts of data. It might be overkill if your dataset is small, but if you have petabytes (or exabytes) of structured data that need analyzing quickly, Redshift can likely handle it. Start with AWS Redshift’s home page here:

AWS Athena is a newer service which attempts to blend the raw and relational data worlds together. Simply point it at an S3 bucket, define a schema, write your SQL query, and go. You only pay for the queries run on the raw data, and the schema definition can be reused with other queries, modified for another run, or simply tossed away when finished. Athena can also store the results back into AWS S3 or be used by another workflow to furnish data. By not having a permanent relational layer, data workflows and ETLs have fewer steps and fewer points of failure. Learn more about AWS Athena here:
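As a sketch of that “point, define, query” flow: the schema is just DDL over files already sitting in S3, and the query is plain SQL. Table, column, and bucket names here are hypothetical; queries are normally submitted via the Athena console or the boto3 start_query_execution API.

```python
# Sketch: defining a schema over raw CSV files in S3 for Athena to query.

def external_table_ddl(table, s3_location):
    """Build the DDL that points Athena at raw CSV data in S3."""
    return (
        f"CREATE EXTERNAL TABLE IF NOT EXISTS {table} (\n"
        "  request_time string,\n"
        "  status_code  int,\n"
        "  bytes_sent   bigint\n"
        ")\n"
        "ROW FORMAT DELIMITED FIELDS TERMINATED BY ','\n"
        f"LOCATION '{s3_location}'"
    )

ddl = external_table_ddl("app_logs", "s3://my-archive-bucket/logs/")
query = "SELECT status_code, COUNT(*) AS hits FROM app_logs GROUP BY status_code"
```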

Azure SQL Database is as simple as it gets: a SQL Server database as a service. No virtual machines or licenses to manage, no patching, lots of redundancy, and fast performance. One can pay per database or via elastic pools, which spread resources across many databases. Azure SQL Database is meant for small to medium-sized databases (up to 50GB) that can operate independently from one another, as in a multi-tenant application or reporting solution. If you have a lot of SQL data to migrate, it is a good idea to break it up and store it along date or business lines, or make the jump to Azure SQL Data Warehouse, a larger service meant for enterprise workloads (see below). Connections are made through a standard ODBC / JDBC connection string, with the URI as the service endpoint for your database.
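A sketch of that connection string, with hypothetical server, database, and credential values (the endpoint always takes the form &lt;server&gt;.database.windows.net):

```python
# Sketch: building the ODBC-style connection string mentioned above.

def azure_sql_connection_string(server, database, user, password):
    return (
        f"Server=tcp:{server}.database.windows.net,1433;"
        f"Database={database};"
        f"Uid={user}@{server};Pwd={password};"
        "Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;"
    )

conn = azure_sql_connection_string("myserver", "reportingdb", "dbadmin", "s3cret!")
# pass `conn` to pyodbc.connect(...), or use the equivalent JDBC URL from Java
```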

Keep in mind that this is not the same as full SQL Server. Since there is no real “server” involved, most system stored procedures (DBCC, etc.) and distributed transactions won’t work, and SQL Server peripheral services, such as SQL Server Reporting Services (SSRS) and SQL Server Integration Services (SSIS), are not included. These voids can, however, be filled by other services in the Azure stack, or by running a full copy of SQL Server in an Azure Virtual Machine (see below). Although a look at Azure analytics is out of scope here, you should know that Azure supports an entire range of analytical services which can consume data from Azure SQL databases. Learn more about Azure SQL Database here:

Azure SQL Data Warehouse (ASDW) is for enterprise-grade data warehouse deployments. Like AWS Redshift, one can scale compute and storage independently and prioritize analytical workloads or long-term storage. It also lets you pause the compute side entirely, turning ASDW into an archival data store when analytics aren’t needed. It leverages Microsoft’s PolyBase service, which allows queries to execute against other data sources, such as Azure Data Lake, Azure HDInsight, or even an on-prem SQL Server data warehouse. Unlike Azure SQL Database, Azure SQL Data Warehouse stores data in a columnar format, for maximum performance and scalability. Learn more about Azure SQL Data Warehouse here:

Rolling Your Own – AWS and Azure Virtual Machines

Of course, if you need advanced SQL Server features, such as SQL Server Analysis Services (SSAS), or want to run a completely different database type, you can always spin up a Virtual Machine (VM) and install the relational database software there. Many VM images from both AWS and Azure come with SQL Server, Oracle, or other software preinstalled, so all you need is your licensing information. Some images even include the cost of the database software, effectively renting the license to users for a monthly fee. This can be useful, for example, if you would like to try out the features of SQL Server Enterprise before making a full purchase. Virtual Machines are also useful when ETLs and data workflows also need to be migrated to the cloud, as the VM can simply host the software required to run them.

NOTE: When you go the VM route, you are usually responsible for hardware provisioning/formatting, software patches, service upgrades, and maintaining secure access (through firewall rules, etc.) to your system. A good pro/con comparison of Azure SQL Database vs. SQL Server on Azure VMs can be found here:

Now, how do I get it up there?

Alright, you’ve chosen your cloud platform and you know what data to move… or do you? How do you prioritize what goes and what stays? Stay tuned for The Great Data Migration – Part Two, where I’ll cover the next steps in lifting your data into the cloud.

Hope this helps and happy migrating! Feel free to email me at with any additional questions!

Sincerely,  J’son 


Executive Evening Out with eSage Group and Microsoft

On March 10th, eSage Group held its first Executive Evening Out at the exclusive Rainier Club in Downtown Seattle. The event was sponsored by Microsoft Advanced Analytics.


Fifteen Seattle-area executives, from the likes of Starbucks, Trupanion, Allrecipes, Alaska Air, and Disney, along with their guests, joined us for a short presentation by ShiSh Shridhar, Worldwide Director for Business Intelligence Solutions – Retail at Microsoft, then sat down to a five-course meal with wine pairings presented by The Rainier Club sommelier.

Microsoft has a powerful offering, from Azure Machine Learning and the Cortana Analytics Suite to SQL Server 2016 and Power BI. It was definitely a learning experience, along with a wonderful meal and wines.


Immediate Job Opening – Mexico Based Microsoft BI Stack Engineer

The eSage Group is a marketing data analytics firm established in 1998, headquartered in Seattle, Washington, USA. We have Fortune 500 clients including Disney, the LA Times, and Microsoft.

Our company is always on the lookout for talented developers at all levels in both Mexico and the US. We have worked hard to create a company culture of sharp, quick-learning, hardworking professionals who enjoy being part of a winning team with high expectations. As such, we hire self-motivated people with excellent technical abilities who also exhibit keen business acumen and a drive for customer satisfaction and solving our clients’ business challenges.

 We have a strong remote team in various locations in Mexico, including Monterrey, Aguascalientes, Mexico City, and Guadalajara. All employees work from home. All employees are full-time employees, not contractors.

We need mid-level software engineers (3+ years of experience) with experience in the Microsoft BI stack.

All candidates must have a strong interest in business intelligence and marketing analytics. They must be comfortable working with multiple client companies at the same time. They must have a strong desire to understand business problems and to integrate disparate data sources to gain insights that help solve those problems.

• Advanced English Skills both written and spoken
• Advanced Excel and SQL skills
• Strong past data analysis experience
• Good oral and written communication skills
• Extreme attention to detail
• Ability to produce high-quality, accurate deliverables
• Proven ability to work under pressure with deadlines
• Ability to learn quickly, follow direction, and execute tasks independently

Technical Skills
• SQL Server
Knowledge of databases, stored procedures, and writing T-SQL scripts. Should know about primary and foreign keys, indexes, and why they are important. Should know how to use temporary tables and something about the use of cursors within a script. Basic error handling within a script would also be nice.

• OLAP / Analysis Services
Ability to design and build an OLAP database within Visual Studio. Understand the following concepts: Data Source View, calculated measures, named set, referenced dimension, Measure Group. Able to deploy and process an OLAP database. Understanding of basic MDX and how it is different from SQL. Knows the difference between a set, a tuple, and a value.

• SSIS (Integration Services)
Experience designing SSIS packages that can extract, transform, and load data. Use of Data Flow and Script Task components. Able to use C# and variables within SSIS. Able to package and deploy SSIS components.

• C# / .NET Framework
This includes the ability to create custom classes, interfaces, and events. Understands concepts like inheritance, polymorphism, referenced assemblies, and delegates. Should be comfortable working with lists (.NET or custom) and familiar with the standard .NET types: arrays, dictionaries, lists, and structs. Knowledge of Generics is a plus.

Please email your resume to tinam (at) esagegroup (.) com and/or complete form below.


eSage Group is excited to be a part of the PSAMA MarketMix!

Big data offers a lot of promise and opportunity for improving the way we do marketing. As floods of data pour in from social media, mobile, weblogs, digital advertising, CRM, POS, etc., companies need to effectively store it and develop robust analytics to mine the data for knowledge. By gaining new insights, marketers can tailor their marketing message to provide customers with the most relevant information and better engage with them through the lifecycle. But how do we manage this data to make it truly usable? How do we avoid the perils that come with identifying and gathering the data, putting the analytics system in place, and getting the right people in place, so we can turn the data into actionable insights?

eSage Group’s very own Duane Bedard will lead a panel discussion on this and more at the Puget Sound American Marketing Association MarketMix on March 20th. Panelists include ShiSh Shridhar from Microsoft, Romi Mahajan from KKM Group, and Adam Weiner from Redfin.

ShiSh Shridhar is the Retail Industry Solutions Director at Microsoft and is responsible for strategy around Business Analytics, Big Data & Productivity Solutions for the retail industry. ShiSh has worked at Microsoft for the last 16 years across several groups and geographies and has a passion for empowering organizations through collaboration, knowledge management, and analytics. ShiSh contributes to retail industry magazines and blogs and maintains the retail industry Twitter presence for Microsoft: @msretail. He also regularly speaks at industry events. ShiSh loves working on innovative ideas and has a patent in the social media space. When he isn’t working, he sails and windsurfs the waters around Puget Sound. Follow ShiSh on Twitter at @5h15h.

Romi Mahajan is an award-winning marketer, marketing thinker, and author. His career is a storied one, including nine years at Microsoft, serving as the first CMO of Ascentium, a leading digital agency, and founding the KKM Group, a boutique advisory firm focused on strategy and marketing. Romi has also authored two books on marketing; the latest one can be found here. A prolific writer and speaker, Mahajan lives in Bellevue, WA, with his wife and two kids. Mahajan graduated from the University of California, Berkeley at the age of 19 with a Bachelor’s degree in South Asian Studies. He also received a Master’s degree from the University of Texas at Austin. He can be reached at

Adam Weiner is Vice President of Analytics and New Business at Redfin. He leads the company’s efforts to use its proprietary data to build new products for the web and improve its real estate services. He is also responsible for identifying opportunities for business growth that align with Redfin’s overall mission to reinvent the consumer experience of buying and selling real estate. Adam joined Redfin in 2007 on the product management team and was one of the pioneers of the Redfin Partner Program for agents, in addition to the company’s service provider directory, Redfin Open Book. Prior to Redfin, Adam worked at Microsoft in the SQL Server division for 5 years. Adam graduated from Stanford with a degree in Symbolic Systems and a concentration in Human-Computer Interaction. Follow Adam on Twitter at @adamRedfin.

You can still register for the event at!


Wow, is this really true? CMO Tech Spending Good For IT?

“Wow, is this really true?…

A recent article by Doug Henschen in InformationWeek (see full text below) mentions a prediction from Gartner that CMOs will outspend CIOs on technology by 2017. Even if this prediction is anywhere near true, it would be amazing, and it really shows the influence that digital marketing and associated analytics technologies are having on this space. Marketers are now truly learning the immense benefits of properly tracking and extracting value out of multi-channel marketing data. CMOs can now use technology to analyze which channels work best to attract specific customer segments and how one marketing channel leads to engagements on other related channels. So today you can track who is engaging on your social media channels and also navigating to your web properties to learn more about your products or make purchases.

What all this means, of course, is that marketers will need to form even tighter relationships with the IT teams at their organizations to bring all the pieces together. There will need to be hardware to host all the digital marketing tools, likely deployment of big data platforms like Hadoop to store and process all the data, then the hosting and maintenance of the associated data integration and marketing analytics tools required to extract value out of all this multi-channel marketing data. Marketing will need to work closely with IT to get this infrastructure in place, but there will be challenges when blending the two worlds due to communication and prioritization issues. Marketers will need help bridging this gap to make sure that their requirements are understood, the right tools are deployed to meet their specific needs, and new incremental functionality is rolled out in a rapid, Agile fashion. The role of the marketer is evolving rapidly, with an associated expansion in responsibilities and the requirement for a new level of integration with other teams like IT.

The good news is that these are exciting times for marketers; now we just need to hope that these new investments are spent wisely on quickly extracting customer insights from this data in a matter of weeks, not months or years.”

Article: Why CMO Tech Spending Is Good For IT

As CMOs increase their tech spending, it’s up to IT to get wise to marketing’s ways.

By Doug Henschen, Big Data
February 04, 2013

By now, most CIOs have heard the Gartner prediction that chief marketing officers will outspend CIOs on technology by 2017. Whether or not you agree with that prediction, there’s no question that marketers are now influential technology buyers, even if they’re not taking swaths of responsibilities away from CIOs.

The onslaught of interactive marketing and digital commerce — starting with the Web and email and more recently venturing into mobile and social interactions with customers — is behind much of this technology spending. It has also put marketers under pressure to reconsider how they measure, manage and execute marketing across traditional channels, whether print, TV, radio, in-store, Web or call center.

Technology is finally doing to marketing what it did to financial markets two decades ago: driving it toward automation and real-time analysis, says marketing strategist David Meerman Scott, author of The New Rules of Marketing & PR.

On the automation front, specialized workflow systems with supporting asset management and collaboration features are helping marketers develop and execute campaigns at scale and with the speed demanded by fast-moving consumer trends. Analytics is bringing more precise measurement and, hopefully, better planning and smarter decision-making, both within individual marketing channels and across channels, letting companies adjust their investment mix for maximum impact.

Too many marketing organizations have been underinvesting in technology, relying instead on manual tracking using spreadsheets and email. But those old ways are breaking down given the demand for scale and speed. The future of marketing is going to be “much less art and much more science,” says Meerman Scott.

The science relies on analytics, requiring plenty of data. This should be welcome news to IT pros, who are in the best position to help with the inevitable data management challenges. But CIOs “can’t just wait for CMOs to say, ‘Please help us,'” warns John Kennedy, VP of corporate marketing at IBM.

Only 10% of 550 marketing execs surveyed by the CMO Council put a priority on improving collaboration with their IT organizations this year, despite the fact that 43% of them plan to hire marketing or customer analytics talent, 41% plan to deploy email marketing automation and a third plan to deploy website performance optimization and mobile apps. If marketing and IT teams don’t work together, this marketing tech spending won’t reach its full potential. “CIOs have a role to play at all levels of marketing, particularly in aligning the technology so it can scale,” Kennedy says.

Understand Marketing’s Needs

One of the problems is that CIOs often don’t get how the CMO role has changed. “The marketer’s role has expanded from just driving demand and sales,” Kennedy says. “Now they’re also creating content, customer experiences and customer engagement” through Web, mobile and social channels.

IT leaders must map what marketing does these days to the emerging technology that backs various functions. The three main categories of marketing management systems, as defined in Gartner’s Magic Quadrant, are CRM-based multichannel campaign management (MCCM), integrated marketing management (IMM) and marketing resource management (MRM). Automation and analytics show up in all three categories.

MCCM systems are used for such processes as managing loyalty-card programs and promotional content. Their decision-support capabilities let companies quickly change inbound and outbound marketing offerings. IMM systems track a project from start to finish: from developing marketing concepts and allocating resources (money and people) to creating, testing and executing campaigns and evaluating results and feeding that analysis back into the next concept-development phase. MRM systems have supported strategy and planning. They’re now moving into operations by incorporating creative workflow management, asset management and fulfillment capabilities for the logos, videos and other materials used in marketing campaigns.

IBM was among the first big tech vendors to jump into marketing technology when it acquired Coremetrics (Web analytics) in 2010, followed in short order by acquisitions of Unica (MCCM and MRM), Sterling Commerce (e-commerce) and, in 2012, DemandTec (marketing analytics) and Tealeaf (customer experience analysis). Adobe acquired Web analytics vendor Omniture in 2009, expanding into MCCM. Teradata acquired Aprimo (MCCM, IMM and MRM) in 2010. SAS acquired Assetlink (MRM) in 2011. Salesforce.com has focused on the social channel with its 2011 acquisition of Radian6 (social analytics) and Buddy Media (social marketing). Last year, Microsoft acquired MarketingPilot (IMM and MRM) and Infor acquired Orbis Global (MRM).

In the latest big marketing tech deal, Oracle announced in December plans to acquire Eloqua, which does MCCM. Competitors are downplaying Oracle’s $871 million deal as mostly focused on business-to-business marketing, but Oracle could easily extend and integrate Eloqua’s technology with its other assets to address consumer marketing, says Forrester analyst Rob Brosnan. Combined with other Oracle cloud apps such as Oracle Fusion CRM, Eloqua has the makings of a fresh replacement for Oracle’s aging Siebel CRM platform, Brosnan says.

IT teams must understand the five groups within marketing organizations and the key technology investments they’re making, Forrester contends. The list starts with CMOs at the top and then moves down to brand marketers, marketing operations, relationship marketers and interactive marketers (see below). Analytics is on the list at every level, and automation is needed at every level below the CMO.

What Marketing Technology Buyers Need

CMO and other marketing leadership
Key needs: Focus on all aspects of marketing. Key areas include measurement, strategy and marketing optimization.
Representative technologies:
> Marketing performance management
> Marketing mix modeling
> Attribution

Brand marketers
Key needs: Focus on building the brand and creating compelling brand content. Work with agencies, media buying firms and creative shops.
Representative technologies:
> Brand measurement
> Marketing resource management (planning)
> Asset management and localization

Marketing operations
Key needs: Central organization that focuses on budgets, processes, vendor relationships and fulfillment.
Representative technologies:
> Marketing finance management
> Marketing resource management (workflow)
> Production and fulfillment management

Relationship marketers
Key needs: Emphasize customer insight development and direct communications.
Representative technologies:
> Descriptive and predictive analytics
> Campaign management and marketing automation
> Interaction management and contact optimization
> Event-based marketing

Interactive marketers
Key needs: Focus on digital advertising, interactive marketing and emerging media strategy.
Representative technologies:
> E-mail, search, display, social and mobile
> Web analytics and online testing
> Behavioral targeting and recommendations
> Audience management

Data: Forrester Research


The Red Cross Needed Help

At the American Red Cross, much of the marketing in years past was done locally, at more than 700 chapters, with little central coordination and control. Four years ago, however, new management brought in a team from the for-profit world to build a strong headquarters-level marketing organization.


“We wanted marketing that’s consolidated, powerful and breathtaking, but you don’t get that when your efforts are fragmented,” says Banafsheh Ghassemi, a telecom industry veteran hired as VP of marketing-eCRM and customer experience.

The Red Cross wanted consistency not only between national and chapter-based marketing, but also across channels (TV, radio, print, direct mail, Web, etc.). Consistent “omnichannel” messaging and measurement is one of the hottest priorities in marketing.

The Red Cross had a mix of manual methods, point technologies and contract relationships with agencies. It lacked a marketing management system to track campaigns and return on investment by channel; those initiatives were tracked in paper notebooks, whiteboards, spreadsheets and email messages. The charity had a Teradata data warehouse, but it lacked campaign management tools and internal know-how for marketing segmentation and targeting.

Over the last three years, the Red Cross has filled many of its marketing technology gaps. It chose Aprimo, now owned by Teradata, as its marketing management platform. The charity’s brand and creative team uses the system’s automated review and approval workflows to develop marketing programs. The field marketing team uses the system to collaborate with Red Cross chapters about those programs and related priorities, planning and resource allocation.

The Red Cross is now deploying Aprimo’s campaign management features. That deployment was delayed for 18 months because the Red Cross learned that it needed a better handle on its data before it could do customer segmentation, testing and targeting internally. The IT team discovered that not every Red Cross line of business (disaster relief, support for military families, health and safety training, blood supply and international services) was feeding data into its Teradata data warehouse, and those that were using it weren’t defining data consistently. The Red Cross ended up hiring a VP of data strategy, Disney veteran Chris Taylor, to fix the problems. Poor data quality has been the bane of business intelligence and analytics projects for decades, one reason marketing teams need to work more closely with their IT colleagues.

Another data challenge is sheer scale. “Three years ago, we didn’t have the system or even the techniques in place to keep some of the data that we now know to be significant for effective targeting and segmentation,” Bob Page, eBay’s VP of analytics platform and delivery, said during a recent CMO Council webinar. While eBay had data on its marketing — mostly email campaigns and keyword buys — and the resulting transactions, it wasn’t keeping behavioral data such as clickstreams and site search records.

“We knew what products customers bought when they checked out, but we didn’t know how often they came to the site, how long they stayed and what they looked at before they bought,” Page said. “This behavioral data helps you understand interests, impulses and motivations … but it also explodes the amount of data you have to collect.”

Analytics In Demand

Most companies are reluctant to throw out marketing technologies they’ve already bought; they’d rather build on what they have than start over. This preference explains why IT vendors are acquiring or building out their marketing capabilities to create a suite of products. Swedish insurance company Folksam, for example, uses Infor’s Epiphany CRM system. Epiphany added integration to Orbis Global’s marketing resource management system last year, and then in December Infor acquired Orbis Global. Around the same time, Folksam chose to replace an aging on-premises deployment of Aprimo’s software.

Folksam started deploying Orbis’s software on premises in October, and it expects to have it in production by May. The deployment will address outbound telemarketing, inbound call center, direct mail and email marketing. Folksam handles Web campaigns using a separate system, and its social network activity is limited to monitoring customer comments.

“All of our monthly plans and budgets for campaigns are set up in Orbis, and the campaign leader sets up the customer offer and what kind of media we’re going to use — whether that’s email or direct mail,” says Staffan Magnehed, Folksam’s director of CRM. “We also have to determine what the offer is going to look like, the budget, and how many different activities we’ll have in a multichannel campaign.”

Once the campaigns are planned and prepared in Orbis, the next steps of customer segmentation, targeting and campaign execution are handled in Epiphany’s Customer Interaction Hub, which combines automation and analytics capabilities.

Banks and telecom companies in particular are getting sophisticated with their promotional offers to new customers, says Forrester’s Brosnan. Those efforts might involve targeting certain customers to receive cross-sell or up-sell offers, waiting a certain amount of time and then triggering follow-up offers depending on the customer’s response. Such marketing was a manual process in years past “and would have taken place at a direct marketing agency or marketing services provider like Acxiom or Experian,” Brosnan says.

Now that Red Cross is on its way to resolving its data quality problems, it plans to handle customer segmentation, testing and targeting using Aprimo’s campaign management tools. The Red Cross used to outsource that work to agencies. “If I can have a system in front of me that lets me play with my segmentation and do testing at my fingertips, that’s a huge time savings compared to sending emails, submitting a work order and then calling to find out when that work will come back,” Ghassemi says.

Holy Grail: Knowing What Works

As companies market across more channels, it only intensifies an age-old problem: Which ads or promotions worked to drive a sale? Marketers refer to it as “attribution.” You might know, for instance, that a customer clicked on an email offer and bought something, but did a prior direct-mail offer, a search keyword buy, a Web or print ad, or a social media interaction make that customer more receptive to the email pitch?

Attribution determines where CMOs spend their marketing dollars. In a conventional, direct-attribution approach, companies look at each channel separately, measuring response within that channel without considering other efforts. Now companies are coming up with ways to study customer-interaction histories across channels and give credit where it’s due. But this has mostly relied on crude rules applied manually or replicated within marketing management systems. For example, a system might assign a sale to the first or last marketing touch, or it might average the attribution across all marketing interactions with a given customer.
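The rule-based models described above are simple enough to express in a few lines of code. As an illustrative sketch (this is not any vendor's implementation, and the channel names and interaction history are hypothetical sample data), here is how first-touch, last-touch and linear attribution would divide credit for a single sale:

```python
# Sketch of three common rule-based attribution models.
# Channel names and the sample history below are hypothetical.

def first_touch(touches):
    """Assign all credit for a sale to the first marketing touch."""
    return {touches[0]: 1.0}

def last_touch(touches):
    """Assign all credit to the last touch before the sale."""
    return {touches[-1]: 1.0}

def linear(touches):
    """Average the credit evenly across every touch in the history."""
    credit = {}
    share = 1.0 / len(touches)
    for channel in touches:
        credit[channel] = credit.get(channel, 0.0) + share
    return credit

# One customer's interactions, in order, before a purchase:
history = ["display_ad", "search", "email", "email"]

print(first_touch(history))  # all credit to display_ad
print(last_touch(history))   # all credit to email
print(linear(history))       # display_ad and search 0.25 each, email 0.50
```

The crudeness the article describes is visible here: each rule produces a very different picture of which channel "worked," which is why vendors are moving toward algorithmic attribution models instead.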

Advanced analytics hasn’t cracked this nut yet, but vendors are trying. IBM, for one, last month released an Attribution Modeler application that uses advanced algorithms to assess the impact of efforts in each channel. One large online retailer uses SAS software for attribution analysis. “You need to understand the causation and correlations between digital and offline activities so you know what’s triggering which behaviors and which activity drives the next,” says Kerem Tomak, the retailer’s VP of marketing analytics.

Did display advertising drive traffic to search and then to a website, or did the interaction start with search? Is a visit to a website a failure if a customer puts an item in a shopping cart and then abandons it, or did the customer decide to pick up the item at one of the retailer’s stores? Can individual campaigns lose money while still moving customers a step closer to a valuable sale? The retailer is using its attribution modeling techniques to allocate marketing budgets by channel and to determine which products are best promoted through which channels, “building a bridge,” as Tomak describes it, between marketing and merchandising decisions.

“If you track interactions, attribute correctly and test your models so you know you can trust your analysis, then you have a very powerful tool that will help you orchestrate everything you do,” Tomak says.


This Isn’t A Threat

The complexity involved in knowing how best to market to your would-be customers is only growing. Forty-four percent of store shoppers surveyed during the last holiday season said their first step was to go to a store, but 20% said they went to that retailer’s website first and 10% said they started with general online research. That’s according to a study of more than 24,000 consumers across 100 websites, 29 retail stores and 25 mobile sites, conducted by customer experience analytics vendor ForeSee.

Among mobile buyers, 43% said their first choice would be to buy in store, but they ended up buying through a mobile app, likely because the item was out of stock or they looked at the item in store and found a less expensive option online.

In this kind of multichannel environment, marketing is a whole lot more than a brand message. Whether the CMO or the CIO is signing the checks, companies must use technology to interact better with their customers. They need to better understand their customers so that they can respond more quickly to changing buying patterns and please customers in every interaction, be it online, on the phone or in person. As IBM’s Kennedy puts it: “How we operate can be a bigger factor than what we communicate.”

IT pros shouldn’t see these marketing trends as a threat. The Red Cross’s marketing tech initiative meant spending more on the IT side and hiring a director of data strategy. The lesson: Whether they’re supporting customers or personalizing an email marketing message, great technologists remain at the center of making sure the customer experience lives up to the marketing promise.




With the proliferation of user interaction data from digital sales and marketing channels, marketers now have more valuable knowledge than ever about their customers and how they interact across various marketing channels. Sources such as ecommerce sites, online video players, digital advertising systems and social media platforms like Facebook, Google+, Twitter and Pinterest all generate volumes of data that hold meaningful insights into customers. But to leverage this valuable stream of information in a fast-moving internet world, marketers need the ability to quickly pose questions, get answers and take action.

More and more organizations are storing the data collected from their various sales and marketing channels, often referred to as “Big Data,” within Hadoop. To extract marketing value from this data, organizations need to quickly get the information out of Hadoop and into a usable format, one that allows for simple but powerful analysis leading to a better understanding of their customers, improved marketing and user experiences, and ultimately improved sales. Unfortunately, Hadoop is not known for delivering answers quickly. It is a great platform for cost-effectively storing lots of data, but getting the information you want out of it is not nearly as easy as marketers would like. Worse yet, given the current shortage of developers with Hadoop experience, many organizations lack the talent to get this data into a usable format so they can start to realize its value.

The eSage Solution

eSage Group has recognized the frustration of not being able to quickly leverage the knowledge stored within Big Data to target marketing campaigns, enhance the user experience on ecommerce sites, react to customer interactions on social media channels, and more. Our focus is to rapidly develop useful tools that allow marketers to extract powerful value in a matter of weeks instead of months or years.

Our technical teams, including Technical Business Analysts, work to quickly extract the most valuable data out of Hadoop and get it into the hands of marketers who can leverage it. Our goal is to work with the marketing organization to understand what metrics will provide the insight required, then bridge the gap with the technical teams to access the required data and quickly get it into a format that will provide the most value to the business.

Many times this means that we extract meaningful aggregations of the raw data from Hadoop into a SQL Server data mart and OLAP solution. This allows us to perform detailed trend analysis, prediction modeling, and reporting, with results usually surfaced through familiar tools like Microsoft Excel, SSRS (SQL Server Reporting Services), and SharePoint. Our import and analysis processes are also automated and can run on a set schedule according to a client’s unique business needs.
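As a rough illustration of what such an extract step produces (the event fields, channel names and sample records here are assumptions for the sketch, not eSage’s actual pipeline), raw Hadoop event logs can be rolled up into a compact daily-by-channel summary before being loaded into a SQL Server fact table:

```python
# Sketch: roll raw event records (as a Hive or MapReduce job might emit
# them) into daily per-channel aggregates suitable for a data mart fact
# table. Field names and sample events are hypothetical.
from collections import defaultdict

def aggregate_daily(events):
    """Group raw events by (date, channel) and sum visits and revenue."""
    rollup = defaultdict(lambda: {"visits": 0, "revenue": 0.0})
    for e in events:
        key = (e["date"], e["channel"])
        rollup[key]["visits"] += 1
        rollup[key]["revenue"] += e.get("revenue", 0.0)
    return dict(rollup)

raw_events = [
    {"date": "2012-06-01", "channel": "email", "revenue": 25.0},
    {"date": "2012-06-01", "channel": "email"},
    {"date": "2012-06-01", "channel": "search", "revenue": 40.0},
]

for (date, channel), m in sorted(aggregate_daily(raw_events).items()):
    # Each line here corresponds to one row in the fact table.
    print(date, channel, m["visits"], m["revenue"])
```

The point of the aggregation is size: a few summary rows per day and channel are cheap to query from Excel or an OLAP cube, while the raw clickstream stays behind in Hadoop.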

Quickly Extract Insights from Big Data

We accomplish this through our in-depth understanding of the needs of marketers, our focus on rapid solutions that deliver value in weeks instead of months or years, and our substantial investment in building a Business Intelligence practice that combines deep skills on the Hadoop platform with many years of experience in user-friendly yet powerful presentation tools from companies like Microsoft.

By focusing on the specific needs of the marketing team and using the right tool for the right job, eSage gets insightful Big Data analytics solutions built and deployed in remarkably short time frames.

For more information, contact eSage today!

eSage Group Becomes Hortonworks Systems Integration Partner

We are excited about our partnership with Hortonworks and the value it adds for customers who need to unlock the insights in Big Data!

eSage Group leverages Hortonworks Data Platform to integrate Microsoft Office and Server tools with Big Data – extracting valuable marketing insights in just weeks

SEATTLE – June 13, 2012 – eSage Group, an enterprise business intelligence consultancy, today announced it has become a Hortonworks Systems Integration Partner.  Hortonworks is a leading commercial vendor promoting the innovation, development and support of Apache Hadoop. The Hortonworks Data Platform is a 100 percent open source platform powered by Apache Hadoop, which makes Hadoop easy to consume and use in enterprise environments.

eSage Group is the first Hortonworks Systems Integration Partner that specializes in delivering sales and marketing analytics using trusted Microsoft tools such as Excel, PowerPivot, and backend applications like Microsoft SQL Server, with which both IT and marketers are most familiar. eSage Group helps organizations capture and analyze structured and unstructured data (Big Data) to gain new insights that were previously not available.

“eSage Group has the unique combination of business acumen and the technical expertise to harness the vast amount of data from disparate marketing channels and make it meaningful and actionable,” said Mitch Ferguson, vice president of business development, Hortonworks.  “By partnering with Hortonworks and leveraging the Hortonworks Data Platform for customer engagements, eSage Group can deliver increased value and data insights to their customers.”

Helping Marketers Make Business Sense Out of Big Data

To derive value from data residing in today’s vast number of marketing channels – web sites, social, CRM, digital advertising, mobile, and unstructured “Big Data” – organizations must develop robust business intelligence to extract hidden relationships across these channels and derive new insights. The result is more effective marketing campaigns, improved customer engagement, increased customer lifetime value and ultimately greater revenue.

eSage Group helps marketers bridge the gap between marketing and IT. By leveraging its Intelligent Enterprise Marketing Platform with decades’ worth of business intelligence expertise, eSage Group creates a roadmap for integration that allows organizations to capture key insights in just a few weeks versus months or even years.

“Savvy marketing organizations today have an unprecedented opportunity to leap ahead of their competition by tapping the valuable customer insight locked in big data,” said Duane Bedard, eSage Group President. “eSage Group couples an Agile marketing process with newly available tools to quickly and easily expose this information, allowing marketers to create targeted, rapid messaging that drives sales and engages their existing customers.”

About eSage Group

Founded in 1998, eSage Group is an enterprise technology consultancy that helps marketers make sense out of marketing data, including Big Data. Leveraging its Intelligent Enterprise Marketing Platform (iEMP) with business intelligence expertise, eSage Group helps clients create and implement an agile marketing strategy and infrastructure, enabling them to map key sales and marketing goals to actionable performance metrics. With eSage, organizations can gain deep cross-channel understanding of their customers, track which marketing campaigns are meeting their overall marketing goals, and ultimately increase revenue. eSage Group’s customers include Disney, Microsoft, Chase, ViAir, Wireless Services, and many more.

About Hortonworks

Hortonworks is a leading commercial vendor of Apache Hadoop, the preeminent open source platform for storing, managing, and analyzing big data.  Our distribution, Hortonworks Data Platform powered by Apache Hadoop, provides an open and stable foundation for enterprises and a growing ecosystem to build and deploy big data solutions. Hortonworks is the trusted source for information on Hadoop and together with the Apache community, Hortonworks is making Hadoop more robust and easier to install, manage, and use. Hortonworks provides unmatched technical support, training and certification programs for enterprises, systems integrators, and technology vendors. For more information, visit

About The Hortonworks Systems Integration Partner Program

The Hortonworks Systems Integrator Partner Program was created to train, certify and enable a broad ecosystem of systems integrators to deliver expert Apache Hadoop consulting and integration services. For more information about the program, visit