Immediate Job Opening – Mexico-Based Microsoft BI Stack Engineer

The eSage Group is a Marketing Data Analytics firm established in 1998, headquartered in Seattle, Washington, USA. We have Fortune 500 clients including Disney, the LA Times, and Microsoft.

Our company is always on the lookout for talented developers at all levels in both Mexico and the US. We have worked hard to create a company culture of sharp, quick-learning, hardworking professionals who enjoy being part of a winning team with high expectations. As such, we hire self-motivated people with excellent technical abilities who also exhibit keen business acumen and a drive for customer satisfaction and for solving our clients' business challenges.

We have a strong remote team in various locations in Mexico, including Monterrey, Aguascalientes, Mexico City, and Guadalajara. All employees work from home, and all are full-time employees, not contractors.

We need Mid-Level Software Engineers (3+ years of experience) with experience in the Microsoft BI stack.

All candidates must have a strong interest in business intelligence and marketing analytics. They must be willing to work with multiple client companies at the same time. They must have a strong desire to understand business problems and to integrate disparate data sources to gain insights that help solve those problems.

• Advanced English skills, both written and spoken
• Advanced Excel and SQL skills
• Strong data analysis experience
• Good oral and written communication skills
• Extremely detail-oriented, with a keen eye for detail
• Ability to produce high-quality, accurate deliverables
• Proven ability to work under pressure and meet deadlines
• Ability to learn quickly, follow direction, and execute tasks independently

Technical Skills
• SQL Server
Knowledge of databases, stored procedures, and writing T-SQL scripts. Should know about primary and foreign keys, indexes, and why they are important. Should know how to use temporary tables and know something about the use of cursors within a script. Basic error handling within a script is also a plus.
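For illustration, here is a minimal T-SQL sketch touching those concepts (the table and column names are hypothetical, not from any client project):

```sql
-- Hypothetical example: copy active customers into a temporary table,
-- iterate over them with a cursor, and wrap the work in basic error handling.
BEGIN TRY
    SELECT CustomerID, CustomerName
    INTO #ActiveCustomers               -- temporary table, scoped to this session
    FROM dbo.Customers
    WHERE IsActive = 1;

    DECLARE @Name NVARCHAR(100);
    DECLARE cust_cursor CURSOR LOCAL FAST_FORWARD FOR
        SELECT CustomerName FROM #ActiveCustomers;

    OPEN cust_cursor;
    FETCH NEXT FROM cust_cursor INTO @Name;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        PRINT @Name;                    -- per-row work goes here
        FETCH NEXT FROM cust_cursor INTO @Name;
    END
    CLOSE cust_cursor;
    DEALLOCATE cust_cursor;

    DROP TABLE #ActiveCustomers;
END TRY
BEGIN CATCH
    PRINT ERROR_MESSAGE();              -- basic error handling
END CATCH
```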

• OLAP / Analysis Services
Ability to design and build an OLAP database within Visual Studio. Understands the following concepts: Data Source Views, calculated measures, named sets, referenced dimensions, and measure groups. Able to deploy and process an OLAP database. Understanding of basic MDX and how it is different from SQL. Knows the difference between a set, a tuple, and a value.

• SSIS (Integration Services)
Experience designing SSIS packages that can extract, transform, and load data. Use of Data Flow and Script Task components. Able to use C# and variables within SSIS. Able to package and deploy SSIS components.

• C# / .NET Framework
This includes the ability to create custom classes, interfaces, and events. Understands concepts like inheritance, polymorphism, referenced assemblies, and delegates. Should be comfortable working with lists (.NET or custom) and familiar with the standard .NET types: arrays, dictionaries, lists, and structs. Knowledge of generics is a plus.

Please email your resume to tinam (at) esagegroup (.) com and/or complete the form below.


A special TTT: Handy SSIS Trick – Use Temp Table for Schema Generation for Execute SQL Task

This is a bonus Tech Talk Thursday by J’son on a Tuesday! We are a wild bunch!! Enjoy!


Usually, I divide my SSIS packages into two sections: a Staging Group that is responsible for bringing in source data, and a Final Group that is responsible for normalization, data mart design, etc. In the Staging Group, I either bring my source data in all at once or do an incremental pull – it really depends on the amount and type of data needed.

In both cases, I need to make sure that I have a table in my Staging database that is ready for the data I’m about to pull in. Let’s take a typical Staging process:

Each individual import is made up of two parts: the Execute SQL Task is responsible for either dropping and recreating the SQL table needed for the data, or simply building it once (if it does not exist). Then the Data Flow task imports the data and fills the table.
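The create-once variant of that Execute SQL Task typically looks something like the following (the staging table and its columns are hypothetical):

```sql
-- Hypothetical Execute SQL Task script: create the staging table only if
-- it does not already exist, so the Data Flow task has a table to fill.
IF OBJECT_ID(N'dbo.Staging_Orders', N'U') IS NULL
BEGIN
    CREATE TABLE dbo.Staging_Orders
    (
        OrderID     INT           NOT NULL,
        CustomerID  INT           NOT NULL,
        OrderDate   DATETIME      NOT NULL,
        TotalAmount DECIMAL(18,2) NULL
    );
END

-- Drop-and-recreate variant: guarantees a clean, empty table on every run.
-- IF OBJECT_ID(N'dbo.Staging_Orders', N'U') IS NOT NULL
--     DROP TABLE dbo.Staging_Orders;
```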

The Problem

Lately, some of the SQL scripts I have been using to get source data have been getting seriously long, with lots of columns needed.
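The long scripts themselves aren't reproduced here, but the core idea behind the trick can be sketched as follows (the query and names are illustrative, and the full technique in the downloadable Tech Talk may differ in its details): run the long source query into a temp table with a false predicate, and let SQL Server infer the column schema instead of hand-writing a CREATE TABLE with dozens of columns.

```sql
-- Illustrative sketch only: materialize the schema of a long query
-- without copying any rows, then inspect it to build the staging table.
SELECT *
INTO #SchemaProbe
FROM (
    SELECT o.OrderID, o.OrderDate, c.CustomerName   -- imagine many more columns
    FROM dbo.Orders o
    JOIN dbo.Customers c ON c.CustomerID = o.CustomerID
) AS src
WHERE 1 = 0;        -- no rows are copied; only the schema is generated

-- Temp tables live in tempdb, so inspect the inferred schema there.
EXEC tempdb..sp_help '#SchemaProbe';
```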

To get the script and the rest of this post, download the complete Handy SSIS Trick – Use Temp Table for Schema Generation for Execute SQL Task Tech Talk.