
SQL Server to Snowflake Migration: Avoid These 5 Common Pitfalls

Delivering effective data and analytics through outdated on-premises infrastructures and software is a nearly impossible task. In contrast, Snowflake and other cloud solutions continuously deliver innovations that are immediately available to their customers.

Even the best on-premises technical teams are left months or years behind their competitors. In this unprecedented time of continuous technology revolution, organizations that invest in modern cloud data architectures are surging ahead of their competitors.

While the benefits are clear, migrating your on-premises SQL Server data platform to Snowflake presents a monumental challenge. Established data platforms are deeply interconnected with systems across the enterprise. Moving these capabilities to the cloud impacts an array of teams, processes, and related technologies. When embarking on your journey to the cloud, be wary of these common pitfalls.

Pitfall #1: Poor Planning

Ask a search engine or your favorite GenAI chatbot for advice on data platform migration and the common thread across every response is that planning is critical. That conventional wisdom is absolutely correct, and yet business and data leaders consistently short-change the planning phase of large migration initiatives.

There is an old adage that plans are useless, but planning is indispensable. Leaders often see planning as a low-value activity, knowing that things rarely go as predicted. What they miss is the benefit of gaining a deeper understanding of the key risks and success factors for their migration. When issues inevitably arise, leaders and teams are better equipped to deal with them, reducing negative impacts on quality, schedule, and budget.

Critically, planning tests assumptions. For example, views should translate easily from SQL Server to Snowflake. It’s just a view, right? This may be true in most cases, but proprietary features like SQL Server indexed views have no direct Snowflake equivalent. While not every nuance of conversion will be discovered in a planning process, examining basic assumptions will head off major surprises.
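To make that concrete, here is a hypothetical indexed view in SQL Server and one way it might be re-expressed in Snowflake. The table and view names are invented for illustration, and Snowflake materialized views are an edition-dependent feature with restrictions of their own, so treat this as a sketch rather than a drop-in conversion.

    -- SQL Server: an indexed view materializes aggregated results
    CREATE VIEW dbo.v_daily_sales
    WITH SCHEMABINDING
    AS
    SELECT order_date,
           COUNT_BIG(*) AS order_count,
           SUM(amount)  AS total_amount  -- amount must be NOT NULL in an indexed view
    FROM dbo.orders
    GROUP BY order_date;
    GO

    -- The unique clustered index is what makes the view "indexed"
    CREATE UNIQUE CLUSTERED INDEX ix_v_daily_sales
        ON dbo.v_daily_sales (order_date);

    -- Snowflake has no indexed views; one possible rewrite is a materialized view
    CREATE MATERIALIZED VIEW analytics.public.v_daily_sales AS
    SELECT order_date,
           COUNT(*)    AS order_count,
           SUM(amount) AS total_amount
    FROM analytics.public.orders
    GROUP BY order_date;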

Pitfall #2: Neglecting FinOps

Transitioning to Snowflake’s consumption-based cost model is a fundamental change from SQL Server. Data managers and administrators taking a business-as-usual approach to cost management often encounter unexpected charges or ineffective controls.

With on-premises SQL Server, costs are generally predictable, with database resources running in an always-on mode. Unexpected demand or fast-growing data volumes show up as poor system performance and lowered user satisfaction rather than a bigger bill. Cost controls center on negotiating hardware costs, software licensing, and efficient administration.

While Snowflake does charge for data storage, the primary driver of cost is compute. Snowflake Virtual Warehouses provide elastic capacity for query execution, ETL processes, applications, and other workloads. When Virtual Warehouses are oversized or left running when not in use, costs mount with no benefit to the business. An always-on approach becomes costly and inefficient.

Snowflake provides a wide range of features to monitor and control costs. Virtual Warehouses can be configured to automatically start up and shut down based on demand. Highly volatile workloads can balance cost and performance with auto-scaling of multi-cluster Virtual Warehouses. Policies and alerts can be set to provide guardrails and hard limits where needed.
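As a minimal sketch of what those controls look like in practice (the warehouse and monitor names are invented, multi-cluster scaling is an edition-dependent feature, and the right sizes, timeouts, and quotas depend entirely on your workloads):

    -- Suspend after 60 seconds of inactivity, resume automatically on demand,
    -- and scale out to at most three clusters under concurrency pressure
    CREATE WAREHOUSE IF NOT EXISTS reporting_wh WITH
        WAREHOUSE_SIZE    = 'SMALL'
        AUTO_SUSPEND      = 60
        AUTO_RESUME       = TRUE
        MIN_CLUSTER_COUNT = 1
        MAX_CLUSTER_COUNT = 3
        SCALING_POLICY    = 'STANDARD';

    -- A resource monitor as a guardrail: notify at 80% of the monthly
    -- credit quota and suspend the warehouse at 100%
    CREATE RESOURCE MONITOR monthly_guardrail WITH
        CREDIT_QUOTA    = 100
        FREQUENCY       = MONTHLY
        START_TIMESTAMP = IMMEDIATELY
        TRIGGERS ON 80  PERCENT DO NOTIFY
                 ON 100 PERCENT DO SUSPEND;

    ALTER WAREHOUSE reporting_wh SET RESOURCE_MONITOR = monthly_guardrail;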

To be effective, data managers will need to reinvent their cost management mindset and processes, shifting from a focus on capital expenditure (CapEx) to operational expenditure (OpEx).

Pitfall #3: Inadequate Data Team Training and Support

On the surface, SQL Server and Snowflake are very similar. Snowflake is a SQL database built on familiar concepts: databases, schemas, stored procedures, and so on. This first impression often leads data leaders and administrators to under-budget for training. When the real work of a Snowflake migration begins, they find they lack the detailed platform knowledge and skills to convert workloads and operate the new system effectively.

Like a student driver oversteering an unfamiliar vehicle, undertrained data teams often spend too much time on areas that needed active management in SQL Server but are handled automatically in Snowflake. For example, crafting and maintaining indexes to optimize performance is a routine activity in SQL Server; in Snowflake, it is virtually nonexistent.
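A brief illustration of that shift, with invented object names; for the vast majority of Snowflake tables, no clustering action is needed at all:

    -- SQL Server habit: hand-crafted indexes to keep queries fast
    CREATE NONCLUSTERED INDEX ix_orders_customer
        ON dbo.orders (customer_id)
        INCLUDE (order_date, amount);

    -- Snowflake: there are no indexes to build or rebuild; micro-partitioning
    -- and pruning happen automatically. Only very large, frequently filtered
    -- tables might justify an explicit clustering key, and it carries its own cost:
    ALTER TABLE analytics.public.orders CLUSTER BY (order_date);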

In the context of platform migration, even well-trained teams benefit from support from experienced Snowflake architects and developers. Whether hiring for permanent roles or engaging professional services, bringing in talent with a history of building solutions in the Snowflake platform will enable your team to avoid common problems and deliver higher-quality solutions.

At a minimum, front-line database administrators and developers should get their SnowPro Core certification. This will provide them with the fundamentals to understand how Snowflake operates under the hood and help them adapt their approach to take advantage of Snowflake tools and features.

Pitfall #4: Overlooking Security and Compliance

Designing and implementing security controls and data protection is a critical part of any Snowflake implementation. In large organizations, this effort requires significant cross-team collaboration among technical, business, digital security, and often legal stakeholders. Migration programs that wait to involve digital security and compliance stakeholders risk severe delays and disruption to project schedules, as well as exposure to cybersecurity risks.

In an organization of any size, administration of authentication and access control to data assets is a monumental task. In most cases, migration teams must reverse engineer under-documented security implementations and integrations with central authentication systems like Active Directory. While tools and methods are available to extract current state configurations for role-based access, this is only the beginning of the process.
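One possible starting point for that inventory, sketched below, is to query SQL Server’s catalog views for object-level grants. This is a partial picture only; server roles, database- and schema-level grants, and Active Directory group membership all need the same treatment.

    -- Object-level permissions granted in the current database
    SELECT pr.name       AS grantee,
           pr.type_desc  AS grantee_type,
           pe.permission_name,
           pe.state_desc AS grant_state,
           s.name        AS schema_name,
           o.name        AS object_name
    FROM sys.database_permissions AS pe
    JOIN sys.database_principals  AS pr ON pe.grantee_principal_id = pr.principal_id
    JOIN sys.objects              AS o  ON pe.major_id = o.object_id
    JOIN sys.schemas              AS s  ON o.schema_id = s.schema_id
    WHERE pe.class = 1  -- 1 = object or column permissions
    ORDER BY schema_name, object_name, grantee;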

There are fundamental differences in how SQL Server and Snowflake manage privileges on database objects. For example, Snowflake grants privileges only to roles, never directly to individual users, while SQL Server allows both. Snowflake also requires explicit USAGE privileges on a database and schema before any object inside them can be accessed, a layer of access control with no direct SQL Server counterpart. Conversion tools can help translate your SQL Server access controls into a Snowflake equivalent, but thorough review and validation will be required.
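For reference, the Snowflake side of that translation typically lands on a role-based pattern like the following (the role, database, schema, and user names are invented for illustration):

    -- Snowflake grants privileges to roles, never directly to users
    CREATE ROLE IF NOT EXISTS analyst_role;

    -- Explicit USAGE on the database and schema is required before
    -- any object inside them can be queried
    GRANT USAGE ON DATABASE sales TO ROLE analyst_role;
    GRANT USAGE ON SCHEMA sales.reporting TO ROLE analyst_role;
    GRANT SELECT ON ALL TABLES IN SCHEMA sales.reporting TO ROLE analyst_role;

    -- Keep new tables covered as the schema evolves
    GRANT SELECT ON FUTURE TABLES IN SCHEMA sales.reporting TO ROLE analyst_role;

    -- Users receive access only through role membership
    GRANT ROLE analyst_role TO USER jdoe;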

Equally important to a successful data security strategy are data classification and protection. Include data tagging, row-level security, and dynamic data masking in your conversion plan, and explore Snowflake’s further data protection features, such as external tokenization, as enhancements.
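As a hedged sketch of what that can look like (names are invented, and masking policies and object tags are edition-dependent features):

    -- A simple dynamic masking policy: only a privileged role sees raw values
    CREATE MASKING POLICY pii_email_mask AS (val STRING)
      RETURNS STRING ->
      CASE
        WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val
        ELSE '***MASKED***'
      END;

    ALTER TABLE sales.reporting.customers
      MODIFY COLUMN email SET MASKING POLICY pii_email_mask;

    -- Classification can be captured with object tags
    CREATE TAG IF NOT EXISTS data_sensitivity;
    ALTER TABLE sales.reporting.customers
      MODIFY COLUMN email SET TAG data_sensitivity = 'pii';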

Pitfall #5: The Myth of Lift and Shift

When considering a migration from SQL Server to Snowflake, the “lift-and-shift” approach is always part of the discussion. Proponents advocate for just copying the data to the new platform and repointing data pipelines and consumption tools. The dangerous and misleading word in that argument is “just.” In practice, there is no simple lift and shift.

Tools like SnowConvert are highly effective at automating the generation of equivalent database objects and code from SQL Server to Snowflake. They deftly handle details like translating datatypes, converting stored procedure and function code, and producing Snowflake equivalents of tables and views. Automation is a critical component of a successful migration at scale.
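As a representative example of what a datatype translation involves (invented table names, and not the literal output of any particular tool):

    -- SQL Server source definition
    CREATE TABLE dbo.customer (
        customer_id INT IDENTITY(1,1) PRIMARY KEY,
        full_name   NVARCHAR(200) NOT NULL,
        is_active   BIT           NOT NULL,
        created_at  DATETIME2     NOT NULL DEFAULT SYSUTCDATETIME()
    );

    -- A typical Snowflake translation; note that PRIMARY KEY is recorded
    -- as metadata but not enforced by Snowflake
    CREATE TABLE sales.reporting.customer (
        customer_id NUMBER(38,0) IDENTITY(1,1) PRIMARY KEY,
        full_name   VARCHAR(200)     NOT NULL,
        is_active   BOOLEAN          NOT NULL,
        created_at  TIMESTAMP_NTZ(9) NOT NULL DEFAULT CURRENT_TIMESTAMP()
    );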

But are converted objects always equivalent? Let’s look at the case of views again. In SQL Server, data can be updated through simple views. A common practice in SQL Server is to build a view layer over a set of base tables. Users don’t interact with tables directly, but rather through views, including adding or updating records. This provides architectural advantages such as resiliency to changes in the schema — if the base table changes, the view definition can be changed to maintain compatibility for solutions using that data. Unfortunately, Snowflake does not support updateable views!
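A simplified illustration of the gap, with invented names:

    -- SQL Server: a simple view over a base table...
    CREATE VIEW dbo.v_customer AS
    SELECT customer_id, full_name, is_active
    FROM dbo.customer;
    GO

    -- ...accepts DML written directly against the view
    UPDATE dbo.v_customer
    SET is_active = 0
    WHERE customer_id = 42;

    -- Snowflake: the same UPDATE against a view fails. The statement, or the
    -- application issuing it, must be refactored to target the base table instead.
    UPDATE sales.reporting.customer
    SET is_active = FALSE
    WHERE customer_id = 42;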

Even with a flawless automated conversion, any users, applications, or processes that depend on updateable views will not work after a lift and shift. Depending on how heavily your existing architecture leverages updateable views, effective conversion may require additional weeks or months of development, refactoring, testing, and validation.

This example is just one of several known challenges in migrating from SQL Server to Snowflake. Automated code conversion tools can accomplish a successful lift and shift for significant portions of your platform, but data leaders and teams must look beyond simple conversion-rate metrics for code objects to verify that the results are truly equivalent.

Trust the Snowflake Experts at Atrium

Migrating from SQL Server to Snowflake is a game-changer, unlocking the benefits of a modern data architecture in the cloud. Take steps to plan well, adapt your cost management strategies, invest in comprehensive training for your data teams, prioritize security and compliance measures, and recognize the limitations of a lift-and-shift approach.

Avoiding these common pitfalls and working with an experienced Snowflake consulting partner will accelerate your SQL Server to Snowflake migration and bring about a higher-quality implementation. Atrium’s expertise with the Snowflake platform and modern data engineering projects ensures a smooth and successful transition.

Contact us and take the first step in unlocking the full potential of Snowflake for your business.