Contributor: Matt McGorrey
Snowflake is more than just cloud data storage
Following Snowflake Summit 2022, it’s clear that the company wants to be the number one “data cloud,” and that means more than just storing data. Snowflake wants its platform to support the next generation of data science, analytics, and data-first applications. They announced Native Apps (available on a by-request basis to select groups): the ability to build and share full-blown applications entirely on Snowflake using tools such as Snowpark for Python and Streamlit. These apps can be shared, or sold, on the Snowflake Marketplace. Data Collaboration continues to be an overarching theme for Snowflake, so it’s no surprise that they’re pushing collaboration with applications too.
I spent much of my time learning about Snowpark for Python’s new capabilities, such as support for User Defined Table Functions (UDTFs), Batch UDFs, and Stored Procedures. These expand the possibilities for running Python functions in your Snowflake account. They run in their own Python environment on your Snowflake warehouse and have access to Anaconda libraries. This provides a ton of flexibility when developing data pipelines. And model inference? Feature generation? There are many ways Snowpark for Python can be used, and I’m excited about all of them!
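As a rough illustration, here is a hedged sketch of what registering a Python UDF with Snowpark can look like. The function logic is plain Python and runs the same locally as it would inside Snowflake’s Python runtime; the registration step (the `f_to_c` name and `weather` table are hypothetical) requires the `snowflake-snowpark-python` package and an active Session, so it is shown commented out.

```python
# Hedged sketch: a Python function that could be registered as a Snowpark UDF.
# The logic itself is ordinary Python and can be tested locally.

def fahrenheit_to_celsius(f: float) -> float:
    """Convert Fahrenheit to Celsius; runs locally or inside Snowflake."""
    return (f - 32.0) * 5.0 / 9.0

# Registration with Snowpark (assumes a configured Session; names are
# hypothetical, and Anaconda packages can be requested via `packages`):
#
# from snowflake.snowpark import Session
# from snowflake.snowpark.types import FloatType
#
# session = Session.builder.configs(connection_parameters).create()
# f2c = session.udf.register(
#     fahrenheit_to_celsius,
#     return_type=FloatType(),
#     input_types=[FloatType()],
#     name="f_to_c",
# )
# session.table("weather").select(f2c("temp_f")).show()
```

The appeal is that the same function body serves local development, unit tests, and in-warehouse execution.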
I racked up over 65k steps across keynotes, expo-hall touring, hands-on labs, and presentations. A large part of that was the half-mile between the presentation rooms in Caesars Forum and the hands-on labs (HOLs) in the Flamingo hotel. Thank goodness for air conditioning!
The HOLs were packed with content and unfortunately never finished on time, but they mostly followed various quickstarts available online. I had two favorites. The ML for Snowpark Python lab showcased how to deploy UDFs for model training and inference, use Apache Airflow (via Astronomer) to manage the model’s pipelines, and ultimately use the model in a Streamlit application. The other favorite was a dbt and Snowflake HOL, which showcased building a sales dashboard in Snowsight on top of data pipelines developed with dbt.
Snowflake’s partner focus
Over the course of the event, Snowflake consistently put their partners on pedestals. They praised partners such as Matillion and dbt because they want to see more products built on or in the Snowflake ecosystem, and they want to support more integrators (like Atrium!) as those integrators find creative ways to use Snowflake.
Matillion stood out to me as an ELT tool that is simple on the surface but offers vast flexibility when required. ELT (Extract, Load, then Transform) is Snowflake’s favorite paradigm because transformations can be triggered via REST, the CLI, or, the fun way, with Snowpark. And with the introduction of Snowpark for Python, just about any crazy transformation you have in Python can run directly in your Snowflake account.
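To sketch what the “T” in ELT might look like in Python: the transform logic below is plain, locally testable Python, and the commented section shows (under stated assumptions) how the same idea could be expressed with the Snowpark DataFrame API. All table and column names are hypothetical.

```python
# Hedged sketch: a transform step written once in Python.

def normalize_amount(amount_cents: int) -> float:
    """Convert raw cents (as loaded) into dollars during the Transform step."""
    return round(amount_cents / 100.0, 2)

# The same transformation expressed with the Snowpark DataFrame API
# (requires an active Session; RAW_ORDERS / ORDERS_CLEAN are placeholders):
#
# from snowflake.snowpark.functions import col
#
# df = session.table("RAW_ORDERS")
# transformed = df.with_column("AMOUNT_USD", col("AMOUNT_CENTS") / 100)
# transformed.write.save_as_table("ORDERS_CLEAN", mode="overwrite")
```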
dbt stood out because it’s a favorite at Atrium. Creating repeatable data pipelines, especially for data science initiatives, is a must. The framework dbt provides is excellent for developing and maintaining these pipelines and Snowflake is a great spot for them to live.
Data mesh with Snowflake
A theme that kept appearing across sessions, speakers, and questions was Data Mesh, a new paradigm for managing data across an organization. Snowflake is great for supporting a data mesh because of the ability to designate separate databases and warehouses to each domain, govern access with role-based access control and row-level security, and operate complex data pipelines to serve data products. An organization’s mesh can even expand to share data on the marketplace.
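To make the domain-separation idea concrete, here is a minimal, hedged sketch of the Snowflake SQL a platform team might issue to carve out one domain’s slice of a mesh. Every object name (SALES_DB, SALES_WH, SALES_ANALYST, region_policy) is a hypothetical placeholder; the statements are collected as Python strings so they could be executed with a Snowpark session.

```python
# Hedged sketch: per-domain isolation for a data mesh on Snowflake.
# All object names are hypothetical placeholders, not a real environment.

DOMAIN = "SALES"

statements = [
    # One database and one warehouse per domain keeps cost and access separable.
    f"CREATE DATABASE IF NOT EXISTS {DOMAIN}_DB",
    f"CREATE WAREHOUSE IF NOT EXISTS {DOMAIN}_WH WAREHOUSE_SIZE = 'XSMALL'",
    # Role-based access control: a domain role scoped to its own objects.
    f"CREATE ROLE IF NOT EXISTS {DOMAIN}_ANALYST",
    f"GRANT USAGE ON DATABASE {DOMAIN}_DB TO ROLE {DOMAIN}_ANALYST",
    f"GRANT USAGE ON WAREHOUSE {DOMAIN}_WH TO ROLE {DOMAIN}_ANALYST",
    # Row-level security would attach a row access policy to shared tables, e.g.:
    # f"ALTER TABLE {DOMAIN}_DB.PUBLIC.ORDERS ADD ROW ACCESS POLICY region_policy ON (region)"
]

# With an active snowflake-snowpark-python Session, these could be run as:
# for stmt in statements:
#     session.sql(stmt).collect()

for stmt in statements:
    print(stmt)
```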
Atrium’s booth experience
Our booth was tucked away in the corner of the expo hall, but luckily we were located right behind Snowflake’s “theCUBE” stage. This meant that our logo, banner, and booth babe (Matt McGorrey) were visible in almost every guest interview. You can see this for yourself on theCUBE’s homepage (notice the Atrium logo and banner to the right of Snowflake CEO Frank Slootman’s face). As attendees watched the live interviews, it was easy to strike up conversations and get them interacting with our demo.
Our Tableau “Oh the places you’ll go” demo showcased the power and speed of Snowflake’s integration with Tableau. After filling out a short questionnaire, attendees were able to immediately see the dashboard refresh with their information. Here you can see the final dashboard and read more about how it worked.
What drove attendees to stop by our booth?
There were two main things that drove people to stop by our booth and have a conversation: our Salesforce experience and our Tableau dashboards. To my knowledge, we were the only Snowflake partner that actively advertised having experience with Salesforce. Attendees would often walk by our booth and literally stop when they read the words “Salesforce and Snowflake”. This served as a natural conversation starter, and our team was able to provide Atrium’s Salesforce point of view and learn about several unique use cases surrounding different ways to integrate Snowflake and Salesforce.
The other main attraction at our booth was our Tableau Dashboards. Our aforementioned “Oh the places you’ll go” dashboard served as a great marketing use case: instantly collecting lead information and visualizing it in Tableau. Booth visitors appreciated the speed and elegance of the Snowflake x Tableau integration. Several Tableau pros were keen to point out how difficult the map widget was to create, and they were curious about what data transformations we conducted in Snowflake to make it happen. The other dashboard we demoed at the booth was our RevOps Performance dashboard, which showcases how data science can augment Tableau’s ability to highlight actionable insights.
A large number of Snowflake executives passed by during their theCUBE interviews. It was great getting to photobomb them and to hear their varying points of view on how Snowflake will transform over the next year. The most common theme on stage was how Snowflake users can get more value out of having their data on the Snowflake platform, something Atrium excels at.
Lessons learned for next time?
While the dashboards served as a great interactive experience at Atrium’s booth, we did observe some “demo fatigue” towards the end of the conference. Almost every partner had some sort of demo, which meant attendees were constantly being bombarded by product/service showcases. Moving forward, we want the Atrium booth experience to be more memorable, so we are brainstorming other creative ways to draw attendees to our booth, especially later in the conference.
Learn more about what we can help you achieve with Snowflake.