I want to be a Snowflake DBA


There’s a lot of talk at the moment about how Cloud services will rapidly replace “On Premises” SQL Server. Snowflake, one of the major players in the Data Warehouse space, offers just this, with the promise of “no requirement for a DBA”. Surely a tempting proposal, but should this be a concern for us?

I want to become an Oxymoron – a Snowflake DBA – some would say that I’m halfway there.

At a recent SQL User Group, I was chatting to someone about moving to the cloud when I received a classic response of “I would imagine that you should be fine sticking to On-Prem”.

Essentially, given my age, I could see out my working years without ever needing to move to the Cloud. Trouble is, I don’t recognise my age as a barrier to what I do. I see the future and I want to embrace it, albeit maybe in a slightly slower, more cautious manner. I don’t want to be that HP3000 guy who makes a living from being the last of his species.

For me, I want to offer value in Database provision and support. This means using the Solution that best serves Business Requirements in the most cost-effective and supportable manner. I also see reasons for companies to still be using DBAs, albeit with a slightly different focus.

Cloud-based offerings are certainly attractive, especially on initial startup cost. You pay just for what you use and can ramp up the compute power as and when required. This is tempting – in years gone by, the initial cost of provisioning a Server for a new project could be prohibitive, especially when it needed to be oversized in order to meet future demand.

Cost is a big deal with SQL Server, especially with SQL Server Enterprise Edition.

How about in the Cloud? Welcome to a new model of “Pay As You Go”. This can work really well, but it needs careful monitoring and monthly budgeting to avoid suddenly being landed with unexpectedly large bills. What happens when new requirements suddenly push up the Compute for more complicated overnight processing? That’s even before you consider the potential for “accidentally” leaving the lights on overnight – for example, ramping up to 4x Compute and forgetting to switch it back down afterwards, generating a large bill in the morning… or a few days later. OK, so Snowflake claims not to charge when a warehouse is not in use, but this needs investigation – especially where you have jobs that regularly poll for data availability, the typical way for a Data Warehouse to be loaded each day. All of this needs to be considered and managed.
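To be fair, Snowflake does let you put guard rails around this in plain SQL. A minimal sketch of the idea – the warehouse name, sizes and timings here are made up for illustration, not taken from any real setup:

```sql
-- Hypothetical warehouse (etl_wh); ramp up for the heavy overnight load...
ALTER WAREHOUSE etl_wh SET WAREHOUSE_SIZE = 'XLARGE';

-- ...and guard against "leaving the lights on": suspend after
-- 5 minutes idle, and wake automatically when the next query arrives.
ALTER WAREHOUSE etl_wh SET
    AUTO_SUSPEND = 300
    AUTO_RESUME  = TRUE;

-- After the run, remember to switch it back down explicitly -
-- auto-suspend stops the clock, but it doesn't shrink the size.
ALTER WAREHOUSE etl_wh SET WAREHOUSE_SIZE = 'SMALL';
```

Note the catch with AUTO_RESUME: a job that polls for data availability by running a query will wake the warehouse each time, which is exactly why that loading pattern deserves scrutiny before you trust the “no charge when idle” promise.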

Some of the “On-Prem” issues follow through to the Cloud, such as the importance of choosing the correct Datatypes. I have complained in the past about Developers being lazy when choosing Datatypes – for example, I once found an Integer being used to store “Month of Year” when a TinyInt would have done, making a considerable difference to the amount of storage required. In a traditional “On Premises” environment this can cause performance issues, filling disks before their time. In the Cloud, it’s going to cost for more storage and maybe more compute – real money.
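To put the point in SQL Server terms (the table and column names here are invented for the example), the difference is four bytes against one on every single row:

```sql
-- TINYINT holds 0-255 in 1 byte: plenty for a month number (1-12).
-- INT spends 4 bytes per row to say the same thing.
CREATE TABLE dbo.SalesFact
(
    SaleID      INT IDENTITY(1,1) PRIMARY KEY,
    MonthOfYear TINYINT NOT NULL,          -- 1 byte, not 4
    Amount      DECIMAL(10, 2) NOT NULL
);
```

Over a billion-row fact table, that one column choice is roughly 3 GB before you count indexes – a rounding error on a local disk, but something you are billed for every month in the Cloud.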

There’s a fine Dilbert cartoon where one of the characters proclaims “I’m gonna write me a new minivan this afternoon” after being told that he will be paid to fix bugs. I have visions of Developers suddenly running up huge bills by using incorrect DataTypes or including a nice big Cartesian Product.

One of the features of SQL Server Licensing that’s often overlooked is that Development and Test environments are free. You still need to pay for the Server and Windows, but no SQL Server Licence costs.

That’s different in the Cloud – you pay for everything. This in itself could make a big difference to costs and needs to be taken into consideration.

Perhaps we need a Test or Dev version of the code that is kept “On-Prem”, a “Pontypool Dry Ski Slope” if you will. Hopefully this will catch most of the accidents before they happen in the Cloud.

Snowflake uses a generic version of SQL, so we could use SQL Server for this, Developer Edition because it’s free – oh, the irony!

Security is also an issue. I am told that the Cloud based Data Centers have more protection than on-prem. But they’re still “out there”, whereas ours are just in the next building, below the gym. Somehow local feels better.

Either way, Security needs to be managed carefully.

Plenty of things to be considered and controlled, if only there was an existing role that’s been handling this type of work for years. Hold on, there is – a DBA.

Snowflake’s not the only game in town, but it is something that interests me, as do the other vendors coming our way.

To summarise: still alive, just adapt and survive.


I had a few good responses from this one, which I have pasted below:

David Brinn

I think there will always be a need for a DBA, but the role is evolving and includes so much more than it used to. The days of the DBA being the insurance policy against failures and the go-to person for that piece of code that runs like a dog are over. Instead the role includes DBA, development and BI work, with a bit of project management and mentoring for good measure. I agree that we need to adapt to survive, but at least that means we can work with the cool stuff. Great article.

Dave Postlethwaite

Well said. Crap code runs just as badly in the cloud as it does on premises, but the cost will be more apparent in the cloud. If a DBA can improve performance in the cloud and reduce the number of DTUs, then he/she will have shown their worth in monetary terms.
