Databricks and Snowflake Subject Matter Expert (SME)

Cloud Computing Job

  • Full Time
  • Anywhere

Cloud Computing Technologies

If you’ve got prior experience working with data cloud platforms that you’d like to cash in on, we’ve got an exciting opportunity for you. Cloud Computing Technologies is looking for a Databricks and Snowflake Subject Matter Expert (SME) to add to our office family, and we’d be glad to hear from any and all interested applicants.

Databricks and Snowflake Subject Matter Expert (SME) Job Requirements

  • A minimum of five years of hands-on experience with Databricks and Snowflake, including their use in an AWS cloud environment.
  • A minimum of five years of experience coding in Python, or otherwise working with complex code to deliver large-scale solutions across distributed or local filesystems.
  • Extensive expertise with Hadoop (HDFS) and other prominent big data technologies, plus supplementary knowledge of convolutional neural networks (CNNs) and other AI and machine learning technologies, with a focus on cloud-based implementations.
  • Bachelor’s degree in Computer Science or a related field (Master’s preferred; extensive experience within the domain may waive educational requirements).
  • Extensive knowledge of, and experience with, different kinds of data-tier technologies. We’ll expect you to know, for example, when a data warehouse is better suited to a particular use case than a federated data lake, and when the economics of a lake make it the more viable choice.
  • Exceptionally strong verbal and written communication skills. Effectively and concisely communicating ideas is the crux of this role, so your proficiency as a communicator and overseer of our overall process will be a focal point.
  • Ability and willingness to take on a fully remote role.

The Benefits:

Given the mission-critical nature of the role, we’re offering an industry-leading salary with all the perks you’d expect to go with it, including health and dental insurance, an auto allowance, a self-care and wellness allowance, and more. You’ll also get five weeks of paid time off and the option of flexible hours (within reason).

If you feel like you’re a good fit for this position, and you’d like to know more about what the role entails, please feel free to get in touch through our contact page. You can apply for the job through this page, or browse through other opportunities on our careers page. We at Cloud Computing Technologies can’t wait to have you on board, and we look forward to working with you!

The Role

As our Snowflake and Databricks Subject Matter Expert (SME), you’ll mostly be working in an advisory capacity with the rest of our engineering and development staff. You’ll be tasked with providing architecture-related guidance in interactions with business users and migration teams. You’ll consult on planning, developing, improving, and maintaining complex data-tier technology standards, keeping a bird’s-eye view over everything and ensuring that our solutions meet the standards set by presently available technologies.

You’ll act as a consultant to understand the current topology of the applications (both Databricks and Snowflake) and make recommendations on advanced cloud data-tier and integration choices. We’ll also expect you to drive innovation by staying on top of the most recent Snowflake and Databricks releases. Whenever you see an opportunity to add to the conversation about something we have in the pipeline, you’ll be expected to research and present potential alternatives or faults, helping keep our solutions as current as possible. Naturally, strong presentation, written, and verbal communication skills go a long way in this regard, so you’ll do well to have them in your arsenal.

You’ll also consult on tasks that involve migrating on-premises solutions to the cloud, and help pitch alternative Platform-as-a-Service (PaaS) solutions to our clients where better options are available. For these purposes, we expect you to have at least some familiarity with Azure ML and Google Cloud, as well as platforms and services like HDInsight and Azure Synapse, alongside the requisite knowledge of Amazon Web Services, Databricks, and Snowflake.

Of course, you’re also expected to keep up to speed on developments concerning data-tier technologies, especially those that may concern our clientele. If a particular client, for example, has a multi-terabyte database that’s starting to sprawl and requires migration to a more scalable data tier, we’ll need you to weigh in on their use-case-specific requirements and help them decide between the options available to them. You’ll play a key role in strategic decisions concerning the migration process and help us define a direction that works for the client. Once development begins, you’ll continue to support and assist the application and migration teams in designing their solutions. Where you see room for improvement, you’ll also work with our DevSecOps team to ensure that our process is as pristine as can be, consistently incorporating principles and best practices so that we meet our clients’ compliance needs, and defining those needs to the best of our knowledge.

Since we’re investing heavily in beefing up our AWS-related expertise, we’ll especially appreciate prior knowledge or experience with AWS technologies when working with either of the two platforms this position is focused on. Relating to Databricks, for example, experience using Databricks to prepare data for Amazon Redshift, using AWS Glue to make Databricks tables accessible to other AWS services for greater interoperability, or using Amazon SageMaker to deploy machine learning models will all help propel you forward in this role. You’ll be supporting and supplementing the work of data scientists and analysts working with our clients, so you should be prepared to leverage expertise relating to both structured and unstructured data as the need arises.
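
To give a flavor of the Databricks-to-Redshift workflow mentioned above, here’s a minimal PySpark sketch, assuming the Redshift connector bundled with recent Databricks Runtimes; the source path, JDBC URL, S3 bucket, and IAM role are placeholders, not part of any real deployment:

```python
# A minimal sketch of the Databricks-to-Redshift flow described above,
# assuming the "redshift" data source in recent Databricks Runtimes
# (older runtimes exposed it as "com.databricks.spark.redshift").
# All paths, URLs, buckets, and role ARNs below are placeholders.
from pyspark.sql import functions as F

# `spark` is the SparkSession Databricks provides in every notebook.
# Prepare raw event data: clean, filter, and aggregate.
events = (
    spark.read.format("delta").load("/mnt/raw/events")  # hypothetical source
    .filter(F.col("event_type").isNotNull())
    .groupBy("customer_id", "event_type")
    .agg(F.count("*").alias("event_count"))
)

# Write the curated result to Redshift, staging through S3 as the
# connector requires.
(
    events.write.format("redshift")
    .option("url", "jdbc:redshift://example-cluster.us-east-1.redshift.amazonaws.com:5439/analytics")
    .option("dbtable", "public.customer_event_counts")
    .option("tempdir", "s3a://example-temp-bucket/redshift-staging/")
    .option("aws_iam_role", "arn:aws:iam::123456789012:role/example-redshift-copy-role")
    .mode("overwrite")
    .save()
)
```

The S3 staging directory is what lets the connector use Redshift’s bulk COPY path rather than row-by-row JDBC inserts, which is the usual reason this pattern shows up in Databricks-to-Redshift pipelines.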

Storage and Analytics for the Future | Snowflake and Databricks

Cloud-based data warehousing has taken off in recent years, especially as the scope and feature set of cloud applications have grown to the point where the cloud is now everywhere, not just somewhere above us. Snowflake was a pioneering service in this regard, as the first platform to truly separate compute from storage. It could be argued that Snowflake single-handedly orchestrated the revival of data warehousing and its growth into a commercially viable technology at scale. As an architecture geared toward analytics, and specifically the practical demands of business intelligence, cloud-based warehousing allowed KPIs and other markers of interest to be monitored and calculated irrespective of scale.
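
That compute/storage separation is easy to see in practice. Here’s a small illustrative sketch using the snowflake-connector-python package; the account, credentials, warehouse, and table names are all hypothetical:

```python
# A small illustration of Snowflake's compute/storage separation,
# using the snowflake-connector-python package. Account, user,
# warehouse, and table names below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",  # key-pair auth or SSO is preferable in practice
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()

# Resizing the virtual warehouse changes only the compute cluster;
# the stored data is untouched, which is the core of the separation.
cur.execute("ALTER WAREHOUSE ANALYTICS_WH SET WAREHOUSE_SIZE = 'LARGE'")
cur.execute("SELECT event_type, COUNT(*) FROM events GROUP BY event_type")
for row in cur.fetchall():
    print(row)

# Suspending the warehouse stops compute billing; storage persists.
cur.execute("ALTER WAREHOUSE ANALYTICS_WH SUSPEND")
conn.close()
```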

Where Snowflake brought a cloud-based implementation to an existing data architecture, Databricks was a platform built around the Apache Spark unified analytics engine. Spark reimagined how functions are mapped to clusters, introducing resilient distributed datasets (RDDs), which serve as the building blocks for a framework free of the latency issues conventional MapReduce faces, especially at big data scale on a distributed file system. Spark reduced this latency by orders of magnitude, raising the bar for analytics firepower and broadening the scope of tasks an analytics engine could attempt. As a platform that integrates Apache Spark into many different cloud suites, including Google Cloud, Microsoft Azure, and Amazon’s AWS, Databricks paved the way for Spark’s practical efficiencies to be introduced to business processes, enabling an intelligence-driven revolution within cloud computing.
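
For a concrete sense of what the RDD model buys over classic MapReduce, consider this minimal PySpark sketch; the log paths and field layout are hypothetical:

```python
# A minimal sketch of the latency advantage described above: with RDDs,
# intermediate results can be cached in cluster memory across actions,
# instead of being written back to the distributed filesystem between
# every map and reduce pass, as in classic MapReduce.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("rdd-latency-sketch").getOrCreate()
sc = spark.sparkContext

lines = sc.textFile("/data/logs/*.txt")  # hypothetical tab-separated logs

# Transformations are lazy; nothing runs until an action is called.
errors = (
    lines.filter(lambda ln: "ERROR" in ln)
         .map(lambda ln: ln.split("\t"))
)
errors.cache()  # keep the filtered partitions in memory

# Both actions below reuse the cached data; an equivalent MapReduce
# pipeline would re-read and re-shuffle from disk for each job.
print("total errors:", errors.count())
per_service = errors.map(lambda fields: (fields[0], 1)).reduceByKey(lambda a, b: a + b)
print(per_service.take(10))

spark.stop()
```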

Both technologies have contributed heavily to broadening the scope of present-day business intelligence, letting machine learning tasks escape the limits of physical hardware. As industry leaders in architecting and deploying cloud-based solutions ourselves, we’ve watched Snowflake and Databricks tackle analytics from different starting points, and we feel it’s time to bring in an expert in both platforms to add to our team’s knowledge base.

Introducing the opportunity | Would you like to be Cloud Computing Technologies’ resident expert?

As our resident expert, you’ll take on a proactive, forward-looking role aimed at improving our ability to integrate both of these platforms into our existing solutions, as well as any solutions we may deploy in the future. You’ll consult with members at every level of our software development and engineering teams, letting them know what they can and can’t do with respect to the relevant platforms, and suggesting improvements after weighing the tradeoffs between the two platforms where both may be implementable. We’ll get into more detail about the role and what it entails, but first, let us tell you a bit more about ourselves.

About the company

Cloud Computing Technologies is a cloud solutions provider that’s been working with cloud-based technologies for just about as long as they’ve been around. We’ve handled tasks at every level, from systems architecture design, development, and deployment, to large-scale migration, integration, and upscaling projects, as well as robust and dynamic solutions design. We’ve been around for a while and handled a great variety of tasks, is what we’re saying, and now we’re looking to add to our team with a mind to face the future head on!

If you drop by any one of our offices in Arizona, Massachusetts, California, Colorado, or Washington D.C., you’ll likely witness a small but diverse team of dedicated professionals buzzing about the place, working, talking or just planning out their days. What you won’t see is a bunch of people sitting amidst grey walls at their desks bored out of their minds – we’re not about that life. What we want is to empower tomorrow’s leading professionals, and we wouldn’t be able to do that if we clung to the idea of a classical office environment. Initiative and a drive to innovate are the two things we look for in every member of our staff, so if you’ve got an abundance of either you’re likely an ideal fit for us!

We’re all about flexibility within the workspace, and that flexibility isn’t limited to just the choice to work from home or from the office. We believe that the office should be a place that welcomes you, and in doing so, brings out the best of your latent creative and innovative talents. It’s a place you enjoy being at, and a place that helps you be the best you can be.

Apply now for this once-in-a-lifetime position!


To apply for this job email your details to careers@cloudcomputingtechnologies.com

Cloud Computing Technologies selects candidates based primarily on their skills and experience, prioritizing individuals who best meet the qualifications for the position.