Solutions Architect

at Snowflake

Remote
Information Technology
Information Technology & Services
Software Architecture

Employment type:

Flexitime
Full-time

Skills:

Apache Spark
Python
SQL
AWS
DevOps
Platform as a Service
Oracle Database
Orchestration
Replication
Scalability
Data management
Apache Hadoop
Database normalization
Apache Parquet
Apache Avro
Published on:
Application deadline:

Who We Are

We are technology thought leaders dedicated to mobilizing the world's data.

What We Do

We provide expert advisory services and act as trusted technology advisors to our clients.

What a Solutions Architect at Snowflake Does

As a Solutions Architect, you will:

  • Engage with several Snowflake customers.
  • Lead technical streams of client data platform implementations and on-boarding efforts.
  • Work with other Customer Engagement and Delivery matrix resources to ensure proper technical guidance, project management, and functional support.
  • Collaborate as needed with different Snowflake organisations: Engineering, Support, Sales and Marketing.
  • Help clients troubleshoot their implementations and integrate the product within their ecosystem.
  • Identify, document, triage and track issues to ensure resolution.
  • Actively contribute to the growth and scalability of the Customer Delivery and Training and Education teams through robust documentation, continuous process optimization, and capability cross-training.
  • Apply industry/domain/technology expertise to client implementations.
  • Gather intelligent product feedback and recommendations from customers to design and inform new features and capabilities.

What We Are Looking For

We are seeking candidates with expertise in cloud and data platforms, and 12 years of experience. Key qualifications include:

  • Experience with distributed systems and massively parallel processing technologies and concepts such as Snowflake, Teradata, Spark, Databricks, Hadoop, Oracle, SQL Server, and performance optimisation.
  • Knowledge of data strategies and methodologies such as Data Mesh, Data Vault, Data Fabric, Data Governance, Data Management, Enterprise Architecture.
  • Understanding of data organisation and modeling concepts and techniques such as Data Lake, Data Warehouse, Medallion architecture, Kimball dimensional modeling, and 3NF database normalisation.
  • Familiarity with infrastructure concepts such as the cloud hyperscalers (e.g. AWS and Azure) and the fundamentals of IaaS, PaaS, Networking, Security, Encryption, Identity and Access Management, and Disaster Recovery Planning.
  • Proficiency in Data Engineering concepts and frameworks such as batch processing, stream processing, replication, SQL, DBT, Talend, Informatica, Python, Snowpark, PySpark, DataFrames, storage formats (e.g. Parquet, Avro, Apache Iceberg, Delta Lake), orchestration, and DevOps.
  • Experience with Business Intelligence and analytics solutions such as Tableau, PowerBI, MicroStrategy, Thoughtspot, SAS, and Streamlit, and techniques such as time series analysis, advanced SQL, and statistical analysis (e.g. linear regression, variance analysis, modeling, and forecasting).
  • Fundamental understanding of AI/ML concepts such as classification, regression, clustering, dimensionality reduction, Natural Language Processing, and Language Models.
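To illustrate the kind of statistical analysis mentioned above, here is a minimal sketch of an ordinary least-squares linear regression in plain Python; the data and function name are hypothetical, invented purely for illustration.

```python
# Hypothetical sketch: fit y = slope * x + intercept by ordinary least squares.
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope is the covariance of x and y divided by the variance of x.
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Example data lying exactly on y = 2x.
xs = [1, 2, 3, 4, 5]
ys = [2, 4, 6, 8, 10]
slope, intercept = fit_line(xs, ys)
```

In practice this would be done with a library such as statsmodels or scikit-learn (or in SQL/Snowpark), but the closed-form computation above is the underlying technique.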

Presence in the office once a week is expected.

What We Are Offering

  • A supportive environment where we help each other to stay on top of the game.
  • A competitive compensation package.
  • Unique benefits like SnowBreaks.
  • Very attractive parental leave.

We look forward to speaking with you!