Archetype’s Analytics Insights consulting practice develops insights for our customers through innovative technology solutions. Archetype sees the value in data science and self-service analytics and brings these capabilities to our customers. At Archetype, you will work with best-of-breed products and services, including Snowflake, AWS, Azure, GCP, Fivetran, dbt, Tableau, ThoughtSpot, Dataiku, and DataRobot, and partner with our strong, growing consulting team.
What You’ll Do
Strategic Solutioning: Develop compelling proposals, presentations, and statements of work (SOWs) to drive business growth. Identify data and analytics opportunities for Archetype in both existing and new client environments.
Architectural Excellence: Lead the modern data architecture pillar, ensuring excellence in delivery, sales, people management, and operations.
Subject Matter Expertise: As a senior leader, engage with clients, influence executives, and drive customer-focused solutions. Leverage your deep understanding of data and analytics to deliver value.
Financial Insights: Conduct TCO (Total Cost of Ownership) and ROI (Return on Investment) analyses to guide technical decisions.
Platform Design: Architect, design, and develop end-to-end data and analytics platforms (e.g., data warehouses, data lakes, ML models) using cloud services.
Revenue Impact: Director hires will directly influence revenue of $1M+, while Senior Director hires will carry responsibility for $2.5M+.
What You’ll Bring
Experience: 10+ years in consulting leadership, with at least 5 years specializing in data architecture and engineering solutions within cloud environments (Azure, AWS, GCP, Snowflake).
Leadership: 4-8 years of team leadership, including technical pre-sales experience.
Effective Communication: Proven ability to convey complex technical concepts to non-technical stakeholders.
Business Development: Track record of growing existing client relationships and developing new accounts.
Cloud Expertise: Deep familiarity with cloud platform analytics services, including storage, permissions, private cloud, databases, virtual machines, and parallel processing technologies.
Required Experience
8+ years as a hands-on Solutions Architect designing and implementing data solutions.
4+ years of prior consulting leadership experience working with external customers, with the ability to multitask, prioritize, shift focus frequently, and work across a variety of projects.
Programming expertise in Java, Python, and/or Scala.
Experience with core cloud data platforms, including Snowflake as well as AWS, Azure, Databricks, or GCP.
Proficiency in SQL, including the ability to write, debug, and optimize SQL queries.
Demonstrated expertise in leading and managing a team of Solution Architects and Data Engineers, fostering internal growth through coaching, mentoring, and performance management.
Proven track record of collaborating with client stakeholders, technology partners, and cross-functional sales and delivery team members across distributed global teams, ensuring seamless, successful project delivery.
Ability to build strong cross-practice relationships to drive customer success.
A strong sense of ownership in resolving challenges, with a commitment to exceptional outcomes across all aspects of project execution.
Ability to take end-to-end technical solutions into production, helping ensure performance, security, scalability, and robust data integration.
Client-facing written and verbal communication skills and experience.
Ability to create and deliver detailed presentations.
Ability to produce detailed solution documentation (e.g., POCs, roadmaps, sequence diagrams, class hierarchies, and logical system views).
4-year Bachelor's degree in Computer Science or a related field.
Preferred Qualifications (Nice-To-Have)
Production experience in core data platforms: Snowflake, AWS, Azure, GCP, Hadoop, Databricks.
Cloud and Distributed Data Storage: S3, ADLS, HDFS, GCS, Kudu, Elasticsearch/Solr, Cassandra, or other NoSQL storage systems.
Data integration technologies: Spark, Kafka, event streaming, StreamSets, Matillion, Fivetran, NiFi, AWS Database Migration Service, Azure Data Factory, Informatica Intelligent Cloud Services (IICS), Google Dataproc, or similar.
Multiple data sources (e.g., queues, relational databases, files, search, APIs).
Complete software development life cycle experience, including design, documentation, implementation, testing, and deployment.
Automated data transformation and data curation: dbt, Spark, Spark Streaming, and automated pipelines.
Workflow Management and Orchestration: Airflow, Amazon Managed Workflows for Apache Airflow (MWAA), Luigi, NiFi.
Methodologies: Agile Project Management, Data Modeling (e.g., Kimball, Data Vault).
Most of the work can be performed remotely, but occasional travel to client sites may be required.