We are seeking a Snowflake Engineer / Architect to join our dynamic team! This position contributes to the assessment, design, build, and deployment of Data Cloud solutions using industry best practices. You will be part of a highly innovative, fast-paced team, incorporating and inventing emerging software technologies to deliver best-practice Data Cloud solutions. The Snowflake Engineer / Architect will play a hands-on role in driving the architecture, design, and implementation of Snowflake for our clients.
Responsibilities
- Participate as a Snowflake Engineer on the project team, providing agile software delivery and best-practice expertise to deliver results for the organization
- Play a key role in delivering data-driven solutions for our client.
- Proactively develop relationships with clients to understand their needs and effectively translate those needs into outcome-driven solutions.
- Provide expertise as a Snowflake SME to solve complex business issues, translating them into data integration, processing, analytics, and storage designs in Microsoft Azure.
- Design, develop, and operate highly reliable, large-scale data lake systems
- Embrace Snowflake innovations alongside open-source standards and toolsets
- Ensure all deliverables are high quality by setting development standards, adhering to them, and participating in code reviews
- Participate in integrated validation and analysis sessions of components and subsystems on cloud servers
- Perform market research and make specific recommendations, including SWOT analysis (Strengths, Weaknesses, Opportunities, Threats)
- Maintain current knowledge of IT and industry trends and best practices
- Provide IT deployment planning and implementation support.
- Ensure successful analysis, design, evaluation, modification, testing, and implementation of agile software development solutions across the enterprise.
- Work closely with the Project Manager to develop detailed plans and assist with business process changes, problem resolution, and functional decisions.
- Proactively anticipate potential roadblocks and take corrective action to keep the project on track while escalating issues to project management as needed.
Qualifications
- Experience assessing, designing, deploying, and/or sustaining applications within the Snowflake ecosystem, including experience with large-scale, data-intensive distributed systems, especially query engines, object storage, data warehouses, data lakes, data analytics, SQL/NoSQL databases, distributed file systems, and data platform infrastructure
- Expertise in Snowflake concepts such as setting up resource monitors, integration with MATLAB, RBAC controls, virtual warehouses, query performance tuning, zero-copy cloning, and Time Travel, and an understanding of when to use each of these features
- Expertise in Snowflake data modeling; ELT using Snowpipe and other tools such as Azure Data Factory and Apache NiFi; implementing stored procedures; and standard DWH and ETL/ELT concepts
- Proven track record of successfully building data-driven solutions using integration, big-data processing, and database and storage technologies
- Experience establishing data quality processes, performing data analysis, participating in technology implementation planning, and implementing data integration processes
- Experience in architecting data pipelines and solutions for both streaming and batch integrations using Azure Data Factory.
- Experience migrating data to the Snowflake cloud data warehouse
- Experience developing dbt models to transform data into useful, actionable information
- Expertise in designing ETL frameworks, with several years of hands-on experience implementing all aspects of data pipelines, including ingestion, transformation, data quality checks, monitoring, alerting, and notifications
- Understanding of enterprise data management concepts (Data Governance, Data Engineering, Data Science, Data Lake, Data Warehouse, Data Sharing, Data Applications)
- Significant experience in systems analysis and solution development of IT solutions to meet government business requirements
- Significant experience translating and mapping business requirements to technical requirements
- Experience in Microsoft Azure, Python, Java, C++, and SQL
- Experience developing solutions in at least 4 of the following technology areas:
- Government Cloud IaaS and PaaS (FedRAMP-authorized solutions)
- Microservices and Containers
- DevSecOps
- Data Analytics
- ML/AI (Machine Learning / Artificial Intelligence)
Educational & Work Qualifications
- Four-year degree in Computer Science, Information Technology, or equivalent experience.
- Must possess IT-II security clearance or have a current National Agency Check with Local Agency Check and Credit Check (NACLC).
- DoD Approved 8570 Baseline Certification: Category IAT Level I
- No visa sponsorship is available, and candidates must be able to pass a government background check (be eligible to hold a government clearance).
CIYIS is an Equal Opportunity Employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.