Hyliion · Cedar Park, TX
Are you ready to “Change the World?” Does “Succeed as One Team” meet your definition of a key component to an amazing company culture? If “Acting with Integrity” and pursuing “Excellence in All We Do” motivate you to do your best work, then Hyliion may be the right fit for you.
Hyliion is looking for a Data Engineer (DE) to join our team in Cedar Park, TX. This full-time position offers the opportunity to be part of a fast-growing company that is revolutionizing the trucking industry. You will join a growing IT team with a remit to build and deploy cloud-first systems from the ground up, leveraging a modern data stack. In this role, you will work across functional lines to deploy, integrate, and enhance our data product portfolio.
Responsibilities:
- Architect solutions to efficiently obtain, process, and store large, complex data sets for analytics and ML applications (e.g., vehicle telemetry/IoT data, business and custom app data, files, images).
- Work with the team to design and build a scalable and resilient E2E Data & Analytics platform in the Microsoft Azure ecosystem from the ground up.
- Establish best practices and maintain documentation, including full technical details and clear visual diagrams, so solutions are easy to understand.
- Work with various SaaS application vendors and Hyliion users to understand data integration needs and execute on those requirements. Contribute to developing internal processes to efficiently collect requirements.
- Prioritize access and audit controls while proposing and implementing solutions to ensure compliance and security.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data segregated and secure across national boundaries through regional deployments.
- Create data tools for analytics and data science team members that help them build and optimize our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Obtain input from and negotiate with product and software development teams; deliver verified software features, components, builds, and support to product teams and end users.
- Provide problem-resolution leadership for complex systems with a high degree of ambiguity and often global business impact.
Job Knowledge, Skills, and Abilities Requirements:
- Demonstrated experience in building highly scalable architectures.
- Experience designing RESTful, versioned API microservices.
- Proficient with CI/CD pipelines, Docker and Kubernetes container orchestration, routing meshes, message buses, and API gateways.
- Ability to work in a fast-paced environment and the skills to deal with ambiguity.
- Comfortable with learning new skills to deliver high quality solutions.
- Ability to work well under minimal supervision.
- A Bachelor’s Degree, preferably in Computer Science, Information Technology, or related field.
- 4+ years developing and supporting Data Integration solutions.
- 2+ years in Data Integration design and development.
- 2+ years with Python/Scala/R.
- 2+ years on the Azure cloud platform using Azure Data Factory, Event Hubs, Azure Stream Analytics/Kafka, Azure Functions, Azure SQL/KQL, ADLS Gen2/Azure Data Lake, or equivalent cloud technologies.
- Automotive or Manufacturing experience is a plus.
- Understanding of open-source technologies in the modern data stack.
- Knowledge of distributed architecture features and challenges.
- Experience in configuring Azure Data Lake and Azure SQL databases is a big plus (provisioning ADLS Gen2, configuring security, configuring users and roles in Azure Active Directory, provisioning databases, performance tuning, disaster recovery, auto-scaling, etc.).
- Skillsets used: Azure Data Factory (ADF) V2, ADLS, Databricks, Spark SQL, Python/Scala, Azure SQL DB, Git.
- Knowledge of Scrum and DevOps methodologies.