Bytecode IO is a remote data analysis and data engineering consulting company. We provide our clients with expertise in data analysis, data engineering, data platform architecture, and business intelligence. We are looking for a savvy Data Engineer to join our growing team of analytics experts.
Our clients task us with managing billions of rows of data from dozens of sources, organizing them, visualizing them, and analyzing them to help inform both short- and long-term decision-making. We help customers solve their hardest data challenges with the best technology in the marketplace.
Want the freedom to work anywhere in the United States, set your own schedule, and have the support of industry-leading experts to achieve your professional goals? Come join the Bytecode IO team, where data is our passion.
The ideal candidate is an experienced data engineer with a passion for building pipelines, optimizing databases, and developing meaningful visualizations.
This role works with our high-growth startup clients to build out their analytics platforms, enabling them to make data-driven decisions. It can be messy, but it’s our job to identify, pipe, store, and visualize their data with the best technologies available.
The Data Engineer will work on multiple, concurrent initiatives and ensure optimal delivery architecture and best practices are being implemented. They must be self-directed and comfortable supporting the data needs of multiple clients.
Responsibilities
- Assemble large, complex data sets that meet functional and non-functional business requirements
- Guide customers on developing and governing their data model around stated use cases to capture business rules and SQL logic
- Serve in a consultative role, helping clients with SQL transformations, visualization design, API integrations, scaling, and other technical concerns
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies
- Develop actionable analytic dashboards and visualizations to help customers drive their business
Qualifications and Requirements
- Excellent communication skills and proven ability to work with customers to derive clarity from ambiguous goals and requirements
- Advanced SQL knowledge, including experience with relational databases and query authoring, and familiarity with a variety of database systems
- Experience designing and building Business Intelligence solutions with tools such as Looker, Tableau, QlikView, Chart.io, Sisense, Periscope, Power BI, or Spotfire
- Experience building and optimizing ‘big data’ pipelines, architectures and data sets
- Experience developing processes for data transformation, data structures, metadata, dependency and workload management
- Strong quantitative background and/or data science experience (preferably in a consulting or technology role)
These are the technologies we work with, so experience with any of them is ideal. For those you don’t know, we can help you become an expert.
- Relational SQL and NoSQL databases, including Redshift, Snowflake, MySQL, PostgreSQL, SQL Server, MongoDB, and Cassandra
- Data pipeline and workflow management tools: Azkaban, Luigi, and Airflow
- AWS cloud services, including EC2, EMR, ElastiCache, RDS, and Redshift
- Stream-processing systems: Storm and Spark Streaming
- Languages: Python, Ruby, Java, and shell scripting
- Big data tools: Hadoop, Spark, and Kafka
If you are interested, please send your resume to firstname.lastname@example.org
This is a US-based remote contract position.