Company | Location | Salary
---|---|---
Horizon Blockchain Games | Toronto, Canada | $81k - $100k
Merkle Science | Bangalore, India | $81k - $168k
OP3N WORLD | Los Angeles, CA, United States | $77k - $115k
Aradena: Battlegrounds | Remote | $90k - $180k
Consensys | Remote | $120k - $220k
OpenSea | Remote | $160k - $305k
BitGo | Bangalore, India |
Rarible | Lisbon, Portugal | $91k - $153k
OKX | Singapore, Singapore | $72k - $110k
Aptos | San Francisco, CA, United States | $91k - $120k
Glassnode | Europe | $72k - $75k
Crypto.com | Hong Kong, Hong Kong | $54k - $95k
WOO Network | Taipei, Taiwan | $140k - $150k
Gauntlet Networks | Remote | $150k - $180k
Ramp Network | Warsaw, Poland | $74k - $148k
This job is closed
Lead/Senior Data Engineer
Responsibilities
- Ensure all teams are drawing data from a single source of truth
- Work with our current systems in GCP, Snowflake, and Fivetran, and help evolve them
- Ensure data integrity, so that data from many sources is transformed into a useful format and is correct
- Automate as much of the data pipeline as possible
- Quickly debug data discrepancies
- Scale our data capacity by 10x for the next 2-3 years
- Implement tooling in Go and Typescript so all engineers can meter products with client libraries
- Design data backup and recovery protocols
- Ensure the data is easily digestible and usable through a BI tool like Looker or Tableau, at an acceptable cadence (realtime, daily, monthly, yearly)
- Help guide a small team of data engineers on roadmap, design, and implementation
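The integrity and automation bullets above can be sketched as a reconciliation check: verify that a transformed warehouse table still matches its raw source. This is a minimal, hypothetical sketch only; the table names are invented and `sqlite3` stands in for a warehouse like Snowflake.

```python
import sqlite3

def row_count(conn: sqlite3.Connection, table: str) -> int:
    # NOTE: table names cannot be bound as parameters; this assumes
    # trusted, internally generated table names.
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

def check_integrity(conn: sqlite3.Connection, source: str, target: str) -> bool:
    """Return True when the target table has exactly as many rows as the source."""
    return row_count(conn, source) == row_count(conn, target)

# Hypothetical raw-vs-transformed tables for demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (id INTEGER)")
conn.execute("CREATE TABLE events (id INTEGER)")
conn.executemany("INSERT INTO raw_events VALUES (?)", [(i,) for i in range(3)])
conn.executemany("INSERT INTO events VALUES (?)", [(i,) for i in range(3)])
print(check_integrity(conn, "raw_events", "events"))  # True
```

A real pipeline would run checks like this on a schedule and alert on mismatches; row counts are only the simplest of many possible reconciliation metrics.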
What You Bring
- 2-4 years relevant industry experience
- Strong working knowledge of at least one data warehouse like Snowflake, Redshift, Clickhouse, etc.
- Ability to work well cross-functionally with other stakeholders and teams
- Strong programming skills, not just hacking ability
- Ability and strong willingness to quickly learn about blockchain technology, its data structures, schemas, and ecosystem
- Excellent SQL skills
- Experience with distributed log systems like Kafka, Spark, etc. a plus
- Expertise working with large software systems in one of the major cloud providers: AWS, Google Cloud, Azure
- Strong understanding of building fault-tolerant, scalable systems
- Experience with at least one scripting language (Python, bash, Ruby, etc.)
- Big plus if you have experience working with Go, Typescript, or similar derivatives
- Interest in web3
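As an illustration of the "Excellent SQL skills" requirement, the sketch below ranks daily transaction volume per chain with a window function. Table and column names are invented for the example, and `sqlite3` stands in for a warehouse engine.

```python
import sqlite3

# Hypothetical daily-volume table; the schema is made up for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE daily_volume (day TEXT, chain TEXT, tx_count INTEGER)")
conn.executemany(
    "INSERT INTO daily_volume VALUES (?, ?, ?)",
    [
        ("2023-01-01", "ethereum", 100),
        ("2023-01-02", "ethereum", 150),
        ("2023-01-01", "polygon", 80),
    ],
)

# RANK() OVER (...) orders each chain's days by volume, highest first.
rows = conn.execute(
    """
    SELECT chain, day, tx_count,
           RANK() OVER (PARTITION BY chain ORDER BY tx_count DESC) AS rnk
    FROM daily_volume
    """
).fetchall()
print(rows)
```

Window functions like this (RANK, LAG, SUM OVER) are everyday tools in warehouse SQL, whether the engine is Snowflake, Redshift, or Clickhouse.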
What does a data scientist in web3 do?
A data scientist in web3 focuses on data generated by the web-based technologies and applications that make up the larger web3 ecosystem. This can include data from decentralized applications (DApps), blockchain networks, and other distributed and decentralized systems. In general, a data scientist in web3 is responsible for using data analysis and machine learning techniques to help organizations and individuals understand, interpret, and make decisions based on the data these systems generate. Specific tasks might include developing predictive models, conducting research, and creating data visualizations.
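A toy version of the exploratory analysis described above might aggregate DApp transaction data before any modeling or visualization. The tiny sample dataset below is entirely made up for illustration.

```python
from collections import defaultdict
from statistics import mean

# Invented sample of on-chain transactions, keyed by DApp.
transactions = [
    {"dapp": "dex-a", "value_eth": 1.0},
    {"dapp": "dex-a", "value_eth": 3.0},
    {"dapp": "nft-b", "value_eth": 5.0},
]

# Average transaction value per DApp: a typical first exploratory metric.
by_dapp = defaultdict(list)
for tx in transactions:
    by_dapp[tx["dapp"]].append(tx["value_eth"])

avg_value = {dapp: mean(vals) for dapp, vals in by_dapp.items()}
print(avg_value)  # {'dex-a': 2.0, 'nft-b': 5.0}
```

In practice the data would come from node RPCs, indexers, or a warehouse rather than a literal list, but the shape of the work (group, aggregate, then model or visualize) is the same.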