Company | Location | Salary
---|---|---
evolve24 | Herndon, VA, United States | $105k - $111k
Anti Capital | New York, NY, United States | $18k - $36k
Cash App | San Francisco, CA, United States | $138k - $169k
Drift Labs | New York, NY, United States | $45k - $90k
Storm2 | United States | $72k - $80k
Afterpay | Los Angeles, CA, United States | $72k - $80k
TRM Labs | Remote | $86k - $110k
Immuna | Remote | $60k - $200k
Arda | Remote | $20k - $100k
Impact Theory | Los Angeles, CA, United States | $105k - $111k
Fractal Wealth | New York, NY, United States | $25k - $60k
Status | Zug, Switzerland | $54k - $80k
Oracle | United States | $87k - $2k
Cisco | United States | $16k - $25k
Mammoth Media | Santa Monica, CA, United States | $18k - $27k
This job is closed
TITLE: Data Analyst
LOCATION: Herndon, VA (Hybrid)
Responsibilities
- Collect and maintain data from operational systems and databases, use statistical methods and analytics tools to interpret it, and prepare dashboards and reports for users
- Apply experience in data analysis, reporting (e.g., Tableau), and data visualization
- Work across a wide range of data engineering tasks - data extraction, data cleansing/processing, feature engineering, data integrity analysis, maintaining datasets, etc.
- Work with internal and external APIs (e.g., Twitter, YouGov) to push or pull data
- Prototype, architect, and build tools (web applications, software systems) to automate and/or scale processes
- Develop solutions and prototypes for problems using algorithms or models based on machine learning, statistics, and optimization, and work with engineering to productionize them
- Conceptualize and create datasets from raw data to be used for analysis and/or model building
- Communicate findings to stakeholders to drive business and marketing decisions
- Write documentation of tools, methods, and datasets
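The cleansing and feature-engineering work listed above can be sketched in a few lines. This is a minimal stdlib-only illustration; the record fields (`user`, `ts`, `score`) and values are made up for the example, not taken from the posting.

```python
from datetime import datetime

# Hypothetical raw records, as they might arrive from an external API;
# the field names here are illustrative only.
raw = [
    {"user": "a1", "ts": "2023-01-05T10:00:00", "score": "0.82"},
    {"user": "a1", "ts": "2023-01-06T11:30:00", "score": ""},   # missing value
    {"user": "b2", "ts": "2023-01-05T09:15:00", "score": "0.41"},
]

def clean(records):
    """Drop rows with missing scores, normalize types, derive a weekday feature."""
    out = []
    for r in records:
        if not r["score"]:
            continue  # data cleansing: skip incomplete rows
        ts = datetime.fromisoformat(r["ts"])
        out.append({
            "user": r["user"],
            "score": float(r["score"]),    # type normalization
            "weekday": ts.strftime("%A"),  # feature engineering
        })
    return out

dataset = clean(raw)
```

In practice this step would read from a database or API response rather than an inline list, but the shape of the work (filter, parse, derive) is the same.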
Qualifications
- Degree in Computer Science, Physics, Machine Learning, Math, Bioinformatics, Engineering, or equivalent technical field
- 3+ years of work experience in data science, machine learning, data engineering, or analytics roles
- Proficiency in working with SQL and/or NoSQL database technologies
- Proficiency in Python (or R/Julia/Java)
- Working knowledge of algorithms and data structures
- Proficiency with toolkits such as NumPy, SciPy, Pandas, Scikit-learn
- Knowledge of underlying mathematical foundations of statistics, machine learning, optimization, and analytics
- Strong analytical and problem-solving skills, including the ability to apply quantitative analysis techniques to real-world business problems
- Experience with version control systems such as Git
- Ability to design and analyze incrementality experiments (A/B tests, etc.)
- Familiarity with machine learning techniques and algorithms (k-NN, SVM, DT, etc.)
- Crypto experience is a plus
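As a reference point for the k-NN technique named in the qualifications, here is a toy stdlib-only sketch of k-nearest-neighbors classification (majority vote over the k closest training points); the data and class labels are invented for illustration.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of (features, label) pairs; distance is Euclidean.
    """
    neighbors = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy 2-D data: two well-separated clusters.
train = [((0.0, 0.0), "A"), ((0.1, 0.2), "A"), ((0.2, 0.1), "A"),
         ((1.0, 1.0), "B"), ((0.9, 1.1), "B"), ((1.1, 0.9), "B")]

print(knn_predict(train, (0.15, 0.1)))  # close to the "A" cluster
```

Scikit-learn's `KNeighborsClassifier` provides the production version of this idea; the point here is only the underlying algorithm.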
Applicants selected may be subject to a government background investigation and may be required to meet the following conditions of employment:
- A favorable credit check for all cleared positions
- U.S. citizenship
- Successful completion of a background investigation
What does a data scientist in web3 do?
A data scientist in web3 is a type of data scientist who focuses on working with data related to the development of web-based technologies and applications that are part of the larger web3 ecosystem.
This can include working with data from decentralized applications (DApps), blockchain networks, and other types of distributed and decentralized systems.
In general, a data scientist in web3 is responsible for using data analysis and machine learning techniques to help organizations and individuals understand, interpret, and make decisions based on the data generated by these systems.
Some specific tasks that a data scientist in web3 might be involved in include developing predictive models, conducting research, and creating data visualizations.
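As one concrete (offline) flavor of the analysis described above, the sketch below aggregates net token flows per address from transfer records. The addresses and amounts are made up; real work would pull these records from a node or indexer rather than an inline list.

```python
from collections import defaultdict

# Hypothetical on-chain transfer records; addresses and values are invented.
transfers = [
    {"from": "0xabc", "to": "0xdef", "value_eth": 1.5},
    {"from": "0xabc", "to": "0x123", "value_eth": 0.5},
    {"from": "0xdef", "to": "0xabc", "value_eth": 2.0},
]

def net_flows(txs):
    """Net flow per address: inflows minus outflows."""
    net = defaultdict(float)
    for t in txs:
        net[t["from"]] -= t["value_eth"]  # outgoing value
        net[t["to"]] += t["value_eth"]    # incoming value
    return dict(net)

flows = net_flows(transfers)
```

Per-address aggregates like this are a common starting point for the dashboards, predictive models, and visualizations the role describes.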