
Pinned

  1. boringPpl/data-engineer-roadmap Public

    Learning from multiple companies in Silicon Valley: Netflix, Facebook, Google, and startups.

    902 stars · 194 forks

  2. boringPpl/data-science-roadmap Public

    Learning from multiple companies in Silicon Valley: Netflix, Facebook, Google, and startups.

    620 stars · 126 forks

  3. auto-rename-and-organize-videos-based-on-Netflix-ids-and-genres Public

    Toy problem: practice generating fake data, then rename and organize the files into folders by genre.

    Jupyter Notebook

  4. web-scraping-Linkedin-profiles Public

    Problem: it takes 10 s on average to skim a LinkedIn profile and copy its information into an Excel sheet. To collect a large enough amount of data for analysis purposes, it will take time i…

    Python · 2 stars · 2 forks

  5. boringPpl/data-project-guideline-from-Netflix Public

    "Data science is such a nebulous term. To some, it means data analytics; to some it is synonymous to machine learning; others think there is a data engineering flavor to it. The wide spectrum of po…

    5 stars · 1 fork

  6. forex-crawling-and-visualization Public

    This project collects forex data in real time and visualizes it in a web browser based on user input. The crawler was deployed to Google Cloud to run 24/7, and the data is stored in MongoDB.

    Python · 1 star