#30DaysOfDataViz – Learn Data Viz

Mission: Uplevel your data visualization skills with #30DaysOfDataViz

3 Rules

The main rule: Create one data visualization every day for the next 30 days.

Further details:

  • Create one data visualization every day for 30 days, using Excel, Google Sheets, BigQuery, Python, R, or whatever tool you choose
  • Tweet your progress daily with the hashtag: #30DaysOfDataViz
  • Each day, reach out to at least two people on Twitter who are also doing the challenge and give feedback on their visualizations

Hint: If you need inspiration, there are 30 days' worth of #30DaysOfDataViz examples below.
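To make the daily rule concrete, here's a minimal sketch of what a "day one" submission could look like in Python with matplotlib. Any tool counts – this is just one illustrative option, and the data is made up:

```python
# A minimal "day one" visualization: a simple bar chart saved to a PNG.
# The dataset here is hypothetical -- swap in anything you care about.
import matplotlib
matplotlib.use("Agg")  # render to a file without needing a display
import matplotlib.pyplot as plt

# Made-up data: hours spent practicing data viz per weekday
days = ["Mon", "Tue", "Wed", "Thu", "Fri"]
hours = [1.5, 2.0, 1.0, 2.5, 3.0]

fig, ax = plt.subplots(figsize=(6, 4))
ax.bar(days, hours, color="steelblue")
ax.set_title("Hours Spent Practicing Data Viz")
ax.set_ylabel("Hours")
fig.tight_layout()
fig.savefig("day01.png")  # ready to tweet with #30DaysOfDataViz
```

Ten lines of plotting code is plenty for a daily entry – the habit matters more than the polish.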

Huge Thanks!: This challenge and page were heavily inspired by #100DaysOfCode – go check them out if you’re interested!


Publicly commit to the challenge:

If you’ve decided to commit to the challenge, click here to tweet it to the world, making yourself accountable and taking your goal to the next level!

Tweet to commit to the challenge!


The Why

TL;DR: Build a data portfolio, learn, and meet a community of awesome data people.


Next Steps

If you’ve decided to join, here are the steps you need to go through:

  1. Tweet to commit to the challenge!
  2. Plan: Formulate what you want to work on during the challenge. It might involve learning a new type of visualization, learning how to add a narrative to a chart, or even mastering color choices.
  3. From today on, for the next 30 days, tweet your progress every day using the #30DaysOfDataViz hashtag. Others will comment on your data visualizations to help you learn and grow.
  4. Explore the #30DaysOfDataViz hashtag daily and comment on other people’s submissions. Help them understand what could make their visualizations better, or let them know when they’re great! Hint: Follow the #30DaysOfDataViz Twitter Bot to see other submissions.

Steps to increase the likelihood of success

  • (Optional, but highly recommended) Fork this GitHub repo and follow along with the example visualizations. Add your own flair to any code you see.
  • Follow the #30DaysOfDataViz Twitter Bot, which retweets all tweets containing the #30DaysOfDataViz hashtag. It’s a great way to stay motivated and to participate in the community.
  • The goal is to create a beautiful data viz that communicates effectively. Use whatever tool you feel comfortable with. I’ll finish mine up in Google Sheets or Photoshop.
  • Important: Encourage others who are doing the same challenge on Twitter or elsewhere – give them props when they post an awesome data visualization. Sharing can feel vulnerable, so a little encouragement goes a long way.
  • Reach out to others and introduce yourself! Once you land in the community, you’ll find that everyone is excited about learning and growth. Plus, the more people you meet, the more likely you are to finish your goal.
  • If you find a great, helpful resource that others would benefit from, either submit a Pull Request to add it to the repo, or just tweet at me (see info below).
  • Need help finding datasets? Check out SF Data, Kaggle Data, US Data, or DataQuest Datasets.

#30DaysOfDataViz – Examples

If you want to follow along with these exercises, clone this GitHub repo and check out the videos below.

Remember to add your own flair to these viz and share them on Twitter!

Day 1: Multi-Line Time Series – Youtube/Twitter/Code
Day 2: Scatter Plot w/ Color + Size (Matplotlib) – Youtube/Twitter/Code
Day 3: Animated Bar Race Chart (Using D3) – Youtube/Twitter/Code
Day 4: Violin Plot (Seaborn) – NY Ride Data – Youtube/Twitter/Code
Day 5: Dark Mode Scatter Plot (Matplotlib) – Youtube/Twitter/Code
Day 6: Ridge Plot (Seaborn) – Lebron James PTS – Youtube/Twitter/Code
Day 7: Multi-Line Retention Plot – Youtube/Twitter/Code
Day 8:
Day 9:
Day 10:
Day 11:
Day 12:
Day 13:
Day 14:
Day 15:
Day 16:
Day 17:
Day 18:
Day 19:
Day 20:
Day 21:
Day 22:
Day 23:
Day 24:
Day 25:
Day 26:
Day 27:
Day 28:
Day 29:
Day 30: Scatter Plot Animation – Youtube/Twitter/Code [Pandas, Seaborn]
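If Day 1’s multi-line time series is where you want to start, here’s a rough sketch in Python with matplotlib. The data below is synthetic and the series names are placeholders – the point is the shape of the code, not the dataset:

```python
# Sketch of a Day 1-style multi-line time series (matplotlib, synthetic data).
# Series names and values are placeholders -- add your own flair.
import matplotlib
matplotlib.use("Agg")  # render to a file without needing a display
import matplotlib.pyplot as plt
import numpy as np

rng = np.random.default_rng(42)
months = np.arange(1, 13)
# Two made-up cumulative series so the lines trend upward
series = {
    "Series A": np.cumsum(rng.integers(5, 20, size=12)),
    "Series B": np.cumsum(rng.integers(3, 15, size=12)),
}

fig, ax = plt.subplots(figsize=(7, 4))
for label, values in series.items():
    ax.plot(months, values, marker="o", label=label)  # one line per series
ax.set_xlabel("Month")
ax.set_ylabel("Cumulative value")
ax.set_title("Multi-Line Time Series (synthetic data)")
ax.legend()
fig.tight_layout()
fig.savefig("day01_multiline.png")
```

From here, Day 2’s scatter with color and size is mostly a swap from `ax.plot` to `ax.scatter` with `c=` and `s=` arguments.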


If you have any questions or ideas about 30DaysOfDataViz (or other ideas), feel free to reach out to me on Twitter: @DataIndependent or check out DataIndependent.com