Gender Gap, Gender Graph

This challenge was written by Theodora S. Tziampazi and is part of the EU CODE WEEK CHALLENGES.

Purpose of the challenge

  • Understand how data visualization can influence perception.
  • Identify biases in digital tools through interaction.
  • Experiment with data input to observe distortions.
  • Modify code to ensure accurate data representation.
  • Compare fair and biased data visualizations.
  • Reflect on the ethical implications of data manipulation.
  • Discuss real-world consequences of biased statistics.
  • Develop critical thinking about AI and algorithmic bias.

Duration

2 hours

Experience

Intermediate - Some basic coding knowledge is recommended; participants should be familiar with fundamental programming concepts.

Advanced - Designed for participants with strong coding skills and prior experience in programming.

Target Audience

  • Primary School students (6 to 12 years)
  • Lower Secondary School students (12 to 16 years)
  • Upper Secondary School students (16 to 18 years)
  • Teachers and educators

Description of the challenge

Investigate bias in data visualization by inputting values, analyzing distortions, modifying code, and exploring how digital tools influence perceptions of gender representation in tech.

Instructions

You are the user of this digital tool (a bar maker): https://scratch.mit.edu/projects/1147892829 . Do not look inside at the code yet. Click the green flag and insert data (1-10) that hypothetically represent the number of women in a tech sector. Try several numbers.

  • What do you notice?
  • Could this be a bug or a decision?
  • Either way, how can it be fixed?

Manually (user level):

Explore the tool and any draggable sprites.

  • Is there any position where the problem is solved?
  • Is there a position where another unexpected outcome is observed?

Food for thought/Discussion:
  • What if the number of women in a sector is underestimated?
  • What if the number of women in a sector is overestimated?
  • What if the number of women in a sector is estimated accurately, but the sector itself is not gender-balanced?
  • What should be done?

Takeaway: How we use the tool (where we situate a component) affects the result.

With coding (creator level):

Now, it’s time to see inside the project.

Simple challenge:

Hack the code so that the data presented are equal to the data inserted, in every case.

Advanced challenge:

Copy the bar sprite and make it blue (a color stereotypically associated with men). Change its y position so that it is visible and comparable with the purple bar. Create a male symbol as a new sprite.

Tip:

You can download/export both sprites (bar, symbol) from the solution project linked at the end of this section.

Now hack the code so that, when the user inputs a value x (now entered twice, once for each bar), the two bar makers are:

  • Fair (same code): adjust the male symbol’s y position in the if condition; debug as you proceed, checking both scripts.
    • Unfair in use: drag the two symbols to different positions, then give the same input to see the inequality.
  • Sided/biased (different code):
    • in favor of men
    • in favor of women

Experiment so that the code either mitigates or exacerbates the gender gap…

Explore the two ways the result is biased (dragging and/or coding).

How could these two ways be related to each other?

A solution (same code): https://scratch.mit.edu/projects/1151892036

Discussion:
  1. How does it feel to know that you can create both a biased and an unbiased digital tool?
  2. Where is the bias more “hidden”: in dragging a sprite to a different position, or in the code?
  3. Can you imagine cases where a distorted picture of something might benefit certain people? Who?
  4. Do you believe we may present distorted data when the end justifies the means, or should we present the truth no matter what?
  5. How does it feel to know that an AI (driven by a given purpose) can try out manipulations like the ones you tried in this challenge?
More questions:
  • How do we collect data?
  • How is our data processed?
  • How are our actions/beliefs affected by statistics hidden in algorithms (AI or not)?

Takeaway: How we create the tool affects the result or even our worldview.

Share the link to your work in your Instagram Bio: tap the Edit Profile button on your Instagram and add the link to your work in the Website field. After that, create a new post, add a screenshot of your work, write “Link in Bio”, add the hashtag #EUCodeWeekChallenge and mention @CodeWeekEU.

Examples

  • Workplace Diversity Reports – Ensuring accurate representation of gender data in corporate diversity statistics.
  • Media & News Graphics – Avoiding misleading visualizations in reports on gender equality.
  • AI & Algorithm Bias – Identifying and mitigating biases in machine learning models processing demographic data.
  • Hiring & Recruitment Tools – Ensuring fair representation in HR analytics and decision-making software.
  • STEM Education & Outreach – Using unbiased data to encourage more women in tech fields.
  • Public Policy & Advocacy – Supporting fair policy decisions with accurate gender gap statistics.
  • Social Media & Awareness Campaigns – Creating fair visual representations of gender data to drive change.