Last Friday, over one hundred people attended the Gender Equity in Technology Conference, or GETConf, at Metropolitan Community College’s Center for Advanced and Emerging Technology (CAET). The event sought to celebrate and amplify the voices of women and gender-expansive individuals in tech and featured a diverse cast of speakers, with topics ranging from designing against domestic violence to recognizing cultural bias in artificial intelligence.
Liz Dross, director of IT solutions development at Creighton University, was impressed by the quality of the speakers. She especially enjoyed Eva PenzeyMoog’s presentation on how companies must integrate anti-domestic violence safeguards into their technology products.
“I thought Eva’s talk was really an unusual one, something that we don’t normally think about,” Dross said. “I really appreciated her thoughts.”
GETConf was offered in partnership with Mystery Code Society, a 501(c)(3) nonprofit in Omaha that provides coding workshops for girls in fifth through twelfth grade. Rachel Fox, owner of Catapult Consulting Solutions and founder of the female empowerment organization You Go Girl, emceed the event.
Missed it? Silicon Prairie News has you covered. Here’s our GETConf recap:
Sally Elatta: Be Bold, Be Real, and Lead with Love
Sally Elatta, president of Agile Transformation, Inc., kicked off the day with a keynote about applying servant leadership to workplace management and product development. Companies need to anticipate the needs of customers, not just ask them what they want, Elatta said. She illustrated this concept with a humorous analogy: if Henry Ford had asked what people wanted, they would have said “faster horses.” He might have missed the chance to develop the world’s first mass-produced automobile.
Eva PenzeyMoog: Designing Against Domestic Violence
Next up, Eva PenzeyMoog, lead designer at Chicago software consultancy 8th Light, discussed the need for companies to anticipate how their technology might be weaponized for domestic abuse, and to design against such misuse. PenzeyMoog shared five real-world examples of domestic violence involving tech misuse. For instance, one man changed the settings on his smart home devices (lights, locks, etc.) to gaslight and intimidate his partner; he was also able to use Alexa as an eavesdropping device. Such misuse often arises when one partner controls access to tools that both partners need, such as banking apps and smart home technology.
Companies need to keep three things in mind when developing new tech:
- Include the reality of domestic violence in the research process, giving people a chance to answer questions anonymously.
- Imagine scenarios for domestic abuse. Then design against them.
- Identify chances for safe, meaningful interaction.
Noni Williams: Data Science Is (And Should Be) Intersectional
Noni Williams, manager of solutions and continuous improvement at United Way of the Midlands, discussed data science as an intersectional force that can help organizations understand their business problems, create buy-in for change, and build trust in the organization and its mission. To build trust, for instance, companies should be mindful of the identities and perspectives of the people analyzing the data. One person might discount or overlook a variable that someone else considers important. “You don’t know what you don’t know,” Williams said. “You don’t know what you’re missing, and you don’t know what’s relevant if you don’t have that experience.”
Amy Newell: Suffering, Authenticity, Productivity (Lessons I Learned from Bipolar Disorder)
Amy Newell, director of engineering at Wistia, spoke about creating authentic, humane workplace cultures. She used her struggles with bipolar disorder and anxiety as a springboard to talk about the general suffering that employees experience, and how organizations need to acknowledge and adjust to that suffering rather than ignore or add to it. Newell cited a Deloitte study that found 61 percent of employees are “covering” at work, or hiding their true identities. Whether that means mental illness, sexual orientation, gender identity, or any other disfavored identity, covering doesn’t change reality; it just makes everything worse.
“I don’t want demotivated engineers who are covering in the workplace,” Newell said.
Instead, companies should remove barriers to authenticity. Such barriers include:
- Thinking feelings are unimportant or inappropriate in the workplace.
- Thinking people’s struggles outside of work don’t or shouldn’t affect their work.
- Dominance structures that make people feel anxious and judged.
Newell said that tech workplaces are especially at risk of inauthenticity, often demanding more and more from programmers while simultaneously pressuring them not to admit mistakes or fatigue. Such an approach is unproductive and unsustainable.
“Do not stress people out until they break,” Newell said. “And if you can, don’t work for companies that don’t value their employees.”
Camille Eddy: Recognizing Cultural Bias in AI
Camille Eddy, robotics engineer & developer at TIMBER IT Solutions, spoke about identifying and reducing unwanted bias in machine learning. She discussed how culturally biased datasets can introduce unwanted bias into artificial intelligence. For instance, when researchers at Boston University and Microsoft Research used a Google-developed word-embedding model called Word2vec to complete the analogy “man is to computer programmer as woman is to X,” the model evaluated X as “homemaker.” Any AI that learns from this model will inherit sexist biases that humans would like to see eradicated, not reinforced.
“We must ensure that the world we live in today, with all its -isms…is not the world that artificial intelligence interprets for us tomorrow,” Eddy said.
Engineers should be aware of their own blind spots and filter bubbles and avoid building those into their algorithms. This can be addressed through what Eddy called the FRAT principle: Fairness, Responsibility, Accountability, and Transparency.
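The analogy test Eddy described can be sketched with toy vectors. The words and three-dimensional “embeddings” below are invented for illustration (real Word2vec models learn hundreds of dimensions from billions of words of text); the point is that the vector arithmetic simply surfaces whatever correlations the vectors already encode:

```python
import numpy as np

# Hypothetical toy embeddings, for illustration only.
# Dimensions loosely encode: [gender, tech-association, home-association]
vectors = {
    "man":        np.array([ 1.0, 0.0, 0.0]),
    "woman":      np.array([-1.0, 0.0, 0.0]),
    "programmer": np.array([ 1.0, 1.0, 0.0]),
    "homemaker":  np.array([-1.0, 0.0, 1.0]),
    "engineer":   np.array([ 0.9, 1.0, 0.1]),
}

def analogy(a, b, c, vocab):
    """Solve 'a is to b as c is to ?' via vector arithmetic: b - a + c,
    then pick the nearest word by cosine similarity."""
    target = vocab[b] - vocab[a] + vocab[c]

    def cos(u, v):
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

    # Exclude the query words themselves, as the standard analogy task does.
    candidates = {w: v for w, v in vocab.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cos(candidates[w], target))

print(analogy("man", "programmer", "woman", vectors))  # prints "homemaker"
```

Because the toy “homemaker” vector was built to correlate with “woman,” the analogy returns it; biased training text produces the same kind of completion at scale, which is why the FRAT principle starts with examining the data itself.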
After lunch, participants divided into different technical workshops: Research Meets Reality, by NPR product designer Katie Briggs; Fail Faster: Quick UX Design Techniques to Drive Toward Success, by Union Pacific UX advocate Ash Banaszek; and A Hands-On Introduction to Natural Language Processing in Python, by data scientist Grishma Jena. Participants also had the option of taking a guided tour of MCC’s new Center for Advanced and Emerging Technology.
Panel Discussion: Strengthening the STEM Pipeline
Following afternoon workshops, attendees reconvened to observe a panel discussion between area leaders in STEM education. Panel members included Gauri Ramesh, co-founder of Girls Code Lincoln; Victoria Novak, director of workforce and IT innovation at MCC; Eris Koleszar, co-founder of Mystery Code Society; Angela McGraw, iSTEM coordinator at the University of Nebraska at Omaha; Stefanie Monge, digital recruiter at UNO and GETConf vice chair; and Tiffany Gamble, founder of Emerging Ladies Academy.
Some highlights from the discussion included the power of offering women role models in the classroom, being culturally sensitive, dispelling misperceptions about the nature of coding, and removing barriers to success (such as malnutrition).
“You can’t teach a girl coding if she’s hungry,” Gamble said.
Jennifer Wong: Empathetic Design Systems
After the panel discussion, software engineer and writer Jennifer Wong spoke about the need for empathy in design systems (aka component libraries or style guides). Wong said that many design and UX challenges boil down to a lack of empathy. Teams should keep their end users in mind at all times, and avoid a “throwing-the-ball-over-the-wall” approach to design.
Nicole Damen: The Power of Asking Questions: How to Go from Vague Answers to Meaningful Stories
Nicole Damen, a Ph.D. candidate in the human-centered computing program at UNO, discussed the need for better survey design. For instance, surveys should never ask compound or “double-barreled” questions. Damen gave the following example: if you ask someone, “Are you going to buy an HBO subscription and watch Game of Thrones?” you have no idea which part of the question a respondent is saying yes or no to, and any answers generated would be useless.
“If their answer doesn’t make sense to you, your question didn’t make sense to them,” Damen said.
Jennifer Wadella: Hacking Your Work-Life Balance to Take Over the World
“In technology, our brains are obviously the biggest asset we have,” Wadella said. “Burnout is real. Burnout is killing us.”
The ultimate takeaway from Wadella’s keynote? Do whatever you can to build a life you don’t regularly want to escape from. Also, run errands when parents are picking up their kids from school.
Tom McCauley is digital content producer at AIM Institute and a terrible standup comedian.