Sunday, February 26, 2017

What CS Departments Do Matters: Diversity and Enrolment Booms

I've written before about the historical factors that have led to the decline in the percentage of women in CS. The two enrolment booms of the past (in the late-80s and the dot-com era) both had large impacts on decreasing diversity in CS. During enrolment booms, CS departments favoured gatekeeping policies which cut off many "non-traditional" students; these policies also fostered a toxic, competitive learning environment for minority students.

We're in an enrolment boom right now, so I --- along with many others --- have been concerned that it will have a similarly negative effect on diversity.

Last year I surveyed 78 CS profs and admins about what their departments were doing about the enrolment boom. We found that it was rare for CS departments to be considering diversity in the process of making policies to manage the enrolment boom.

Furthermore, in a phenomenographic analysis of the open-ended responses, I found that increased class sizes have led many professors to feel that their teaching is less effective and that student culture is being harmed (this hasn't been published yet --- but hopefully soon!).

Around the same time I put out my survey, CRA put out a survey of their own on the enrolment boom. Their report has just come out; they have also found that few CS departments are considering diversity in their policy making --- and that the departments who have been considering diversity have better student diversity.

From CRA's report:

The Relationships Between Unit Actions and Diversity Growth


The CRA Enrollment Survey included several questions about the actions that units were taking in response to the surge. In this section, we highlight a few statistically significant correlations that relate growth in female and URM students to unit responses (actually, a composite of several different responses).

1.    Units that explicitly chose actions to assist with diversity goals have a higher percentage of female and URM students. We observed significant positive correlations between units that chose actions to assist with diversity goals and the percentage of female majors in the unit for doctoral-granting units (per Taulbee 2015, r=.19, n=113, p<.05), and with the percent of women in the intro majors course at non-doctoral granting units (r=.43, n=22, p<.05). A similar correlation was found for URM students. Non-MSI doctoral-granting units showed a statistically significant correlation between units that chose actions to assist with diversity goals and the increase in the percentage of URM students from 2010 to 2015 in the intro for majors course (r=.47, n=36, p<.001) and mid-level course (r=.37, n=38, p<.05). Of course, units choosing actions to assist with diversity goals are probably making many other decisions with diversity goals in mind. Improved diversity does not come from a single action but from a series of them.

2.    Units with an increase in minors have an increase in the percentage of female students in mid- and upper-level courses. We observed a positive correlation between female percentages in the mid- and upper-level course data and doctoral-granting units that have seen an increase in minors (mid-level course r=.35, n=51, p<.01; upper-level course r=.30, n=52, p<.05). We saw no statistically significant correlation with the increased number of minors in the URM student enrollment data. The CRA Enrollment Survey did not collect diversity information about minors. Thus, it is not possible to look more deeply into this finding from the collected data. Perhaps more women are minoring in computer science, which would then positively impact the percentage of women in mid- and upper-level courses. However, units that reported an increase in minors also have a higher percentage of women majors per Taulbee enrollment data (r=.31, n=95, p<.01). Thus, we can’t be sure of the relative contribution of women minors and majors to an increased percentage of women overall in the mid- and upper-level courses. In short, more research is needed to understand this finding.

3.    Very few units specifically chose or rejected actions due to diversity. While many units (46.5%) stated they consider diversity impacts when choosing actions, very few (14.9%) chose actions to reduce impact on diversity and even fewer (11.4%) decided against possible actions out of concern for diversity. In addition, only one-third of units believe their existing diversity initiatives will compensate for any concerns with increasing enrollments, and only one-fifth of units are monitoring for diversity effects at transition points.

From a researcher's perspective, this makes me happy: we used very different sampling approaches (they surveyed administrators; I surveyed professors in CS ed online communities), we used different analytical approaches (their quantitative vs. my qualitative), and we came to the same conclusion: CS departments aren't considering diversity. This sort of triangulation doesn't happen every day in the CS ed world.

CRA's report gives us further evidence that CS departments should be considering diversity in how they decide to handle enrolment booms (and admissions/undergrad policies in general). If diversity isn't on policymakers' radars, it won't be factored into the decisions they make.

Saturday, February 25, 2017

Computer Science for Future Leaders

There's a great physics course out there called Physics for Future Presidents. For some time I've been mulling over what a Computer Science for Future Presidents (and Prime Ministers) would look like.

Last week I taught an introduction to online safety to a group of political activists (experience report here). Along the way I taught a lot of introductory computer science and saw opportunities to cover even more.

I've taught a number of introductory CS classes that are introductions to programming. Like a lot of computer scientists I appreciate coding as an important tool in CS, but don't like how so many students walk out of their first (and potentially only) CS class with the idea that CS == programming. Computational thinking classes make for a good step away from this misconception but still don't cover all the things I'd want future world leaders to know.

The internet and cybersecurity make a great way to introduce computing --- and to cover what future world leaders need to know about computer science.

This is what I'd cover in a 12 week course. This course would complement an introduction to programming and the two could be taken concurrently.


Computer Science for Future Leaders

  1. Introduction to the course. Searching and sorting, and big O notation. I'd introduce binary and linear search, and insertion, selection, and merge sorts. Motivate searching/sorting as necessary for internet computing (indeed, an estimated 25% of the world's CPU time is spent on sorting tasks). Quick review of logarithms.
  2. Symmetric key encryption. How to encrypt, some approaches for breaking encryption (build on searching/sorting from last week). Big-O of encryption/decryption algorithms.
  3. Graph theory. Define edges/vertices. How to find a shortest path over a network, minimum spanning trees. Talk about costs on networks, congestion, resilience/redundancy. Talk about where you'd want to eavesdrop on a network for maximum coverage. Big-O of relevant graph algorithms.
  4. Early communication networks. Talk about how telegraphs worked, how data was encoded. Talk about pre-wireless phone networks and how that data is encoded. Introduce some coding theory: error detection and correction over networks.
  5. What is a file? Character encoding, numerical representation, file encodings. Code lives in files too: HTML as example. What is a file system?
  6. Midterm. What is a computer? Early computers; command-line interfaces.
  7. Pre-internet computer networks. Talk about packets, packet routing, packet switching. How routers work.
  8. Internetworking: how we can connect networks together. Internet infrastructure (ISPs, IXPs, etc), TCP/IP, DNS. Who governs the various components of the internet (ICANN, RIRs, etc).
  9. Asymmetric key cryptography. Why it was necessary for the internet to grow in popularity. Diffie-Hellman, RSA, PGP. P and NP.
  10. Secure internetworking. SSL, HTTPS, Tor, VPNs, etc. Cookies. How internet surveillance and censorship work. Cyberwarfare. Dangers of online/computerized voting.
  11. Social networking. How social network websites work. What is their business model? AI and machine learning on the internet, filter bubbles and other biases resulting from machine learning.
  12. HCI of the internet. Usability issues on the internet. HCI approach to security: who is in your personal network and how can you stay safe?
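To make week 1 concrete, here's a minimal sketch (hypothetical code, not part of any existing course materials) that contrasts linear and binary search by counting comparisons --- one way to make the difference between O(n) and O(log n) tangible before defining big O formally:

```python
# Week-1 sketch: linear vs binary search, instrumented to count comparisons.

def linear_search(items, target):
    """Scan left to right: O(n) comparisons in the worst case."""
    comparisons = 0
    for i, item in enumerate(items):
        comparisons += 1
        if item == target:
            return i, comparisons
    return -1, comparisons

def binary_search(items, target):
    """Repeatedly halve a sorted list: O(log n) comparisons."""
    lo, hi = 0, len(items) - 1
    comparisons = 0
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1
        if items[mid] == target:
            return mid, comparisons
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons

data = list(range(1_000_000))  # already sorted
_, lin = linear_search(data, 999_999)   # exactly 1,000,000 comparisons
_, bin_ = binary_search(data, 999_999)  # about 20 comparisons (log2 of a million)
print(lin, bin_)
```

Plotting those two comparison counts against growing input sizes motivates the logarithm review at the end of week 1.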

The whole course covers a lot of computer science: algorithms, theory of computation, systems, networking, crypto, security, HCI, AI. You could add in a bit on databases if you wanted, too.
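As a flavour of how week 2 would build on week 1, here's a minimal sketch (hypothetical code, assuming a Python-based companion programming course) of a Caesar cipher together with a brute-force attack that simply searches all 26 keys:

```python
# Week-2 sketch: symmetric encryption (a Caesar cipher) and a brute-force
# attack that searches the whole key space, tying back to week 1's searching.

import string

ALPHABET = string.ascii_lowercase

def caesar_encrypt(plaintext, key):
    """Shift each lowercase letter forward by `key` positions, wrapping around."""
    out = []
    for ch in plaintext:
        if ch in ALPHABET:
            out.append(ALPHABET[(ALPHABET.index(ch) + key) % 26])
        else:
            out.append(ch)  # leave spaces and punctuation alone
    return "".join(out)

def caesar_crack(ciphertext, known_word):
    """Try all 26 keys; return the one whose decryption contains a word
    we expect to see (a crude 'crib')."""
    for key in range(26):
        candidate = caesar_encrypt(ciphertext, -key % 26)  # shift back by key
        if known_word in candidate:
            return key, candidate
    return None, None

secret = caesar_encrypt("attack at dawn", 3)
key, recovered = caesar_crack(secret, "dawn")
print(secret)          # "dwwdfn dw gdzq"
print(key, recovered)  # 3 "attack at dawn"
```

A 26-key search is instant, which sets up the week-2 discussion of why key spaces need to be astronomically large for encryption to resist brute force.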

Some big advantages of this approach to introducing computer science are:
  • Students get a more accurate feel for what computer science is and what it's about than in an introductory programming course.
  • Students see computer science as a human endeavour. Its history is exposed, as well as the motivations for the major stages in its development.
  • Similarly, students see how CS is not value neutral. We discuss topics like neocolonialism in technology development, the role of the military in advancing computer science, how the internet is governed, and how the internet affects politics.
  • Students learn things about computer security and the internet that are useful in their daily lives, in a way that empowers them.
  • It can improve the state of our democracy: we need leaders and community members who understand these issues in order to make informed decisions.

Introducing Computer Science via Online Security: An Experience Report

Last weekend I spent two hours teaching an informal introduction to online security to an audience of political activists. I wound up teaching a fair bit of computer science in the process and I'm writing up this experience report because I think it's a valuable way to teach introductory computer science.

Before I put together my lesson plan I spent a fair bit of time looking at other people's introductions. Broadly, they fell into two categories:
1. Introductions for CS students, which would include things like how to write your own HTTPS server or proofs about why RSA works (too advanced for my audience)
2. Instructions for what software you should download to stay secure.

I'm a member of the political organization from which my audience came. People regularly post articles which fall into category 2 on the group's online community. And unsurprisingly, these articles have had limited effect on getting people to change their behaviour. This was why I'd volunteered to teach the workshop. I'd initially planned it to be all about the software to install to stay safe.

As I put together my lesson plan, I changed my mind about the goal of the workshop. In my experience teaching introductory programming, students struggle for the first few weeks because they don't understand why they should be learning this or what it gets them. I started to think something similar might be going on here: a typical article telling you to install Signal and HTTPS Everywhere doesn't sufficiently motivate why it's necessary or explain what's going on technically.

Computer scientists like myself think of the internet in a very different way than my activist friends. My activist friends see the internet as a mystical black box.

My learning goal for the workshop hence became: to demystify the internet.

Thursday, June 2, 2016

"Helping" women in CS with impostor syndrome is missing the forest for the trees

Alexis Hancock recently wrote an article on impostor syndrome that has been on my mind ever since, as it adds so nicely to a blog post I wrote several months ago. I wanted to try and explain why so many women have impostor syndrome in CS:
Sociologists like to use performance as a metaphor for everyday life. Erving Goffman in particular championed the metaphor, bringing to light how our social interactions take place on various stages according to various scripts. And when people don't follow the right script on the right stage, social punishment ensues (e.g. stigma).  [...]

Since not following the script/game is costly for individuals, we're trained from a young age to be on the lookout for cues about what stage/arena we're on and what role we should be playing. [...]

Impostor syndrome is the sense that you're the wrong person to be playing the role you're in. You're acting a role that you've been trained in and hired for -- but your brain is picking up on cues that signal that you're not right for the role.

When [people] go on to play roles [they haven't been raised for], they still sometimes encounter social cues indicating they're in the wrong role. Impostor syndrome results.

Impostor syndrome is thought to be quite common amongst women in science. In this light I don't think it's surprising: there are so many cues in society that we are not what a 'scientist' is supposed to look or act like. We don't fit the stereotypes.

I'm far from the first person to argue that impostor syndrome comes from environmental cues. What Hancock's article does is point out the contradiction: impostor syndrome has environmental causes, but is talked about as being an individual's personal problem.

[While struggling with impostor syndrome] I became consumed with proving myself. Still, all the advice I received came in the form of a pep talk to “believe in myself” again. This common response to the struggles of women in tech reinforces the idea that imposter syndrome is the ONLY lens to view and cope… but the truth is, our negative experiences in tech are usually outside of our control. The overwhelming focus on imposter syndrome doesn’t provide a space to process the power dynamics affecting you; you get gaslighted into thinking it’s you causing all the problems.

Similarly, Cate Huston writes that:
Yet imposter syndrome is treated as a personal problem to be overcome, a distortion in processing rather than a realistic reflection of the hostility, discrimination, and stereotyping that pervades tech culture. [...] Assuming that it’s just irrational self-doubt denies potentially useful support or training. Most of all, chalking up myriad factors to such an umbrella term belies the need to explore where these concerns arise from and how they can be addressed or mitigated. Subtle or not-so-subtle undermining behavior by colleagues? Gendered feedback? Lack of support or mentorship? [...] We pretend imposter syndrome is some kind of personal failing of marginalized groups, rather than an inevitability and a reflection of a broken and discriminatory tech culture.

So many well-intentioned diversity efforts in computer science focus on impostor syndrome and try to help women cope with it. But that discourse treats the women who have impostor syndrome as though they have an individual problem. The effect can silence women: instead of seeing their negative environment as a structural issue, they blame themselves.

Those of us who want to get more women into CS need to stop telling women that they suffer from impostor syndrome and instead help them see the environment they're in. The social cues that are affecting them need to be identified and mitigated. And we need to stop teaching women to blame themselves for the sexism around them.

Monday, March 21, 2016

A Seven-Step Primer on Soft Systems Methodology

I'm currently TAing for CSC2720H Systems Thinking for Global Problems, a graduate-level course on systems thinking. In class today we talked about soft systems methodology (SSM), an approach which uses systems thinking to tackle what are called "wicked problems". I thought I'd outline one approach to SSM, as it's useful to CS education research.


Step 1: Identify the domain of interest

Before you can research something, you should first decide what your domain is. What topic? What system are you studying? For example, "teaching computer science" could be your starting point, as could "climate change".

Chances are you're looking at a wicked problem. Conklin defines wicked problems by six properties:
  1. The problem is not understood until after the formulation of a solution.
  2. Wicked problems have no stopping rule.
  3. Solutions to wicked problems are not right or wrong.
  4. Every wicked problem is essentially novel and unique.
  5. Every solution to a wicked problem is a 'one shot operation.'
  6. Wicked problems have no given alternative solutions.
Because you're looking at a domain which doesn't have a clear definition or boundaries, you'll first want to immerse yourself in the domain. One trick is to draw "rich pictures", which are essentially visualized streams of consciousness.

You should also think about what perspectives you bring into this domain. What biases and privileges do you have going into this? Why are you interested in this domain? What do you have to gain or lose here?

Monday, March 14, 2016

"'Women in Computing' As Problematic": A Summary

I've long been interested in why, despite so much organized effort, the percentage of women in CS has remained so stagnant. One hypothesis I had for some time was that the efforts themselves were unintentionally counter-productive: that they reinforced the gender subtyping of "female computer scientist" as separate from unmarked "computer scientists".

I was excited earlier this week when Siobhan Stevenson alerted me to this unpublished thesis from OISE: "Women in Computing as Problematic" by Susan Michele Sturman (2009).

In 2005-6, Sturman conducted an institutional ethnography of the graduate CS programmes at two research-intensive universities in Ontario. In institutional ethnography, one starts by "reading up": identifying those who have the least power and interviewing them about their everyday experiences. From what the interviews reveal, the researcher then goes on to interview those identified as having power over the initial participants.

Interested in studying graduate-level computer science education, she started with female graduate students. This led her to the women in computing lunches and events, and then to interviews with faculty members and administrators at the two universities. She also attended the Grace Hopper Celebration of Women in Computing (GHC) and analysed the texts and experiences she had there. Her goal was to understand the "women in computing" culture.

In the style of science studies scholars like Bruno Latour, Sturman comes to the organized women in computing culture as an outsider. As a social scientist, she sees things differently: "Women in the field wonder what it is about women and women's lives that keeps them from doing science, and feminists ask what it is about science that leads to social exclusion for women and other marginalized groups".

Friday, February 19, 2016

Getting Fedora 23 working on an Asus Zenbook UX305CA (Intel Skylake)

I recently acquired a shiny new Asus Zenbook UX305CA to replace my old UX32A which had been dying a slow death for the past year.

Excitedly, I put the latest Fedora release (23) on the computer, using the Cinnamon spin. While the computer ran Fedora, the screen resolution was set at 800x600 with no other options.

The issue? The Intel Skylake chip in the computer wasn't supported by the kernel that Fedora 23 ships with (kernel version 4.2). Like many Linux users with new laptops, I've found myself in a bit of an adventure with the new Skylake chip. I thought I'd write up how I eventually got Fedora 23 working on this computer for the sake of those following the same path.

To get Linux working with kernel 4.2, I found the Arch Wiki invaluable:
  • I needed the kernel boot argument: i915.preliminary_hw_support=1
  • And then you set xorg.conf as described in the Arch Wiki
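For anyone following along on Fedora, the boot argument can be added persistently with Fedora's grubby tool (a sketch of the idea; kernel selection and exact output vary by install):

```shell
# Append the Skylake workaround to the boot arguments of all installed kernels
sudo grubby --update-kernel=ALL --args="i915.preliminary_hw_support=1"

# Check that the argument now appears for the default kernel
sudo grubby --info=DEFAULT
```

You can achieve the same thing by editing the kernel line in /etc/default/grub and regenerating the GRUB config, but grubby saves a step.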

Once both of those were done my computer was working, but without hardware acceleration. The next step was to install kernel 4.4, which supports Skylake.
  • You'll want to add the repository where Fedora keeps the latest kernel versions: I found 4.4 in kernel-vanilla-stable (see instructions here)
  • Then, when I tried booting with kernel-4.4, I got an error at boot: "double free at 0x(address) Aborted. Press any key to exit". To get rid of the error, I found I had to temporarily disable the validation steps of the new kernel as described in comment 18 on the bugzilla report
  • The mokutil utility will ask you to set a password for altering Secure Boot. Write it down. When you reboot, it will ask for the password character by character, with the character positions requested in a random order. I wound up failing this the first time because I assumed the positions were 0-indexed; they're actually 1-indexed.
  • Once validation was disabled, I could successfully boot kernel-4.4! But Cinnamon informed me that software rendering was still on. To solve this, I had to undo what I'd done to make kernel-4.2 work: take out the i915.preliminary_hw_support=1 and set xorg.conf to what is recommended for Intel graphics in general rather than the Skylake bandaid (you just take out the options line).

Once all that was done, the computer's working quite nicely!