I. DATABYTES
PERCENTAGE OF WOMEN EARNING GRADUATE DEGREES STAYS FLAT
From 2011 to 2015, more women graduated with a master's or doctoral degree in an engineering or engineering technology discipline. However, women as a percentage of total degrees awarded did not change significantly during this period.
Table 1 shows the total number of women earning a graduate degree in an engineering or engineering technology discipline. From 2011 to 2015, there was a 27 percent increase in master's degrees awarded to women and a 23 percent increase in doctorates awarded to women.
Table 2 shows graduate engineering and engineering technology degrees awarded to women as a share of all graduate degrees awarded in those fields. For master's degrees, women's share rose slightly, from 23 percent in 2011 to 25 percent in 2015. For doctoral degrees, there was only a one-percentage-point increase over the same period. Although the number of women earning a graduate degree in engineering or engineering technology has increased, so has the number of men.
Table 1: Total Women Engineering and Engineering Technology Degrees by Level, 2011-2015
Table 2: Total Women Engineering and Engineering Technology Degrees as a Percentage of Total Engineering and Engineering Technology Degrees, 2011-2015
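The arithmetic behind these figures is a pair of one-line formulas: percentage change between two years, and one group's share of a total. A minimal sketch follows; the counts used are hypothetical placeholders chosen only to reproduce the stated 27 percent growth and 25 percent share, not the actual figures from Tables 1 and 2:

```python
def percent_change(old: float, new: float) -> float:
    """Percentage change from an earlier count to a later one."""
    return 100.0 * (new - old) / old

def share(part: float, total: float) -> float:
    """One group's share of a total, expressed as a percentage."""
    return 100.0 * part / total

# Hypothetical counts for illustration only, not the survey's actual data:
women_masters_2011, women_masters_2015 = 10_000, 12_700
all_masters_2015 = 50_800

print(percent_change(women_masters_2011, women_masters_2015))  # 27.0
print(share(women_masters_2015, all_masters_2015))             # 25.0
```

The distinction the text draws matters here: the raw count of women (the `percent_change` input) can grow while the share (`share`) stays nearly flat, as long as the total grows at a similar rate.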
LSU College of Engineering’s ‘Code IT Up Challenge’ Rewards Students for Learning Code
Can you hack it? That's the question LSU's College of Engineering is posing to high school students nationwide with the "Code IT Up Challenge." The digital capture-the-flag-style game launched Oct. 19, and registered players will have until Nov. 30 to compete for a one-time scholarship to study computer science at LSU.
The Challenge, created by a group of university alumni, invites players to use hints to code their way through various levels by collecting encrypted flags. As players master each level, they earn badges that can be shared on social media, and they can track their progress on the game’s leaderboard. The top three winners are given the cash prize.
“Gone are the days when a paper brochure was the best method for universities to recruit high school students,” said Recruiting Manager Andrew Osborn. “We had to think creatively about how we were reaching our prospective students and how the LSU College of Engineering could stand out among its competition.”
But the game goes beyond recruiting and entertainment, said Joshua Duke, who won the Challenge in the spring. It’s also educational.
“The best thing about taking part in the ‘Code IT Up Challenge’ was using the skills I had learned from my computer science classes,” he said, “and applying them to an interesting and unique challenge that tested my skills.”
Duke, who is now a computer science freshman, was among nearly 200 students from 22 states who participated in the first Challenge. And with exciting new levels, the College predicts it will see even more success this fall.
“Jump in and test yourself even if you have no coding experience,” Duke advised. Students of all skill levels—from computer novice to expert hacker—are encouraged to play. “You never know what you will learn—and you may even earn some scholarship money.”
The “Code IT Up” initiative is part of a partnership forged in 2013 by LSU, Louisiana Economic Development and IBM to increase interest in computer science by expanding recruiting efforts, course offerings, faculty hiring, and internship and job opportunities.
III. POLITICAL HOTLINE
A STUNNING LOSS FOR THE NERDS AND POLLSTERS
Hillary Clinton wasn't the only person to suffer an unexpected defeat on Election Day. It proved to be a rough outing for quants and pollsters as well. Stats mavens like Princeton's Sam Wang, FiveThirtyEight's Nate Silver, and the New York Times's Nate Cohn were among the many poll aggregators who erroneously forecast a rather easy victory for Clinton, putting the probability of a Clinton presidency at anywhere from 66 to 99 percent. What went wrong? Columbia University statistician Andrew Gelman, writing in Slate, says the probabilities were high because Clinton led in the polls for months. On a national level, averaged forecasts gave her a lead of between 3.3 and 4.6 percentage points. And in the end she did win the popular vote, by 1.7 million votes according to a recent count. So while the national polls were off, her winning total was within the margin of error.

But unfortunately for Clinton, we elect presidents using the Electoral College, based on state victories, not the overall popular vote. And some state polls were well off the mark, particularly those in Wisconsin, Michigan, and Pennsylvania, three Rust Belt states Clinton was expected to win and that, had she done so, would have given her the keys to the Oval Office. Polls showed her with comfortable leads in all three. But she lost them by paper-thin margins: a total of just 112,158 voters in the three states gave Donald Trump the winning edge. In the end, Trump was more successful than expected at winning the votes of white, working-class, rural voters in those states and getting them to the polls. And while Clinton did well with women, minorities, and urban voters, they didn't turn out in numbers high enough to offset Trump's gains with those white voters.

How did the polls miss that? That's not clear yet. Gelman suggests that 11th-hour decision-making may have played a big role. Exit polls indicate that Trump got 48 percent of last-minute deciders to Clinton's 41 percent. Whatever the reason for the failures, CNN anchor Jake Tapper predicts they'll put the polling and forecasting industry "out of business." That's unlikely, because the human desire to know what's in our future is innate. One polling executive, Jon Cohen, tells the Washington Post that the industry will certainly undergo a root-and-branch examination of what went wrong, with a focus on improving the modeling of likely voters. Says Cohen: "At the end of the day, tens of thousands, hundreds of thousands, or millions of people tagged as likely to vote didn't bother to do so; and perhaps some deemed unlikely to get to the polls did vote."
RESEARCHERS WARY OF INCOMING TRUMP ADMINISTRATION
President-elect Donald Trump, who has called climate change a hoax perpetrated by the Chinese and has promised to "cancel" the international climate accord signed in Paris a year ago, has many leading scientists fearing the worst when it comes to future research funding. Michael Lubell, a physics professor at the City University of New York and director of public affairs at the American Physical Society, told Nature that Trump will be "the first anti-science president" in American history and that the consequences of his presidency may be "very, very severe" for the science community.

Some of the names being bandied about by Trump's campaign for key cabinet posts won't assuage those worries. He's considering Myron Ebell, a leading climate skeptic, to head the Environmental Protection Agency; indeed, Trump has indicated that shuttering the EPA, or drastically curtailing its authority, is in the cards. Oilman Harold Hamm, a personal friend of Trump, is considered the top candidate to head the Department of Energy; Hamm has said the federal government has done all it could in the past to stymie domestic oil and gas production. And the leading contender to head the Department of Homeland Security, which funds a fair amount of technology research, is Milwaukee County Sheriff David Clarke, who has claimed that the protest movement Black Lives Matter has links to the terrorist group ISIS.

Wired magazine notes that Trump has said little about his science policies, though he's spoken positively about the space program, research that boosts industrial innovation, and "the advancement of science and engineering in a number of fields." But he's also talked about cutting science funding to help pay for tax cuts.
IV. TEACHING TOOLBOX
Gotta Catch 'Em All?
A global gaming sensation holds lessons for engineering educators that go well beyond technology fads.
By Aditya Johri
If you have paid any attention to the news lately, you have likely come across the phenomenon of Pokémon Go, a new game in which real people catch digital monsters in real places. Even I, an information technology researcher, found the premise hard to conceptualize at first, yet that's exactly what occurs.
Pokémon Go takes common smartphone technologies, such as GPS, mapping, and satellite services, and combines them with location services, local landmarks, and Nintendo’s familiar characters to provide players an engaging augmented reality experience. I heard a commentator on NPR compare it to the Macarena, a dance craze that came and went very quickly. Another expert likened it to the ephemeral Ice Bucket Challenge.
Only time will tell whether Pokémon Go is so transient a spectacle. Even so, the game has conclusively demonstrated the power of digital technology to capture imaginations and create an engaging platform from little pieces of information that come together in a digital-physical ecosystem. And this is a development that deserves serious attention from engineers and engineering educators.

We live in an era of profound change. Engineering practices and products are permeated with a digital layer, and this digitization (whether of applications, data, analysis tools, or collaborative media) is fundamentally changing the profession. Digitization is also shaking the very core of engineering identity, demonstrating the opportunities for making a positive change but also alerting us to the limitations we face or have placed on ourselves.
We now have an opportunity, and a duty, to reflect upon and review the world of engineers and engineering through a digital lens and ask ourselves: What are the possibilities, what is achievable, where would we like to go, and how can we get there? My goal in this column is to reflect continually on how the digitization of artifacts, data, interactions, and practices has the potential to shape what we accomplish as engineers and engineering educators.
What can we learn from Pokémon Go? For starters, the game is the first successful, large-scale amalgamation of the digital and physical worlds, a working showcase of the "augmented reality" we've been talking about for decades. Imagine what this ability to superimpose a digital layer can do for engineering: point your phone at a bridge and, presto, you receive all the associated information about its structure and history, while at the same time you can add, create, enhance, and contribute to that body of knowledge. Second, Pokémon Go invites anywhere, anytime interactions with others who may be very different from you and know a lot more (or less) about a subject. This interconnectedness not only makes an activity socially relevant but also allows for knowledge generation and learning. If you look at that bridge and don't find the information you need, you can share your experience and let someone else guide you.
For engineering educators, Pokémon Go's singular strength is that it represents the collaboration of engineers (at Google) and artists (from Nintendo), working together to build on so much that has come before, from the characters to the data feeding the local PokéStops. It is truly an exemplar of the interdisciplinarity that engineering purports to thrive on but that is hard to see in action. The digital layer provides a happy medium for ideas and products to gel. Moreover, the game demonstrates the advantage of understanding your users: if they are connected and comfortable with a "good enough" experience, they will engage with a game that offers less than full immersion, as long as it fits easily with their everyday practices.
Pokémon Go’s novelty doesn’t lie in technology—that’s been available for a while—but in its melding of old and new to create a hybrid that didn’t exist before. Isn’t that the essence of learning? Differential equations may never achieve the cachet of capturing a lazy-looking Snorlax, but think of the deeper understanding students might acquire pursuing knowledge in an augmented-reality classroom. Where we go from here is up to us.
Aditya Johri is an associate professor in the department of information sciences and technology at George Mason University's Volgenau School of Engineering and co-editor of the Cambridge Handbook of Engineering Education Research.
V. JEE SELECTS
Right Side of the Law
A nontraditional thermodynamics assignment offers a model for teaching engineering fundamentals.
By Peggy N. Van Meter, Carla M. Firetto, Stephen R. Turns, Thomas A. Litzinger, Chelsea E. Cameron and Charlyn W. Shaw
Improving students' knowledge of foundational principles is a major concern of engineering educators. These principles are often complex, involving multiple interconnected knowledge elements. One particular challenge arises in introductory thermodynamics, where many students struggle to understand the first law, or conservation of energy principle.
Our study shows that students’ knowledge of the first law can be improved by an educational intervention designed as a homework assignment. The goal of the intervention is to improve conceptual reasoning ability by helping students organize their knowledge and generate connections across associated knowledge elements. Although we focused on learning in thermodynamics, the design principles underlying the intervention can be applied to other engineering-related content that requires understanding of the organizational structure across knowledge elements and how connections should be drawn across that structure.
We named our intervention "Organization-Elaboration-Monitoring in Thermodynamics" (OEM-Thermo) because it was designed to stimulate the cognitive operations involved in meaningful learning: organization, elaboration, and monitoring. Organization directs students to determine the structural relations among knowledge elements. Elaboration engages students in generating connections across new knowledge elements and between new and prior knowledge. Monitoring occurs when students consider the accuracy of their developing knowledge. These cognitive operations can be stimulated by assigned tasks that direct students to think in particular ways.
OEM-Thermo uses three exercises to stimulate these cognitive operations. Exercises 1 and 2 ask students to complete a set of matrix notes. The rows of the matrix list three processes, such as constant-pressure expansion. Ten columns are labeled with prompts for supporting concepts: for example, relate P to V using the ideal-gas equation of state (EOS) and/or process relations; sketch the P-V plot; and determine whether work is in, out, or zero. Students organize knowledge into this structure by completing the matrix cell for each row-by-column combination. Exercise 3 stimulates elaboration and monitoring through a set of questions. Students compare cell entries across rows and down columns of the matrix to answer these questions; for example, they compare the magnitude of the work for each of the three processes.
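The matrix-notes structure described above is easy to picture as a small table keyed by (process, prompt) pairs. The sketch below uses illustrative process and prompt names rather than the study's exact wording, and shows only three of the ten columns for brevity:

```python
# Rows of the matrix: thermodynamic processes (names are illustrative).
processes = [
    "constant-pressure expansion",
    "constant-volume heating",
    "isothermal compression",
]

# Columns: three of the ten supporting-concept prompts, paraphrased.
prompts = [
    "relate P to V (ideal-gas EOS / process relation)",
    "sketch the P-V plot",
    "work in, out, or zero?",
]

# Exercises 1 and 2: students fill one cell per row-by-column combination.
matrix = {(proc, prm): None for proc in processes for prm in prompts}

# Exercise 3: compare entries down a column, e.g. the work for each process.
work_column = [matrix[(proc, "work in, out, or zero?")] for proc in processes]

print(len(matrix))       # 9 cells in this reduced 3x3 example (full matrix: 3x10)
print(len(work_column))  # one entry per process
```

The point of the structure is that the row-by-column layout forces every process to be examined against every prompt, and the column-wise comparisons in Exercise 3 are what prompt elaboration across processes.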
In addition to these exercises, OEM-Thermo uses a cognitive model to support students' completion of Exercises 1 and 2. A cognitive model is a person, usually an expert, who verbalizes the thoughts he has while thinking through a task, making visible the ways of thinking about the content. In OEM-Thermo, the course instructor served as the cognitive model. Students watched videos that showed the model's written work as he talked through how to think about the cells of the matrix. The cognitive model completed part of the matrix, and students completed the rest independently.
We tested OEM-Thermo with 90 students from two introductory thermodynamics courses. Half of the students completed OEM-Thermo as a homework assignment; half completed traditional textbook-style problems covering the same content. All students completed the Thermodynamics Conceptual Reasoning Inventory (TCRI) both before and after the assigned homework. The TCRI measures conceptual reasoning ability with fixed-mass, ideal-gas systems undergoing quasi-static (i.e., quasi-equilibrium) processes. Statistical analyses showed that students who completed OEM-Thermo improved significantly more on the TCRI than students who completed traditional homework problems: on average, OEM-Thermo students' scores increased 20 percent between pre- and post-test, while the average increase was only 3 percent for students who completed traditional problems.
This test of OEM-Thermo shows that a homework activity that stimulates the cognitive operations of meaningful learning can support the development of conceptual reasoning ability to a greater degree than traditional homework problems. The design of this intervention provides a structure for a homework assignment that could be adapted to fit foundational principles in a variety of engineering courses.
This research was conducted by an interdisciplinary team at Pennsylvania State University. At the time, Peggy Van Meter, Carla Firetto, Chelsea Cameron, and Charlyn Shaw were all members of the Educational Psychology program. Stephen Turns and Thomas Litzinger were engineering faculty members. This article is based on "Improving Students' Conceptual Reasoning by Prompting Cognitive Operations" in the April 2016 Journal of Engineering Education. The work was supported with NSF grant TUES 1043833.