Monthly Archives: May 2017

What are software vulnerabilities, and why are there so many of them?

Thomas Holt, Michigan State University

The recent WannaCry ransomware attack spread like wildfire, taking advantage of flaws in the Windows operating system to take control of hundreds of thousands of computers worldwide. But what exactly does that mean?

It can be useful to think of hackers as burglars and malicious software as their burglary tools. Having researched cybercrime and technology use among criminal populations for more than a decade, I know that both types of miscreants want to find ways into secure places – computers and networks, and homes and businesses. They have a range of options for how to get in.

Some burglars may choose to simply smash in a window or door with a crowbar, while others may be stealthier and try to pick a lock or sneak in a door that was left open. Hackers operate in a similar fashion, though they have more potential points of entry than a burglar, who is typically dependent on windows or doors.

The weaknesses hackers exploit aren’t broken windowpanes or rusty hinges. Rather, they are flaws in software programs running on a computer. Programs are written by humans, and are inherently imperfect. Nobody writes software completely free of errors that create openings for potential attackers.

What are these flaws, really?

In simple terms, a vulnerability can be an error in the way that user management occurs in the system, an error in the code or a flaw in how it responds to certain requests. One common vulnerability allows an attack called a SQL injection. It works on websites that query databases, such as to search for keywords. An attacker creates a query that itself contains code in a database programming language called SQL.

If a site is not properly protected, its search function will execute the SQL commands, which can allow the attacker access to the database and potentially control of the website.
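To make the mechanics concrete, here is a minimal, self-contained Python sketch of the flaw just described. The table, its contents and the “keyword” are all invented for illustration; the same pattern applies to any database-backed search box.

```python
import sqlite3

# An invented one-table database standing in for a website's data store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE articles (title TEXT)")
conn.execute("INSERT INTO articles VALUES ('Gardening tips')")
conn.execute("INSERT INTO articles VALUES ('Unpublished draft post')")

def search_vulnerable(keyword):
    # UNSAFE: user input is pasted straight into the SQL string, so a
    # crafted "keyword" is executed as SQL instead of treated as data.
    query = f"SELECT title FROM articles WHERE title LIKE '%{keyword}%'"
    return conn.execute(query).fetchall()

def search_safe(keyword):
    # SAFE: a parameterised query keeps the input as plain data.
    return conn.execute(
        "SELECT title FROM articles WHERE title LIKE ?", (f"%{keyword}%",)
    ).fetchall()

attack = "' OR '1'='1"            # a classic injection payload
print(search_vulnerable(attack))  # returns every row in the table
print(search_safe(attack))        # returns nothing: input stayed data
```

The parameterised version is the standard defence: the database driver never lets the user's input be interpreted as SQL commands.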

Similarly, many people use programs built on widely deployed software platforms, such as the Adobe Flash Player browser plug-in and the Java platform that underpins many Android applications. Platforms like these have had numerous vulnerabilities, all of which can be exploited in different ways – most commonly by getting individuals to download bogus “plug-ins” or “codecs”. These downloads actually contain malicious code that takes advantage of the vulnerability and compromises the machine.

Flaws are everywhere

Vulnerabilities exist in all types of software. Several versions of the Microsoft Windows operating system, for instance, were open to the WannaCry attack. The popular open-source web browser Firefox has had more than 100 vulnerabilities identified in its code each year since 2009, and 15 different vulnerabilities have been identified in Microsoft Internet Explorer browser variants since the start of 2017.

Software development is not a perfect process. Programmers often work on timelines set by management teams that attempt to set reasonable goals, though it can be a challenge to meet those deadlines. As a result, developers do their best to design secure products as they progress but may not be able to identify all flaws before an anticipated release date. Delays may be costly; many companies will release an initial version of a product and then, when they find problems (or get reports from users or researchers), fix them by releasing security updates, sometimes called patches because they cover the holes.

But software companies can’t support their products forever – to stay in business, they have to keep improving programs and selling copies of the updated versions. So after some amount of time goes by, they stop issuing patches for older programs.

Not every customer buys the latest software, though – so many users are still running old programs that might have unpatched flaws. That gives attackers a chance to find weaknesses in old software, even if newer versions don’t have the same flaws.

Exploiting the weaknesses

Once an attacker identifies a vulnerability, they can write a new computer program – an exploit – that uses that opportunity to get into a machine and take it over. In this respect, an exploit is to a hacker what crowbars and lock picks are to a burglar: a tool for getting into a place that is supposed to be secure.

They find a weak point in the system’s defenses, perhaps a network connection that hasn’t been properly secured. If attackers can manage to gain contact with a target computer, they can learn about what sort of system it is. That lets them identify particular approaches – accessing specific files or running certain programs – that can give them increasing control over the machine and its data. In recent years, attackers began targeting web browsers, which are allowed to connect to the internet and often to run small programs; they have many vulnerabilities that can be exploited. Those initial openings can give an attacker control of a target computer, which in turn can be used as a point of intrusion into a larger sensitive network.
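As a hedged illustration of that reconnaissance step, the Python sketch below performs a simple “banner grab”: it connects to a network service and reads whatever the service announces about itself, which often includes a software name and version that can be matched against known vulnerabilities. The host and port are placeholders, and it should only ever be pointed at machines you own or administer; defenders use exactly the same technique to audit their own networks.

```python
import socket

def grab_banner(host, port, timeout=3.0):
    """Connect to a service and read its opening announcement, if any.

    Many services (SSH, FTP, some mail servers) volunteer a banner that
    names the software and version - the 'what sort of system is this?'
    step described above. Point this only at systems you administer.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout) as sock:
            sock.settimeout(timeout)
            return sock.recv(1024).decode(errors="replace").strip()
    except OSError:
        return None  # closed port, timeout or unreachable host

# Placeholder target; an SSH server typically replies with something
# like "SSH-2.0-OpenSSH_8.9".
print(grab_banner("127.0.0.1", 22))
```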

Sometimes the vulnerabilities are discovered by the software developers themselves, or by users or researchers who alert the company that a fix is needed. But other times, hackers or government spy agencies figure out how to break into systems and don’t tell the company. These weaknesses are called “zero days,” because the developer has had zero days to fix them. Until a patch can be created and distributed to users, the software remains open to attack.

The best way users can protect themselves is to install software updates regularly, as soon as they become available.

Thomas Holt, Associate Professor of Criminal Justice, Michigan State University

This article was originally published on The Conversation. Read the original article.

Star Wars turns 40 and it still inspires our real life space junkies

Jonathan Roberts, Queensland University of Technology

It was 40 years ago today, on May 25, 1977, that Star Wars first burst onto cinema screens, and from that time the world changed for the better.

Star Wars introduced the world to Jedi knights with lightsabers, an evil empire building a moon-sized planet-killing weapon, a rebel alliance with X-wing fighters and countless cool droids that were often smarter than their owners.

The original trailer.

Quite why Star Wars was such a massive hit has been debated ever since. It was clearly not for the dialogue.

It was probably due to the fast-paced action. In fact, Star Wars popularised the notion that some films do not need opening credits, just an opening crawl to set the scene.

Director George Lucas wanted the action to start as soon as the film did, and for audiences to be engrossed from the first few seconds.

Star Wars was re-released in 1981 as Episode IV: A New Hope.

What made Star Wars different to the already loved Star Trek TV series was that Star Wars was not a prediction of our human future. Instead it was a story set in another galaxy in the ancient past.

Some of us had our lives and careers shaped by Star Wars, and by longing to create the things we saw when we were young.

Forty years on, who and what has been shaped by this revolutionary movie?

Space technology

The first Star Wars film was revolutionary in its depiction of high-speed battles between spaceships.

Star Wars, one of the original film posters.
20th Century Fox/IMDB

The dogfights around the Death Star seemed so realistic, even though it was not obvious how some of the spaceships actually manoeuvred so well.

When I took spacecraft design courses at university in the late 1980s (as part of my undergraduate degree), I did not dream that fellow Star Wars fans might one day be influential enough to actually design real spacecraft.

We were taught that bringing a rocket back to Earth from space was impossible. I now realise that my lecturers were probably not Star Wars fans.

The billionaire inventor and entrepreneur Elon Musk is one of those millions of mega Star Wars fans. He says that Star Wars was the first movie he ever saw, and that it sparked an obsession with space travel and with turning humans from a single-planet species into a multi-planet civilisation.

In 2002, Musk created the Space Exploration Technologies Corporation, better known as SpaceX, with the stated aim of creating spacecraft to regularly fly hundreds of humans to and from Mars.

To Mars.

Musk named his series of rockets “Falcon”, after Han Solo’s Millennium Falcon. And in 2017, a Falcon rocket became the first orbital-class booster to return from space, land and later re-fly back into space.

A SpaceX Falcon 9 lands after returning from space. Falcon rockets are named after the Millennium Falcon from Star Wars.

In 2000, fellow billionaire inventor Jeff Bezos started his rocket and spaceship company Blue Origin off the back of his success creating Amazon. His New Shepard rocket was the first suborbital booster to return from space, land and later re-fly back into space.

New Shepard rocket: takeoff and landing.

Bezos is more of a Trekkie. He is so obsessed with Star Trek that he has even acted in it, appearing as an alien in the 2016 movie Star Trek Beyond.

At this point, the Star Wars mega-fan (Musk) is ahead of the Trekkie (Bezos) in delivering commercial space flight with reused rockets. But only time will tell who will win.

Speeders

Star Wars introduced us to the Landspeeder. This is the car-like vehicle that Luke Skywalker uses to get to and from the family moisture farm, and which he sells so he can part-pay Han Solo to fly with him to the Alderaan system.

Luke’s X-34 landspeeder is very much like a hovercraft, a technology that existed long before Star Wars. But hovercraft are noisy and kick up a lot of dust, which is not great in the desert driving conditions encountered on Tatooine!

In 1978, a toy landspeeder was the must-have toy, and I was lucky enough to have one. I still have it, of course. The way it appeared to float across the floor on its highly sprung and hidden wheels was brilliant design.

Subsequent Star Wars films such as Return of the Jedi showed us speeder bikes, and since then engineers have tried to replicate these amazing vehicles.

Some great engineering efforts include the Jetovator speeder bike that works over water and connects to a jet ski. The makers were clearly inspired by Star Wars.

The Jetovator speeder bike works on water.

Others have recently created and tested hoverbikes that, if fully commercialised, would be very close to the speeder bikes of Star Wars.

Colin Furze’s homemade hoverbike.

One group has even made a speeder, the Aero-X, and tested it in the desert to ensure that Luke would be able to use it if need be.

Remember Luke Skywalker’s Landspeeder in Star Wars?

Droids

But for me, it was the droids of Star Wars that had the greatest impact. There can be no greater pair of onscreen robots than R2-D2 and C-3PO. They were perfect.

Some original Star Wars scenes have been recreated in LEGO (with some modification).

I have written before about Star Wars and robots. The vision that George Lucas and his team had in creating these robots (and the others that are found in the original 1977 movie) has had a major impact on robotics development, by inspiring many current day roboticists.

We are beginning to see real, high-quality automatic translation services – something C-3PO was designed to do. We have medical robots, military robots and even farm robots.

All of these were shown in Star Wars. Our present-day robots are not as capable as the Star Wars robots, but we roboticists are working hard to close the gap.

New fans

It is unlikely that any film in the future will be as surprising as Star Wars was. It was new and exciting and surely that is one of the reasons for its success.

And yet new Star Wars fans are being born every day. It helps that many of their parents and grandparents are possibly also Star Wars fans, and that at the moment there is a new Star Wars film out every year.

If the love of Star Wars is handed down the generations then who knows what it will have inspired in another 40 years’ time.

The next Star Wars film is due out in December this year.

Jonathan Roberts, Professor in Robotics, Queensland University of Technology

This article was originally published on The Conversation. Read the original article.

Designing games that change perceptions, opinions and even players’ real-life actions

Lindsay Grace, American University School of Communication

In 1904, Lizzie Magie patented “The Landlord’s Game,” a board game about property ownership, with the specific goal of teaching players how a system of land grabbing impoverishes tenants and enriches property owners. The game, which went on to become the mass-market classic “Monopoly,” was the first widely recognized example of what is today called “persuasive play.”

‘The Landlord’s Game,’ the first persuasive game.
U.S. Patent and Trademark Office

Games offer a unique opportunity to persuade their audiences, because players are not simply listening, reading or interpreting the game’s message – they are subscribing to it. To play a game, players must accept its rules and then operate within the designed experience. As a result, games can change our perceptions, and ultimately our actions.

In American University’s Game Lab and Studio, which I direct, we’re creating a wide range of persuasive games to test various strategies of persuasion and to gauge players’ responses. We have developed games to highlight the problems with using delivery drones, encourage cultural understanding and assess understanding of mathematics.

Teaching math with a video game: Aim the laser with mathematical formulas.

And we’re expanding the realm beyond education and health. With support from the Knight Foundation, we’ve been researching ways to connect games and journalism to engage people more deeply with issues in the news. (The Knight Foundation has also funded The Conversation US.) Our newest game, helping people and news organizations distinguish between real news and fake reports, is out now.

Game play involves action

When talking about games as a persuasive tool, I often repeat the notion that readers read, viewers watch and players do. It’s not a coincidence that when sitting down to learn a new game, a prospective player most often asks, “What do you do?” Persuasive play offers people the opportunity to do more than merely read or watch – they can engage with the game’s subject matter in fundamentally valuable ways.

In our work, we want to enhance people’s understanding of complex topics, change their perspective and encourage them to think critically about the world around them.

For example, Game Lab faculty member Bob Hone worked with the National Institutes of Mental Health to create a game that is now in clinical trials as a treatment for anxiety without medication. The game, “Seeing the Good Side,” asks players to find numbers hidden in detailed drawings of classroom scenes. In the process, players practice calming themselves by looking around an entire space rather than focusing on one person’s anxiety-provoking angry face.

Games can relieve stress in other ways, too. A recent study we conducted with Educational Testing Service found that a game we created to replace multiple choice standardized tests offered a more positive test-taking experience for students. In addition, students were better able to demonstrate their abilities.

Turning to the news

Persuasive play is most common in education and health, but it’s becoming more common in other fields too. We’ve been working on using game design techniques to get people to engage with the news. We’ve also proposed adapting lessons from gaming to attract and retain audiences for news websites.

A game about reality: The beginning of a commuter’s day starts with an alarm clock.
American University Game Lab, CC BY-ND

One project we did involved creating a game for WAMU, a National Public Radio affiliate in Washington, D.C. The station was covering public transportation failures in the city, specifically of the D.C. Metro subway and train service. We designed a game to increase audience engagement with the material.

WAMU sent an experienced reporter, Maggie Farley, into the field to interview a variety of Metro riders about their experience. We aggregated those stories into a single narrative and then made that story playable. In our “Commuter Challenge,” players have to make it through a week on the Metro system as a low-wage employee in the D.C. Metro service area.

The problems facing players align with real-world trade-offs the reporter found people making: Should a worker choose a pricey ride-share service to get to the daycare in time for pickup or save money by taking the train but risk incurring fees for being late? Should a worker trust the announcement that the train will be only 15 minutes late or decline an extra shift because of rail service outages? Players have to balance their family, work and financial demands, in hopes of ending the week without running out of money or getting fired for being late to work.
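To show how trade-offs like these become game mechanics, here is a toy Python sketch of one daily commute decision. Every number in it is invented for illustration; none comes from the actual “Commuter Challenge”.

```python
import random

random.seed(7)  # make the illustrative run reproducible

money, strikes = 80.0, 0  # week's budget; too many strikes means fired

for day in range(5):                       # one working week
    fares_still_needed = 2.75 * (4 - day)  # keep enough for later train fares
    if money >= 22.0 + fares_still_needed:
        money -= 22.0                      # ride-share: pricey but punctual
    else:
        money -= 2.75                      # train: cheap...
        if random.random() < 0.4:          # ...but delays are common
            money -= 15.0                  # daycare late-pickup fee
            strikes += 1                   # a late arrival counts against you

print(f"week over: ${money:.2f} left, {strikes} strike(s), fired={strikes >= 3}")
```

The point the design makes falls out of the arithmetic: a player with a cushion can buy reliability, while a player without one is forced to gamble on the train and absorb the fees when it fails.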

Boosting connections

WAMU found that the game got four times more visits than other Metro-related articles on its site. And people spent four times longer playing the game than they did reading or listening to the standard news coverage. People, it seemed, were far more eager to play a game about the plight of Metro riders than they were to hear about it.

Most recently, we released a game called “Factitious.” It works like a simple quiz, giving players a headline and an article, at the bottom of which is a link to reveal the article’s source. Players must decide whether a particular article is real news or fake. The game tells the player the correct answer and offers hints for improvement. This helps players learn the importance of reading skeptically and checking sources before deciding what to believe.

In addition, for each article we can see how many people understood or misunderstood it as real or fake news and how long they took to make the decision. When we change headlines, images or text, we can monitor how players’ responses adjust, and report to news organizations on how those influence readers’ understanding. We hope games like this one become a model for getting honest feedback from the general population.
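A minimal sketch of how that quiz loop and its per-article logging might fit together is below. The headlines and the storage scheme are invented; this is not Factitious’s actual code.

```python
import time

# Invented sample items; "is_real" is the ground truth for each headline.
articles = [
    {"id": 1, "headline": "City council approves new transit budget", "is_real": True},
    {"id": 2, "headline": "Moon confirmed to be made of cheese", "is_real": False},
]
responses = []  # one row per decision a player makes

for article in articles:
    start = time.monotonic()
    answer = input(f"REAL or FAKE? {article['headline']} > ").strip().upper()
    responses.append({
        "article_id": article["id"],
        "correct": answer.startswith("R") == article["is_real"],
        "seconds": round(time.monotonic() - start, 1),
    })

# The aggregate view a newsroom might use: how often each headline fooled
# players, and how long the decision took on average.
for article in articles:
    rows = [r for r in responses if r["article_id"] == article["id"]]
    share = sum(r["correct"] for r in rows) / len(rows)
    avg_t = sum(r["seconds"] for r in rows) / len(rows)
    print(f"{article['headline']}: {share:.0%} correct, {avg_t:.1f}s average")
```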

While the original “Monopoly” aimed to explain the drawbacks of land grabbing, contemporary persuasive play has even grander hopes. This new generation of games aims to learn about its players, change their perceptions and revise their behavior in less time than it takes to build a hotel on Park Place.

Lindsay Grace, Associate Professor of Communication; Director, American University Game Lab and Studio, American University School of Communication

This article was originally published on The Conversation. Read the original article.

The first results from the Juno mission are in – and they already challenge our understanding of Jupiter

Leigh Fletcher, University of Leicester

Ten months after its nerve-wracking arrival at Jupiter, NASA’s Juno mission has started to deliver – forcing scientists to reevaluate what they thought they knew about the giant planet. The first findings from Juno, published in Science, indicate that many aspects of Jupiter have defied expectation – including the strength of its magnetic field, the shape of its core, the distribution of ammonia gas and the weather at its poles. It certainly makes this an exciting time to be a Jupiter scientist.

Juno arrived at Jupiter in July 2016, and began a long, looping first orbit that took it far out from the planet before zipping back in for its first scientific close-up (perijove) on August 27. It is this fleeting meeting that the new studies are based on. Today, despite initial problems with Juno’s engine and spacecraft software, the mission has settled into a regular pattern of close perijoves every 53.5 days – the sixth such flyby happened on May 19, and the seventh will be on July 11.

Mysteries in the deep

Jupiter seen by Juno.
Credits: NASA/JPL-Caltech/SwRI/MSSS/Gabriel Fiset

One of Juno’s key strengths is its ability to peer through the overlying shroud of clouds to study the gases below, such as the cloud-forming substance ammonia. Flows of ammonia help form Jupiter’s distinctive features. The gas was expected to be well mixed, or drenched, below the topmost clouds. That idea has been turned on its head – the ammonia concentration is much less than expected.

Intriguingly, much of the ammonia is concentrated into an equatorial plume, rising from the depths of Jupiter to the cloud-tops due to some powerful up-welling force. Scientists are likening this to Earth’s Hadley cell, with plumes dredging ammonia up from hundreds of kilometres below.

We’ve known that ammonia is enhanced at Jupiter’s equator for some time, but we never knew how deep this column went. However, it’s important to remember that this is only one location on Jupiter, and that Earth-based infrared observations suggest that the plume may not be this strong elsewhere around Jupiter’s equator, but could be patchy. Only with more perijove passes will we begin to understand the strange dynamics of Jupiter’s tropics.

We’ve never been able to see this deep before, so even the first observations from Juno’s microwave instrument provide a treasure trove of new insights. These show that the banded structure that we see at the surface is really just the tip of the iceberg – Jupiter exhibits banding all the way down to 350km. This is much deeper than what we’ve generally thought of as Jupiter’s “weather layer” in the upper few tens of kilometres. What’s more, that structure isn’t the same all the way down – it varies with depth, indicating a large, complex circulation pattern.

Gravity and magnetic fields

The surprises didn’t stop there. Juno can probe even deeper into the planet by monitoring the small tweaks that the gravity field of Jupiter’s interior makes to the spacecraft’s orbit. Ultimately, these measurements will be used to assess Jupiter’s core, although that cannot be done from a single perijove pass. Most scientists believe that the planet has a dense core made up of around ten Earth masses of heavy elements and occupying a small fraction of the radius. But the new measurements are inconsistent with any previous model – possibly hinting at a “fluffy” core dispersed out to half of Jupiter’s radius.

Indeed, Jupiter’s interior appears to be anything but uniform. We have to remember that scientists have spent years developing models of the interior of Jupiter based on sparse data taken from great distances – Juno is now testing these models to the extreme because it is flying so close.

Infrared image showing Jupiter’s aurora (blue) and internal glow (red).
Credit: NASA/JPL-Caltech/SwRI/ASI/INAF/JIRAM

Jupiter has the most intense planetary magnetic field in the solar system, causing a pile-up where the solar wind is slowed down (known as the bow shock). Juno first passed through this region and into the Jovian magnetosphere on June 24. At its closest approach, Jupiter’s magnetic field was twice as strong as any model had predicted, and much more irregular.

That’s important, because it suggests that the magnetic field could be generated at shallower depths than expected, above the “metallic hydrogen” layer that is thought to exist at very high pressures. If proven, this has substantial implications for studies of magnetic fields at all of the giant planets. Perhaps the Cassini mission will be able to confirm whether this is the case as it makes its final measurements of Saturn’s magnetic field before it crashes into the planet in September.

Chaos at the poles

If you’ve ever been lucky enough to see Jupiter through a telescope, you’ll be familiar with the organised, striped structure of whiter zones and dark brown belts. These colourful bands are bordered by powerful jet streams whizzing east and west around the planet. On Saturn, this organised, banded structure persists all the way to the poles, with one jet showing a strange hexagonal wave pattern encircling a hurricane-like polar cyclone. But Jupiter’s poles are different. Gone is the organised structure of jets. There’s no evidence for hexagons or anything like it. And instead of one cyclone, we see multitudes, surrounded by a whole host of chaotic and turbulent features.

Jupiter’s poles.
J.E.P. Connerney et al., Science (2017)

With the ability to see structures as small as 50km, Juno’s camera has revealed numerous bright cyclones of a variety of appearances – some appear sharp, some have clear spirals, some are fluffy and diffuse – and the largest is some 1,400km across, roughly the distance from London to Majorca. These bright storms sit on top of dark clouds, giving the appearance of “floating” on a dark sea, and it will be some time before we understand the lifetimes and motions of these storms.

You might imagine that, faced with throwing out models that have taken careers to develop, scientists might be a little glum. But the exact opposite is true. A mission like Juno, accessing regions that no robotic spacecraft has ever probed before, is designed to test the models to the extreme. If they break, then the search to find the missing pieces of the puzzle will provide deeper insights into the physics of the Jovian system. All these surprises have come from just the first perijove encounter, and I’m sure there are plenty more revelations to come.

Leigh Fletcher, Royal Society Research Fellow, University of Leicester

This article was originally published on The Conversation. Read the original article.

Scientists are accidentally helping poachers drive rare species to extinction

Benjamin Scheele, Australian National University and David Lindenmayer, Australian National University

If you open Google and start typing “Chinese cave gecko”, the text will auto-populate to “Chinese cave gecko for sale” – just US$150, with delivery. This extremely rare species is just one of an increasingly large number of animals being pushed to extinction in the wild by animal trafficking.

What’s shocking is that the illegal trade in Chinese cave geckoes began so soon after they were first scientifically described in the early 2000s.

It’s not an isolated case; poachers are trawling scientific papers for information on the location and habits of new, rare species.

As we argue in an essay published today in Science, scientists may have to rethink how much information we publicly publish. Ironically, the principles of open access and transparency have led to the creation of detailed online databases that pose a very real threat to endangered species.

We have personally experienced this, in our research on the endangered pink-tailed worm-lizard, a startling creature that resembles a snake. Biologists working in New South Wales are required to provide location data on all species they discover during scientific surveys to an online wildlife atlas.

But after we published our data, the landowners with whom we worked began to find trespassers on their properties. The interlopers had scoured online wildlife atlases. As well as putting animals at risk, this undermines vital long-term relationships between researchers and landowners.

The endangered pink-tailed worm-lizard (Aprasia parapulchella).
Author provided

The illegal trade in wildlife has exploded online. Several recently described species have been devastated by poaching almost immediately after appearing in the scientific literature. Particularly at risk are animals with small geographic ranges and specialised habitats, which can be most easily pinpointed.

Poaching isn’t the only problem that is exacerbated by unrestricted access to information on rare and endangered species. Overzealous wildlife enthusiasts are increasingly scanning scientific papers, government and NGO reports, and wildlife atlases to track down unusual species to photograph or handle.

This can seriously disturb the animals, destroy specialised microhabitats, and spread disease. A striking example is the recent outbreak in Europe of an amphibian chytrid fungus, which essentially “eats” the skin of salamanders.

This pathogen was introduced from Asia through wildlife trade, and has already driven some fire salamander populations to extinction.

Fire salamanders have been devastated by diseases introduced through the wildlife trade.
Erwin Gruber

Rethinking unrestricted access

In an era when poachers can arm themselves with the latest scientific data, we must urgently rethink whether it is appropriate to put detailed location and habitat information into the public domain.

We argue that before publishing, scientists must ask themselves: will this information aid or harm conservation efforts? Is this species particularly vulnerable to disruption? Is it slow-growing and long-lived? Is it likely to be poached?

Fortunately, this calculus will only be relevant in a few cases. Researchers might feel an intellectual passion for the least lovable subjects, but when it comes to poaching, it is generally only charismatic and attractive animals that have broad commercial appeal.

But in high-risk cases, where economically valuable species lack adequate protection, scientists need to consider censoring themselves to avoid unintentionally contributing to species declines.

Restricting information on rare and endangered species has trade-offs, and might inhibit some conservation efforts. Yet, much useful information can still be openly published without including specific details that could help the nefarious (or misguided) to find a vulnerable species.

There are signs people are beginning to recognise this problem and adapt to it. For example, new species descriptions are now being published without location data or habitat descriptions.

Biologists can take a lesson from other fields such as palaeontology, where important fossil sites are often kept secret to avoid illegal collection. Similar practices are also common in archaeology.

Restricting the open publication of scientifically and socially important information brings its own challenges, and we don’t have all the answers. For example, the dilemma of organising secure databases to collate data on a global scale remains unresolved.

For the most part, the move towards making research freely available is positive, encouraging collaboration and driving new discoveries. But legal or academic requirements to publish location data may be dangerously out of step with real-life risks.

Biologists have a centuries-old tradition of publishing information on rare and endangered species. For much of this history it was an innocuous practice, but as the world changes, scientists must rethink old norms.

Benjamin Scheele, Postdoctoral Research Fellow in Ecology, Australian National University and David Lindenmayer, Professor, The Fenner School of Environment and Society, Australian National University

This article was originally published on The Conversation. Read the original article.

No problem too big #1: Artificial intelligence and killer robots

Adam Hulbert, UNSW

This is the first episode of a special Speaking With podcast series titled No Problem Too Big, where a panel of artists and researchers speculate on the end of the world as though it has already happened.


It’s not the world we grew up in. Not since artificial intelligence. The machines have taken control.

Three fearless researchers gather in the post-apocalyptic twilight: a computer scientist, a mechanical engineer and a sci-fi author.

Together, they consider the implications of military robots and autonomous everything, and discover that the most horrifying post-apocalyptic scenario might look something like unrequited robot love.


Joanne Anderton is an award-winning author of speculative fiction stories for anyone who likes their worlds a little different. More information about Joanne and her novels can be found here.


No Problem Too Big is created and hosted by Adam Hulbert, who lectures in media and sonic arts at the University of New South Wales. It is produced with the support of The Conversation and University of New South Wales.

Sound design by Adam Hulbert.

Theme music by Phonkubot.

Additional music:

Beast/Decay/Mist by Haunted Me (via Free Music Archive)

Humming Ghost by Haunted Me (via Free Music Archive)

Additional audio:

Stephen Hawking interview, BBC News

Adam Hulbert, Sonic Arts Convener, UNSW

This article was originally published on The Conversation. Read the original article.

African scientists are punching above their weight and changing the world

John Butler-Adam, University of Pretoria

Over the past five years, Africa’s contributions to the world’s research – that is, new knowledge – have varied from a low of 0.7% to the present and highest level of 1.1%.

There are many reasons for Africa’s small contribution to world research. One of them, sadly, is that at least some of this new knowledge is produced by African scientists working beyond their own countries and continent. Many have chosen to leave because they feel the facilities and funding opportunities abroad are better than those “at home”.

It’s also important to point out that the sum of knowledge generated each year, including Africa’s contribution to it, is measured using research articles published by scientists and scholars in scientifically recognised journals. This means some of the actual work that’s being done isn’t getting the attention or credit it deserves, yet. The journal system is not a perfect way of assessing scientific productivity. For now, though, it’s a means that can be applied fairly to document peer reviewed research from around the world.

These concerns aside, there is, I’m happy to report, much to celebrate about research in Africa. For starters, the world’s largest collection of peer-reviewed, African-published journals is growing all the time. African Journals Online currently carries 521 titles across a range of subjects and disciplines.

Women researchers are also well represented, though there’s still work to be done: three out of 10 sub-Saharan researchers are women.

The continent’s researchers are working on challenges as varied as astrophysics, malaria, HIV/AIDS and agricultural productivity. They are making significant advances in these and many other critical areas. The projects I talk about here are just a few examples of the remarkable work Africa’s scientists are doing on and for the continent.

A range of research

Africa is establishing itself as a global player in astronomical research. The Southern African Large Telescope (SALT) is the largest single optical telescope of its kind in the southern hemisphere. Work undertaken at this facility, in South Africa’s Northern Cape province, has resulted in the publication of close to 200 research papers.

The telescope has support from and working relationships with universities in 10 countries. Its recent work helped a team of South African and international collaborators to uncover a previously unknown major supercluster in the constellation Vela.

SALT has two siblings: MeerKAT, which is already producing results, and the Square Kilometre Array, which is still being developed.

In a very different sphere, Professors Salim and Quarraisha Abdool Karim have won African and international awards for their groundbreaking and lifesaving work in the area of HIV/AIDS. Professor Glenda Gray, the CEO of South Africa’s Medical Research Council, has been honoured by Time magazine as one of the world’s 100 most influential people. She, too, is a pioneer in HIV/AIDS research.

In Kenya, dedicated research institutes are tackling agricultural challenges in areas like crop production and livestock health. This not only boosts Africa’s research output, but contributes greatly to rural development on the continent.

Elsewhere, Nigeria has established a number of research institutes that focus on a range of agricultural challenges. Research is also being undertaken in the important area of oceanography.

Although it operates from the University of Cape Town, the African Climate and Development Initiative has been working as a partner in Mozambique. There it’s addressing the critical – and interrelated – challenges of climate change and adaptation responses for horticulture, cassava and the red meat value chain. This is important work in one of Africa’s poorest countries, which is battling drought and hunger.

And then there’s also research “out of Africa”. This involves discoveries about the human past and the origins of Homo sapiens. Historically, this sort of research was often undertaken by people who didn’t come from Africa. More and more, though, African scholars have come to the fore. The scientists who discovered a new human ancestor and mapped a cave system that’s serving up amazing fossil evidence are following in giant footsteps: those of Robert Broom, Raymond Dart and Phillip Tobias.

Research that matters

What does all of this tell us about research in Africa? Perhaps three ideas are worth considering.

First, while Africa and its universities, institutes and scientists need to make far greater contributions to world knowledge, high quality and important research is happening. Its overall contribution might be small, but smart people are undertaking smart and important work.

Secondly, the range of research being undertaken is remarkable in view of the size of Africa’s overall contribution: from galaxies to viruses; from agriculture to malaria; and from drought to oceanography.

And thirdly, it is notable, and of great significance, that irrespective of the disciplines involved, the research is tackling both international concerns and those specific to the African continent and its people’s needs.

Yes, 1.1% is a small figure. What’s actually happening, on the other hand, adds up to a pretty impressive score card.

John Butler-Adam, Editor-in-Chief of the South African Journal of Science and Consultant, Vice Principal for Research and Graduate Education, University of Pretoria

This article was originally published on The Conversation. Read the original article.

To stay in the game universities need to work with tech companies

Martin Hall, University of Cape Town

The world of higher and professional education is changing rapidly. Digitally-enabled learning, in all its forms, is here to stay. Over the last five years, massive open online courses (MOOCs) have enabled universities to share their expertise with millions across the world. This shows how rapidly developing digital technologies can make learning accessible.

These new technologies are shaking up traditional classrooms, too. And as the nature of work changes, professionals are turning to high-level online courses to keep pace with new demands.

But much of this new technology is the preserve of private sector companies. This means that universities have to work with them. Yet partnerships with for-profit companies still don’t feel right for many in the higher education sphere. Knowledge has long been seen as a public good, and education as a basic right. Many of today’s universities were shaped by the principles of public funding.

This world was changing well before the disruptive impact of digital technologies, with tuition fees rising above the rate of inflation and the emergence of private universities as part of the higher education landscape. But there’s still unease about technology and its role. The reality, though, is that higher education institutions will have to get over their queasiness if they’re to survive in this brave new world.

Universities may not have the know-how or the money to match the innovations coming onto the market through private tech companies. The decision by Nasdaq-listed technology education (edtech) company 2U to acquire Cape Town-based startup GetSmarter for R1.4 billion (US$103 million) is the largest price tag yet for a South African company working in digital education.

This is an indication of what it would cost a university to set up a full online division. Few institutions will have this money, or the ability to raise it. The alternative is to reconsider the advantages of public-private partnerships, taking care to retain authority over quality. For many universities this could be the only way of keeping pace with the changing world of education.

The story of a start up

The story of how GetSmarter got off the ground is a textbook case of how a simple idea, combined with guts and luck, can reap huge rewards.

GetSmarter was launched in 2008 with a tiny budget and offered just one online course, in wine evaluation. By 2016 its annual revenues had grown to about R227 million. The foundation for this expansion has been a wide range of courses developed and offered in partnership with the University of Cape Town and, more recently, the University of the Witwatersrand and Stellenbosch University.

GetSmarter’s key breakthrough into the international realm came with professional programmes in association with the Massachusetts Institute of Technology (MIT) and Cambridge University. GetSmarter’s first course with HarvardX will soon be presented.

After the acquisition was announced I talked to the company’s CEO, Sam Paddock, who co-founded the business with his brother Rob. We discussed the lessons for other small digital companies – and for universities that are mulling the value of digital learning.

The Paddock brothers leveraged the cash flow from their father’s niche law firm to launch their first online course. They then used upfront payments for that course and the courses that followed to keep financing their next offerings. In the nine years that followed, edtech has become a crowded and complex field.

GetSmarter’s purchase price has garnered a lot of media attention: it’s high, in US dollar terms, and is a vote of confidence in the company. The price represents a valuation of a company’s assets, intellectual property and know-how, and strategic positioning for the future.

But what does it say about the kinds of investments and partnerships that conventional universities will have to make as they adapt to the full disruption from new digital technologies? The key aspect of GetSmarter’s success is how its partnership with universities has played out. As Paddock points out:

We are starting to realise the potential of public-private partnerships, where the credibility and resources of great universities can be combined with the skills of nimble private operators.

Good news for the digital economy

This acquisition is also good news for South Africa’s digital economy. Paddock says GetSmarter will employ more South African graduates and give them international experience and expertise.

And, he says, ecosystems often develop from one significant investment in an individual company. “This was how Silicon Valley started, as well as London’s ‘silicon roundabout’.” Cape Town, GetSmarter’s home city, has been trumpeted as South Africa’s own Silicon Valley: “Silicon Cape”.

The opportunity to lead in digital innovation and application has been widely recognised, for example through the work of Accelerate Cape Town. The Cape Innovation and Technology Initiative (CiTi) has a range of initiatives underway, including a three-year partnership with Telkom intended to build the digital workforce.

Last year, cellphone giant Vodacom announced an investment of R600m to assist in developing South Africa’s digital skills.

GetSmarter’s big win is good news and proof – if universities needed it – that such initiatives can bolster higher education’s offering in a rapidly changing world. Universities in Africa know that they need to keep up with the relentless march of digitally enabled learning. GetSmarter’s journey from bootstrapped startup to billion-rand enterprise is a case study worthy of attention.

Martin Hall, Emeritus Professor, MTN Solution Space Graduate School of Business, University of Cape Town

This article was originally published on The Conversation. Read the original article.

Why using AI to sentence criminals is a dangerous idea

Christopher Markou, University of Cambridge

Artificial intelligence is already helping determine your future – whether it’s your Netflix viewing preferences, your suitability for a mortgage or your compatibility with a prospective employer. But can we agree, at least for now, that having an AI determine your guilt or innocence in a court of law is a step too far?

Worryingly, it seems this may already be happening. When American Chief Justice John Roberts recently attended an event, he was asked whether he could foresee a day “when smart machines, driven with artificial intelligences, will assist with courtroom fact finding or, more controversially even, judicial decision making”. He responded: “It’s a day that’s here and it’s putting a significant strain on how the judiciary goes about doing things”.

Roberts might have been referring to the recent case of Eric Loomis, who was sentenced to six years in prison based at least in part on the recommendation of a private company’s secret proprietary software. Loomis, who has a criminal history and was sentenced for having fled the police in a stolen car, now asserts that his right to due process was violated, as neither he nor his representatives were able to scrutinise or challenge the algorithm behind the recommendation.

The report was produced by a software product called Compas, which is marketed and sold by Northpointe Inc to courts. The program is one incarnation of a new trend within AI research: tools designed to help judges make “better” – or at least more data-centric – decisions in court.

While specific details of Loomis’ report remain sealed, the document is likely to contain a number of charts and diagrams quantifying Loomis’ life, behaviour and likelihood of re-offending. It may also include his age, race, gender identity, browsing habits and, I don’t know … measurements of his skull. The point is we don’t know.

What we do know is that the prosecutor in the case told the judge that Loomis displayed “a high risk of violence, high risk of recidivism, high pretrial risk.” This is standard stuff when it comes to sentencing. The judge concurred and told Loomis that he was “identified, through the Compas assessment, as an individual who is a high risk to the community”.

The Wisconsin Supreme Court ruled against Loomis, noting that the Compas report brought valuable information to the decision, but qualified this by saying he would have received the same sentence without it. But how can we know that for sure? What sort of cognitive biases are involved when an all-powerful “smart” system like Compas suggests what a judge should do?

Unknown use

Now let’s be clear, there is nothing “illegal” about what the Wisconsin court did – it’s just a bad idea under the circumstances. Other courts are free to do the same.

Worryingly, we don’t actually know the extent to which AI and other algorithms are being used in sentencing. My own research indicates that several jurisdictions are “trialling” systems like Compas in closed trials, but that they cannot announce details of their partnerships or where and when they are being used. We also know that there are a number of AI startups that are competing to build similar systems.

However, the use of AI in law doesn’t start and end with sentencing, it starts at investigation. A system called VALCRI has already been developed to perform the labour-intensive aspects of a crime analyst’s job in mere seconds – wading through tonnes of data like texts, lab reports and police documents to highlight things that may warrant further investigation.

The UK’s West Midlands Police will be trialling VALCRI for the next three years using anonymised data – amounting to some 6.5m records. A similar trial is underway with the police in Antwerp, Belgium. However, past AI and deep learning projects involving massive data sets have been problematic.
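VALCRI’s internals are not public, so the following is only a deliberately naive Python sketch of the triage idea described above – surfacing records that share unusual terms – with invented documents and a toy scoring rule, not the actual system.

```python
import re
from collections import Counter

# Invented case files standing in for texts, lab reports and police documents.
documents = {
    "report_041": "White van seen near the depot at 2am, plate partially read",
    "lab_007": "Paint flakes match a white commercial van, late model",
    "report_102": "Routine patrol, nothing of note",
}

tokens = {name: set(re.findall(r"[a-z]+", text.lower()))
          for name, text in documents.items()}
term_freq = Counter(t for words in tokens.values() for t in words)

def score(name):
    # Terms appearing in some, but not all, documents are what link
    # otherwise separate records; ubiquitous words carry no signal here.
    return sum(1 for t in tokens[name] if 1 < term_freq[t] < len(documents))

for name in sorted(documents, key=score, reverse=True):
    print(name, score(name))  # the two "white van" records float to the top
```

A real system layers entity extraction, record linkage and visual analytics on top, but the core act – flagging connections a human analyst would take hours to find – is the same.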

Benefits for the few?

Technology has brought many benefits to the court room, ranging from photocopiers to DNA fingerprinting and sophisticated surveillance techniques. But that doesn’t mean any technology is an improvement.

Algorithms can be racist, too.
Vintage Tone/Shutterstock

While using AI in investigations and sentencing could potentially help save time and money, it raises some thorny issues. A report on Compas from ProPublica made clear that black defendants in Broward County, Florida “were far more likely than white defendants to be incorrectly judged to be at a higher rate of recidivism”. Recent work by Joanna Bryson, professor of computer science at the University of Bath, highlights that even the most “sophisticated” AIs can inherit the racial and gender biases of those who create them.
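The disparity ProPublica reported is easiest to see as a per-group false positive rate: among people who did not go on to reoffend, how many were nonetheless scored high-risk? The short Python sketch below makes that calculation concrete on invented records; the numbers are illustrative only.

```python
from collections import defaultdict

# Invented records: (group, scored_high_risk, actually_reoffended).
records = [
    ("A", True, False), ("A", True, False), ("A", False, False), ("A", True, True),
    ("B", False, False), ("B", False, False), ("B", True, False), ("B", True, True),
]

false_pos = defaultdict(int)  # non-reoffenders wrongly scored high-risk
total = defaultdict(int)      # all non-reoffenders in each group

for group, high_risk, reoffended in records:
    if not reoffended:
        total[group] += 1
        false_pos[group] += high_risk  # True counts as 1

for group in sorted(total):
    rate = false_pos[group] / total[group]
    print(f"group {group}: false positive rate {rate:.0%}")
# A tool can look accurate overall while its errors fall far more
# heavily on one group - which is the pattern ProPublica described.
```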

What’s more, what is the point of offloading decision making (at least in part) to an algorithm on matters that are uniquely human? Why do we go through the trouble of selecting juries composed of our peers? The standard in law has never been one of perfection, but rather the best that our abilities as mere humans allow us. We make mistakes but, over time, and with practice, we accumulate knowledge on how not to make them again – constantly refining the system.

What Compas, and systems like it, represent is the “black boxing” of the legal system. This must be resisted forcefully. Legal systems depend on continuity of information, transparency and ability to review. What we do not want as a society is a justice system that encourages a race to the bottom for AI startups to deliver products as quickly, cheaply and exclusively as possible. While some AI observers have seen this coming for years, it’s now here – and it’s a terrible idea.

An open-source, reviewable version of Compas would be an improvement. However, we must ensure that we first raise standards in the justice system before we begin offloading responsibility to algorithms. AI should not just be an excuse not to invest.

While there is a lot of money to be made in AI, there is also a lot of genuine opportunity. It can change a lot for the better if we get it right, and ensure that its benefits accrue for all and don’t just entrench power at the top of the pyramid.

I have no perfect solutions for all these problems right now. But I do know that when it comes to the role of AI in law, we must ask in which contexts it is being used, for what purposes and with what meaningful oversight. Until those questions can be answered with certainty, be very, very sceptical. Or at the very least know some very good lawyers.

Christopher Markou, PhD Candidate, Faculty of Law, University of Cambridge

This article was originally published on The Conversation. Read the original article.

How football clubs fail and succeed after reaching England’s Premier League

Rob Wilson, Sheffield Hallam University and Dan Plumley, Sheffield Hallam University

Football always divides opinion. As the latest English season draws to a close and the Football League playoffs take centre stage, there will be some who grumble about the format. They will say how “unfair” it is that a club can finish third in the league in the regular season, yet be denied promotion by a club that finished sixth after a late surge. Set that aside, though, and you are left with the pure drama. It is win or bust, and it prolongs the excitement of the regular season, giving more teams more to play for in a crescendo of late-season fixtures.

The playoffs concept was borrowed from US team sports, where this end-of-season competition is a regular feature, attracting huge media exposure and significant commercial interest. In England, for 30 years now, the playoffs have determined the final promotion spot within each division of the Football League. Four teams first try to reach the playoff final at Wembley stadium, then face a nerve-jangling 90 minutes or more to secure a step up the football pyramid.

The inspiration from US sports is important. Put aside the passion, excitement, disappointment and any sense of injustice for a moment. The playoffs can be of huge importance financially. A playoff victory can have the power to stabilise a club’s financial position, clear debts and allow significant investment in players. The pot of gold at the end of this rainbow has largely been filled with TV money. The most recent domestic deal was signed for £5.14 billion. Add in the international rights and this swells to £8.4 billion.

Lower down the leagues, the money on offer is not eye-watering. Our conservative estimates put the prize at around £500,000 for promotion from League Two to League One and around £7m for promotion from League One to the Championship. However, the prize on offer for promotion to the Premier League is staggering and has led to the Championship playoff final being labelled the “richest game in football” with a value of around £170m-£200m. Huddersfield, Reading, Fulham and Sheffield Wednesday are facing off for the jackpot this time around.

Revenue generator

The often-quoted £200m figure is a little misleading as it takes into account so-called parachute payments which only kick in if a club is relegated the following season. Clubs will receive a minimum uplift of £120m though, which can be triple or quadruple their turnover. In fact, the chart below shows that when Bournemouth was promoted in 2015, the club saw a six-fold increase in revenue, essentially driven by additional broadcasting fees.

When the prize is so very shiny, straining to reach for it presents a strategic dilemma for clubs. The boost to revenue from promotion can stabilise a club financially, just like it did for Blackpool in 2010, helping it to (theoretically) secure a long-term future. In Blackpool’s case, however, on-field performance was destabilised and supporters became disenfranchised. Seven years later, Blackpool now hope to be promoted back to League One this season, via the playoffs.

Promotion can also increase the level of expectation and create pressure to retain a position in the world’s richest league. The club can get excited and the board can sanction acquisitions that fall outside a reasonable budget and seriously threaten the short and even long-term financial future of the club. This recalls the experience at Queens Park Rangers, which somehow accumulated £143m of losses despite generating about £250m in revenue during their stay in the Premier League. QPR managed to spend a startling £285m on wages and £114m on player purchases, while their level of debt surged to a peak of £194m.

Prepare to fail

The third option is to rein in your ambition, develop a strategic plan, grow incrementally and accept that you may become a yo-yo club like Burnley, or survive by the skin of your teeth like Stoke City.

Either way, the club builds a longer-term future at the top table, which benefits everyone. Survival through this approach means that a club receives at least another £120m, so it can build still further and become a stable Premier League club. But even failing and being relegated means a club will still have money to spend: it receives a parachute payment (of another £45m or so) and spends a season in the Championship with turnover in excess of three times that of a standard team. This provides a significant competitive advantage over rivals, as Newcastle United showed this year – the Magpies spent big and gained promotion at the first attempt.
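A back-of-envelope calculation shows why the arithmetic is so lopsided. It uses the article’s round figures (a minimum £120m broadcast uplift and a £45m parachute payment) plus an assumed £25m baseline Championship turnover, which is an illustrative figure rather than a quoted one.

```python
# All amounts in millions of pounds; the baseline turnover is an assumption.
BASELINE = 25.0    # assumed ordinary Championship club turnover
UPLIFT = 120.0     # minimum Premier League broadcast uplift (article figure)
PARACHUTE = 45.0   # approximate first-year parachute payment (article figure)

def two_season_revenue(survives_first_season: bool) -> float:
    year1 = BASELINE + UPLIFT  # promoted club's first top-flight season
    year2 = (BASELINE + UPLIFT) if survives_first_season else (BASELINE + PARACHUTE)
    return year1 + year2

print(f"survive year one:   £{two_season_revenue(True):.0f}m over two seasons")
print(f"straight back down: £{two_season_revenue(False):.0f}m over two seasons")
print(f"ordinary rival:     £{2 * BASELINE:.0f}m over two seasons")
```

Even the club that goes straight back down banks roughly £215m against an ordinary rival’s £50m on these assumptions, which is why the playoff final carries such a staggering valuation.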

Ultimately, the direction of travel comes down to owner objectives, which can differ depending on their background and motivations. One thing is clear: spending beyond your means does not guarantee success.

The chart above allows us to examine a club’s transfer spending in the year following promotion. It is a confusing picture, but the red bars show those clubs which were relegated the following season, and demonstrate clearly that spending big is no guarantee of survival. This chart doesn’t show the starting point for each club in terms of player quality, but how you spend it is plainly crucial, and the chart shows too that you can survive without throwing the kitchen sink at player acquisitions.

There is broader evidence that the most successful clubs, with the most money, do tend to outperform, but the trade-off between financial and sporting performance is hazardous. Many clubs now choose to chase multiple and escalating objectives: recall the devastating failure at Leeds United in 2003, when creditors were owed almost £100m after the club chased the dream of playing in the Champions League. You chase that dream at your peril is the warning; plan carefully, and spend wisely is the advice to your board. Relegation doesn’t have to be a trapdoor, but promotion can be a trap.

Rob Wilson, Principal Lecturer in Sport Finance, Sheffield Hallam University and Dan Plumley, Senior Lecturer in Sport Business Management, Sheffield Hallam University

This article was originally published on The Conversation. Read the original article.