
June 30, 2013

Startups: Innovation Through Inferiority!

The primary assumption in the concept of the digital divide is that new technology is better. That’s a little obvious, but it’s also significant because the digital divide is (or at least was) one of the most important issues at the intersection of technology and inequality raised by mainstream progressive Democrats. Technology is good, say digital divide activists, but we just want to make sure that everyone can take advantage of the benefits!

Despite the libertarian-ish influence, Silicon Valley companies have never opposed this line of thinking. And why would they? It fits with their celebration of (and demand for) participation, and it encourages the purchase of their computers to send to inner-city classrooms and so on. And having institutions like the United Nations endorse your products by promoting them alongside similar efforts for clean drinking water and child nutrition is pretty good PR.

Some people a little further to the left of the UN also insist that we need to be pro-Technology. It’s hard to disagree with that in the abstract, but in our rush to embrace Technology, we might stop to ask whether these specific technologies are really so wonderful. What about the flying cars? The moon colonies? The jetpacks?

Maybe these are a bit fanciful, but even less ambitious technologies are apparently not sustainable. What about a decent RSS reader, email client, or any number of other modestly useful services that have been shut down because they lacked a viable monetization strategy? It has been at least 20 years since the first breathless predictions about the power of the internet to transform everything, and by now the prophesied future has arrived. So why has it fallen short of expectations? It could be that interesting ideas were tried but couldn’t find the much-sought product/market fit. A more interesting possibility is that, as a rule, Silicon Valley doesn’t deliver technological progress at all. Instead of giving us newer, better technologies, what if Silicon Valley is actually set up to destroy technologies and replace them with inferior implementations?

Clayton Christensen, a professor at Harvard Business School, wrote a series of books on innovation that are widely read and extremely influential in Silicon Valley. His central thesis is the concept of disruptive innovation. Under this theory, a new company competes with established incumbents by creating cheaper, inferior products for the low-end of the market, which can’t afford the existing offerings.

Incumbents don’t create disruptive innovations; they’re unattractive because they offer lower profits than high-end offerings. Instead, incumbents pursue sustaining innovations, which are what we normally think of when we talk about technological progress: products that are faster, better, more efficient, like the Toyota Prius. This opens up an opportunity for entrepreneurs to undermine incumbents from below: gain market traction with a cheap, low-quality product, then improve it just enough that it threatens the high-end market, possibly driving the incumbents out of business.

Christensen gives many examples, one of which is education. The incumbent is the traditional four-year college, modeled on the Ivy League and dedicated to research and teaching. The disruptive innovations threatening its business model (in this case, with successive degradations in quality) are, first, community colleges and for-profit colleges like the University of Phoenix, and now MOOCs.

What this means is that Silicon Valley venture-backed startups generally make worse, not better, products. Their main advantage is that they are cheaper and therefore more accessible to the average person, which enables populist-sounding marketing. So Airbnb is cheaper but worse than a hotel; blogs are cheaper but worse than newspapers; and user-generated content in general is free but worse than professionally produced content.

And MOOCs are cheaper but worse than college lectures. The purportedly left-leaning Center for American Progress published a report in which Christensen, with a stunning series of euphemisms, advocates lowering the quality of education to make it more accessible:

Our country’s dominant higher education policies have focused on expanding access for more than half a century—allowing more students to afford higher education. Yet changing circumstances mandate that we shift the focus of higher education policy away from how to enable more students to afford higher education to how we can make a quality postsecondary education affordable. The challenge before the country also mandates a new definition of quality from the perspective of students—so that the education is valuable to them and that through it they improve their lives and thus improve the country’s fortunes, too. And if a postsecondary education is fundamentally affordable—meaning lower in cost, not just price—this will also answer the question of how to extend access by enabling students to afford a higher education.

The ‘changing circumstances’ are the rise of neoliberalism and the demand of the wealthy for a more unequal society. The ‘new definition of quality’ is simply lower-quality education tailored more closely to the needs of employers, in which students are not educated in the traditional sense but trained to ‘hit the ground running’ once they graduate. The format shifts from a classroom to a perverse form of internship in which students pay employers to work1. And the phrase ‘lower in cost, not just price’ might well mean ‘lower in cost, but not in price’: the University of Phoenix offers online degrees for $14,000 a year, roughly 75% more than tuition at most state universities (which implies state tuition of around $8,000), for an inferior product. But the cost is lower, making it a “disruptive innovation”.

Christensen often points out that the innovation in disruptive innovation isn’t the technology. In most cases, the products are novel uses of off-the-shelf components, not the fruit of sustained R&D investment. The innovations are in business models, and Silicon Valley’s disruptions often achieve their rock-bottom prices by lowering quality and eliminating labor costs, as in MOOCs and user-generated content businesses. We can hope that the challengers will grow larger than the incumbents and create more jobs than they destroy, but this is not guaranteed. When administrative assistants were disrupted by inflexible, complicated groupware suites, personal information managers, and endless email, the job losses were probably not offset by an equivalent number of new jobs creating and administering the automated systems.

Returning to the opening point about the digital divide, it turns out that a lot of well-meaning activism that wants the “less fortunate” to have access to what are assumed to be new, advanced technologies is inadvertently creating an aura of social benefit for possibly job-destroying business-model innovations that produce inferior products.

MOOCs are a development that clearly turns the logic of the digital divide on its head. In this case, the poor will be stuck with the high-tech innovations, while the upper middle class and the rich will continue to enjoy the older form of in-person education.

Another example is smartphones, which are generally regarded as an advance over desktop computers. Micah Sifry, co-founder of the Personal Democracy Forum, celebrates smartphones for supposedly enabling “a kind of ‘leapfrog effect’ over the digital divide”, because minorities in the United States are disproportionately represented among those who primarily access the internet on their smartphones.

Sifry has somehow missed that smartphones are inferior to desktops on almost every dimension: processing speed, storage capacity, memory, graphics processing power, screen size, and so on. Low-income smartphone owners probably subscribe to the cheapest data plans and may not have broadband at home, so their access to any kind of video or even audio content may be quite limited.

Finally, mobile app design often begins with the assumption that users will have a desktop, and possibly other devices like a tablet as well. Google recently published a report entitled The New Multi-Screen World, claiming that consumers often use smartphones, tablets, desktop computers, and TVs sequentially (moving between devices to complete a task) or simultaneously (completing a single task across multiple devices at once).

The study asserts that “we are a nation of multi-screeners”, which may well be true for the majority of middle-class consumers. But it leads mobile app designers to drop less-used information and features, on the assumption that a desktop or laptop computer will be available if users really want to access them. Interaction designer Karen McGrane says:

One of the most persistent misconceptions about mobile devices is that it’s okay if they offer only a paltry subset of the content available on the desktop. Decision-makers argue that users only need quick, task-focused tools on their mobile devices, because the desktop will always be the preferred choice for more in-depth, information-seeking research.

Sifry’s belief that just having a smartphone means you’ve leapfrogged over everyone else starts to look very shaky here. Many of those users can’t reach certain content and features at all, because mobile websites are built for middle-class users in multi-screen contexts. For anyone working on digital divide issues, and for government agencies with legal or organizational mandates to make information accessible, these are serious problems.
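To make that concrete, here is a minimal, hypothetical sketch (not taken from McGrane or the Google report; the element IDs and the 600px breakpoint are invented) of the kind of logic a site built on the multi-screen assumption might use to strip features from phone-sized screens:

```typescript
// Hypothetical sketch: features deemed "secondary" are simply removed on
// small screens, on the assumption that users also own a desktop.
// The element IDs and the 600px breakpoint are invented for illustration.
const SECONDARY_FEATURES = ["full-archive", "advanced-search", "document-upload"];

function applyMobileCuts(isPhone: boolean): void {
  for (const id of SECONDARY_FEATURES) {
    const el = document.getElementById(id);
    if (el !== null) {
      // On a phone the feature disappears entirely; a smartphone-only
      // user has no other device on which to find it.
      el.style.display = isPhone ? "none" : "";
    }
  }
}

// matchMedia is the standard browser API for evaluating a CSS media query in script.
const phoneQuery = window.matchMedia("(max-width: 600px)");
applyMobileCuts(phoneQuery.matches);
phoneQuery.addEventListener("change", (event) => applyMobileCuts(event.matches));
```

For the multi-screen user this reads as sensible decluttering; for the smartphone-only user, the hidden content isn’t waiting on another device: it is simply gone.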

Novelty and technological development are conflated far too often, so that anyone who finds fault with the products of Silicon Valley startups must be against progress in general, and this has some quite negative effects. First, and most obviously, we are left unable to criticize these technologies and the ways they change society for fear of looking like Luddites, celebrating potentially harmful things just because they are new and assumed to be better. Second, if we want to expand access to some technologies (like, say, education), maybe it would be a better idea to just give people more money than to “innovate” by creating cheap garbage they can afford. And finally, we might be missing out on many of the benefits of information technologies because we are financially and emotionally overinvested in startups that act primarily as disrupters and destroyers of other kinds of technologies. Perhaps with the right kind of investment in R&D by governments and established companies, more sustaining innovations could be built.


  1. Even in some traditional four-year universities, departments have corporate affiliate programs through which industry representatives can influence course curricula. These classes may be taught by adjunct professors who work either directly for an affiliate or in the same industry. The class format is built around projects, and professors connect students with their industry contacts, who provide the project work and largely guide student progress. The final is a presentation before a group of judges from the industry, who influence the grade. This is a fully corporate-controlled classroom in which a group of 40 workers collectively pays the university over $120,000 for the right to work on short, semester-long contracts.