The problems that come with scaling online platforms.
Network effects are the most exciting aspect of Platform Thinking, an approach that views an online business as being composed of two elements: the platform itself and the value created on it. YouTube provides the platform; users create the value (videos) on it. Kickstarter provides the platform; some users create projects and other users create value by funding those projects.
Most online platforms have very little or no value of their own. The value is created by users, and as more users join in, more value is created, which over time sets up a positive feedback loop. Hence, the more an online platform scales, the more valuable it becomes. The general belief is that this Network Effect continues to propel the platform's growth.
However, as I’ve written earlier, Reverse Network Effects may sometimes set in with scale, i.e. online networks may become less useful as they scale. This is not to say that all online platforms lose value as they grow; rather, in the absence of robust curation, they may.
Under what conditions do online platforms lose value as they scale?
Since the participants on an online platform create value, an online platform loses value with scale when the participants it allows in, or the information and value they create, are not curated appropriately. Poor curation leads to greater noise, which makes the platform less useful.
Let’s look at a few factors that increase noise and drive down the value of online platforms as they scale.
Every online platform is as valuable as the participants it connects. Quora, a popular Q&A site, found rapid adoption in Silicon Valley as it connected highly successful early tech adopters who were experts in their fields. Quora’s strong curation mechanism also ensures that the best answers are invariably showcased.
The Quora community has created a deep repository of knowledge, thanks to these experts. However, as Quora scales, many worry that less sophisticated users entering the system may increase noise, leading to a rapid depletion of value for existing users.
This starts a reverse feedback loop: current experts start abandoning the system owing to the poor quality, which leads to further loss of quality, which in turn leads to more experts leaving. Once a loop like that is set in motion, the quality of interactions and of the content created can witness an exponential drop.
We’ve seen this reverse feedback loop play out in the case of ChatRoulette, a video-chat network that connects you with anyone across the world at random. Since ChatRoulette had absolutely no checks and balances to screen users, it ended up with The Naked Hairy Men Problem. As the network grew, unpoliced, an increasing number of naked hairy men joined in, leading to an exodus of other users. As legitimate users fled, the relative noise on the platform increased further, feeding a loop that saw the site lose traction at nearly the skyrocketing pace at which it had gained it.
Solution: There are two solutions: either choose who gets access to the platform (curation of access) or scale the system’s ability to curate content as it grows larger (curation of contributions). The former is easier to implement. Quibb, in fact, has built a very high-signal community through manual curation of access. Dating sites like CupidCurated do this too, by curating the men who get access to the site. Platforms like Quora, which do not curate access, need extremely sophisticated curation of contributions to scale well without setting the reverse feedback loop in motion.
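The distinction between the two strategies can be sketched in a few lines of code. This is an illustrative reduction, not any platform’s real algorithm; the domain allowlist and flag threshold are invented for the example:

```python
# Illustrative sketch of the two curation strategies: a one-time gate at
# the door vs. per-item scoring of everything contributed.

def curate_access(applicant, approved_domains):
    """Curation of access: admit a participant only if they pass a gate
    checked once, on entry (here, a hypothetical approved-domain list)."""
    return applicant["email"].split("@")[-1] in approved_domains

def curate_contribution(upvotes, downvotes, flags, flag_limit=3):
    """Curation of contributions: every piece of content is scored after
    the fact; heavily flagged or net-negative items are suppressed."""
    if flags >= flag_limit:
        return "hidden"
    return "shown" if upvotes - downvotes >= 0 else "buried"

print(curate_access({"email": "ada@example.com"}, {"example.com"}))  # True
print(curate_contribution(upvotes=10, downvotes=2, flags=0))         # shown
```

The asymmetry the paragraph describes is visible here: access curation is a single cheap check per participant, while contribution curation must run on every item and therefore has to scale with the volume of content, not just the number of users.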
Wikipedia demonstrates that any online platform is open to abuse. Incorrect Wikipedia articles demonstrate the vulnerability of a user-created platform as much as the volume of correct ones demonstrates its strength.
The problem of incorrect articles (noise) grows as networks scale, because policing these platforms becomes more complicated with scale. In a world of community-created knowledge, who gets access to the community ultimately shapes the knowledge that is created.
Solution: Few systems have succeeded in scaling quality; Wikipedia is a rare example. Monitoring and user privileges were scaled slowly at Wikipedia, ensuring that moderators have a track record of desirable behavior. However, few have replicated Wikipedia’s success, which shows how difficult such systems are to scale.
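“Scaling privileges slowly” can be made concrete with a small sketch: moderation rights unlock only after a demonstrated track record. The tiers and thresholds below are invented for illustration and are not Wikipedia’s actual rules:

```python
# Hedged sketch: privilege tiers gated on volume of contributions and on
# the share of them the community rejected (reverted). Thresholds are
# assumptions for the example, not any platform's real policy.

def privilege_level(edits, reverted):
    """Return the highest privilege earned, based on total edits and the
    fraction of them that were reverted by others."""
    revert_rate = reverted / edits if edits else 1.0
    if edits >= 500 and revert_rate < 0.02:
        return "moderator"      # may police others' contributions
    if edits >= 50 and revert_rate < 0.10:
        return "trusted"        # edits go live without review
    return "newcomer"           # edits queued for review

print(privilege_level(edits=600, reverted=5))   # moderator
print(privilege_level(edits=60, reverted=3))    # trusted
print(privilege_level(edits=5, reverted=0))     # newcomer
```

The design point is that authority is earned from behavior the system has already observed, so the pool of moderators grows with the pool of good contributors rather than being fixed.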
When exposed to a lot of information, we are likely to read what we agree with. Online systems use filters to personalize the information served to each participant. These filters are often created based on the participant’s past behavior. Over time, this personalization can lead to inadvertent reinforcement of what we already believe in.
YouTube, for example, serves us videos based on what we’ve viewed in the past. Facebook’s news feed works on similar parameters.
As a system scales, this over-personalization can lead to a constant firehose of information catered to what we already believe, not what we need. It can prevent those seeking a solution from being served one that is radically different (and effective), and may over-serve obvious solutions.
Solution: The solution is technological and requires constant tweaking of the algorithms that match information to participants, to prevent the formation of an echo chamber.
Another problem that stems from reinforcement is the Hive mind. If certain forms of behavior are encouraged on a platform during the early days and others are discouraged, the platform runs the risk of developing a Hive mind as the network scales, where certain behaviors get reinforced and established as the desirable ones. Reddit is an online network whose community is often criticized for having a Hive mind.
This can lead to an online community turning inward and insular (and, hence, of lower overall value), failing to incorporate the value that diverse participants bring.
Solution: Curation of online behavior is most important during the early days of the community. Under-curation leads to noise, while over-curation leads to selection bias and, eventually, a Hive mind. Curation needs to be appropriately balanced.
On the internet, value is often conferred by the community. The best answer to a question on Quora, for example, is decided by the community through upvotes and downvotes. Value is dynamic and constantly evolving, best exemplified by a Wikipedia article, which is in constant flux.
For all its advantages, this dynamic, community-shaped creation of value is also open to the inadvertent acceptance of error. If enough participants accept something as true, it becomes the new truth, even if it isn’t. The answer that bubbles to the top and the latest version of an article are decided by the community, and are only as good as the community itself.
Solution: This problem is mitigated by curating the community, i.e. policing who joins the network. Some dating sites curate the men joining the network to mitigate the common problem of women being stalked. Platforms like Wikipedia confer greater authority and curation power on power users. Hence, curation at the point of access may be required for some systems.
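Conferring greater curation power on power users can be sketched as reputation-weighted voting: a new account’s vote moves an answer less than an established contributor’s. The weights below are invented for illustration:

```python
# Sketch of reputation-weighted community voting (assumed weights; not
# Quora's or Wikipedia's actual mechanism).

def answer_score(votes, reputation):
    """votes: list of (user, +1 or -1) pairs.
    reputation: mapping of user -> standing; unknown users default to 1.
    Each vote counts in proportion to the voter's reputation."""
    return sum(direction * reputation.get(user, 1)
               for user, direction in votes)

reputation = {"veteran": 10, "regular": 3}
votes = [("veteran", +1), ("regular", +1), ("new1", -1), ("new2", -1)]
print(answer_score(votes, reputation))  # 10 + 3 - 1 - 1 = 11
```

Under flat one-user-one-vote counting, this answer would score 0; weighting by standing lets a small number of trusted users outvote a brigade of new accounts, which is one way a platform can resist the “enough participants accept it, so it becomes true” failure mode.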
Consider an online platform that enables the sharing of knowledge globally, connecting those looking for an answer with those who have it. The best contributions don’t always come from existing experts, nor do existing experts always understand the context of needs in remote areas. Hence, micro-experts are needed to deal with the long tail of problems.
The creation of new niche experts requires a curation model that effectively separates the best from the rest. Traditionally, experts have been anointed on the basis of achievements or affiliations with trusted bodies. Recreating that trust on an online platform is essential if it is to create new experts.
This curation of micro-experts is non-trivial. Not only are they more numerous than any team of traditional experts, they must also be curated by the community itself for the model to scale. Quora, for example, creates new experts largely by relying on community voting.
As the network scales, it often finds it increasingly difficult to identify new experts, as community sentiment tends to be biased towards early participants. Early users on Quora and Twitter tend to have orders of magnitude more followers than those who joined late, not only because they have had more time, but also because attention compounds: highly followed users are more visible, and so attract new followers faster.
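One hedged way to illustrate correcting for this early-participant bias (an example of my own, not a method the platforms are known to use) is to compare participants by follower growth per day on the platform rather than by raw totals. All numbers and dates below are invented:

```python
# Sketch: tenure-normalized follower counts. A recent joiner with fewer
# raw followers can still be growing faster than an early adopter.
from datetime import date

def followers_per_day(followers, joined, today=date(2014, 1, 1)):
    """Average follower growth per day since joining (invented data)."""
    days = max(1, (today - joined).days)
    return followers / days

early = followers_per_day(50_000, date(2010, 1, 1))   # early adopter
late = followers_per_day(4_000, date(2013, 10, 1))    # recent joiner
print(round(early, 1), round(late, 1))
```

Here the early adopter holds more than ten times the followers, yet the recent joiner is accumulating them faster per day, which is the kind of signal a scaling network would need in order to keep surfacing new experts.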
The community’s power to curate depends on two aspects: how participant authority is built, and how well that authority model scales.
Every platform has its own way of building authority and/or trust. eBay and Airbnb do it through ratings, Wikipedia through edit wars, Quora through votes. A network needs a robust model for building participant authority to ensure that the right opinions are surfaced for consumption.
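A common technique for rating-based trust (in the spirit of marketplace ratings, though not eBay’s or Airbnb’s actual formula) is a Bayesian average: a seller with few ratings is pulled toward a neutral prior, so a single perfect review doesn’t outrank a long, consistent track record. The prior values here are assumptions:

```python
# Sketch: Bayesian-average trust score over 1-5 star ratings. The prior
# mean and weight are invented parameters for illustration.

def trust_score(ratings, prior_mean=3.0, prior_weight=10):
    """Average of the ratings, blended with `prior_weight` phantom
    ratings at `prior_mean`; converges to the true mean as data grows."""
    total = sum(ratings) + prior_mean * prior_weight
    count = len(ratings) + prior_weight
    return total / count

print(round(trust_score([5]), 2))                # one perfect review: 3.18
print(round(trust_score([5, 5, 4, 5] * 25), 2))  # 100 ratings: 4.59
```

This is one answer to the corner-case problem: with no ratings at all, the score is simply the prior, rather than undefined or gameable by a single vote.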
However, as a network scales, its trust and authority systems become more difficult to scale as well, and the corner cases become much harder to identify.
The systems that survive are the ones that scale. For every Reddit and Quora out there, there are a thousand attempts that gained traction but failed to scale because they failed at curation.
For all its efforts at scaling, Wikipedia successfully controls the quality of only the top 20% of articles, which account for 80% of views. As any platform scales, curation methods tend to work very effectively for the ‘head’ but not for the long tail of user contributions, which runs the risk of long-tail abuse. While it can be argued that the majority isn’t affected by such abuse, the minority that is affected grows as the network scales and as the curation problem itself gets exacerbated.
In summary, appropriate quality controls are required to control production and appropriate filters are required to control consumption. And both these components need to scale as the network scales.
Lecture at the MIT Media Lab discussing platform economics and growth.