Archive for the ‘Uncategorized’ Category

Charlene Li & Josh Bernoff’s “Groundswell” – Ch. 7-12

May 6, 2009

            In the second half of Groundswell, Li & Bernoff continue explaining the rest of the strategies for groundswell engagement (energizing, supporting, and embracing), and finish off with a discussion of the effects of groundswell engagement on a company, how to implement it within a company, and the future of the groundswell.

            The third possible objective for groundswell engagement is energizing, which equates most closely with a traditional organization’s sales function but, again, can have a far greater impact than basic sales.  Here the authors discuss three major approaches along with examples of each – 1) tapping into customer enthusiasm with ratings and reviews (good for retailers and others with direct customers, like the example of eBags), 2) creating a community to energize your customers (best if your customers have a common passion for your product and an affinity for each other, as with Constant Contact), and 3) participating in and energizing [already-existing] communities of your brand enthusiasts (like Lego).  Energized customers can often become like “unpaid R&D partners,” but be prepared to keep this relationship a two-way one – they will expect to be listened to and responded to when appropriate.

            The fourth of the five groundswell engagement objectives is supporting, which corresponds directly to the normal customer support function – normally quite an expensive one, whose cost can be significantly reduced when carried out using the groundswell (since customers are often very willing and able to help other customers without any direct company intervention).  Three different specific approaches are highlighted in the book: 1) support forums (Dell, Linksys, CBS and its Jericho fans), 2) wikis (MIKE2.0 at Bearing Point, taxalmanac.com at Intuit, and ebaywiki.com at eBay), and 3) questions and answers (Naver and Yahoo!Answers).  The important tips mentioned for supporting successfully are: i) start small but plan for a larger presence, ii) reach out to your most active customers, iii) plan to drive traffic to your community (at least to get it started), and iv) build in a reputation system.  Again, be prepared for this to be a dialogue from which you can gain valuable insights for the company (at little to no cost).

            The last strategy discussed for groundswell engagement is embracing, similar to the typical corporate function of development, but accelerated by integrally involving your customers’ expertise in your development process (for both products and processes).  Three examples are recounted, including product development innovation by way of customer community involvement (Salesforce.com and Dell), process improvement from “crowdsourced” customer suggestions (Crédit Mutuel), and product improvements based on constant customer feedback (Loblaw grocery stores).  Embracing requires humility and transparency on the part of the implementing company but can reap significant rewards in return for the effort.

            Li and Bernoff go on to present case studies of Unilever and Dell to highlight the way that engaging with the groundswell, while often a slow, deliberate, and sometimes frustrating process which requires executive support and leadership, can help a company by making employees at all levels feel more engaged with their customer base.   Likewise, case studies on Best Buy, Avenue A/Razorfish, and Bell Canada were held up as examples of how engaging with the groundswell can be done internally to build much stronger employee relations in the same way that it can be implemented externally with customers.  They close with general tips on how broadly to engage with the groundswell: 

  • The groundswell is about person-to-person relationships, so be ready to engage with people.
  • Be a good listener.
  • Be patient.
  • Be opportunistic.
  • Be flexible.
  • Be collaborative.
  • Be humble. 

The lessons of Groundswell may be particularly beneficial in the current economic climate that businesses and consumers alike are facing.  The use of “social media marketing” is growing, and a recent article discussed its importance for retaining current customers at a time when finding new ones is exceedingly difficult.

Charlene Li & Josh Bernoff’s “Groundswell” – Ch. 1-6

May 6, 2009

            In the first half of Groundswell by Charlene Li and Josh Bernoff, the authors lay out the what and why of the groundswell, its component technologies, how to use it in a way geared towards your customer base based on the “Social Technographics Profile,” how to devise strategies for “tapping into” the groundswell based on the POST process (people, objectives, strategy, technology), and some of these strategies.

            They define the Groundswell as “a social trend in which people use technologies to get the things they need from each other, rather than from traditional institutions like corporations” and give numerous examples of the trend as well as past and present websites and applications that support it (from Napster, Rotten Tomatoes, BitTorrent, Linux, and Craigslist to Facebook, MySpace, YouTube, Digg, del.icio.us, Wikipedia, etc.).  They explain that the groundswell is happening now as a result of the coincidence of three factors (people’s inherent desire to connect, the economics involved in the traffic generated by those connections, and the technological [Web 2.0] advances that have enabled so many more connections now than ever before) and that organizations ignore the groundswell at their peril, since ignoring it is like ignoring the customers who are using it (perhaps to the detriment of the organization).

            The authors go on to describe the big principle for understanding the groundswell (“concentrate on the relationships, not the technologies”) as well as its main technologies.  These technologies are categorized as follows: 

  • People creating:  blogs, user-generated content, and podcasts
  • People connecting:  social networks and virtual worlds
  • People collaborating:  wikis and open source
  • People reacting to each other:  forums, ratings, and reviews
  • People organizing content:  tags
  • Accelerating consumption:  RSS and widgets 

They also list five key questions to ask in order to determine whether a new technology is likely to become a successful part of the groundswell: 

  • Does it enable people to connect with each other in new ways?
  • Is it effortless to sign up for?
  • Does it shift power from institutions to people?
  • Does the community generate enough content to sustain itself?
  • Is it an open platform that invites partnerships? 

Then they describe the six major categories of groundswell participants (in descending order of involvement: creators, critics, collectors, joiners, spectators, and inactives) as well as a variety of motivations for groundswell participation (maintaining friendships / social pressure, making new friends, affinity, paying it forward, altruism, prurient interest, creativity, and need for validation), stressing the importance of understanding what categories your customers fall into and which motivations drive their groundswell involvement.

            Li and Bernoff then discuss the steps necessary for a company or organization to properly engage with the groundswell, following the POST methodology:

 

  • People:  Who are your customers, and how do/will they engage with the groundswell?
  • Objectives:  What are your objectives for engaging with the groundswell (listening, talking, energizing, supporting, embracing)?
  • Strategy:  How do you want your customer relationships to change, and how will you measure success?
  • Technology:  Which groundswell technologies fit your strategy? 

These steps are followed by four general points to keep in mind for any groundswell approach: 

  • Create a plan that starts small but has room to grow.
  • Think through the consequences of your strategy.
  • Put somebody important in charge of it.
  • Use great care in selecting your technology and agency partners. 

            The first objective – listening – is compared with the standard corporate function of research, but is argued to be much more effective when done within the chatter of the groundswell (e.g., in blogs, forums, ratings sites, etc.).   Examples of the two main listening strategies – 1) setting up a private community (National Comprehensive Cancer Network) and 2) brand monitoring (BMW Mini) – are discussed.  Reasons for listening include: i) finding out what your customers think and say that your brand stands for, ii) understanding buzz trends, iii) saving research money while increasing its responsiveness, iv) finding sources of influence in your market, v) managing crises, and vi) generating new product and marketing ideas.  Results often include: a) a shift in the company power structure (towards the person with the data), b) a tendency to become addicted to short-term data and a need to balance it with long-term strategies, c) an inability to hide behind previously unquantified bad ideas, and d) a need to start responding to what you’re hearing.

            The second objective – talking – parallels marketing in the standard corporate world; it has the advantage of allowing you to converse more personally with your customers and to have them converse with each other, often at much lower cost (and the challenge of requiring engagement in a two-way dialogue with customers rather than one-way at them).  Again, major talking strategies (four) are listed along with examples and tips for success – 1) posting a viral video to address awareness issues (Blendtec, Tibco), 2) engaging in social networks and user-generated content sites to increase word-of-mouth (Ernst & Young recruiting, Adidas, Pizza Hut, Motorola), 3) joining the blogosphere (especially for companies with complexity problems, like HP and Emerson Process Management), and 4) creating a community (especially for companies with hard-to-access customers, like P&G’s feminine care group).

            While Groundswell does not hide its intention to promote Forrester Research, it does serve a very useful function for those (especially leaders and managers of businesses, not-for-profits, etc.) who are skeptical about the internet’s promise or even fearful of its implications for their particular business or interest.  The benefits of internet technology, Web 2.0, and the like often seem obvious from a consumer (i.e., “former audience”) perspective, but they can appear to come at the expense of the producer (i.e., the company or organization providing the product or service in question).  Groundswell shows that, on the contrary, businesses can truly seize on the new customer interactions made possible by the online world, not only in a separate additive way but also to boost already-existing relations in a multiplicative way.  Various blogs are starting to analyze the importance of this trend in social media marketing.

John Batelle’s “The Search” – Ch. 10-12, Eric Raymond’s “The Cathedral and the Bazaar,” and Tim O’Reilly’s “What is Web 2.0”

May 6, 2009

            The last three chapters of John Batelle’s “The Search” cover Google’s post-IPO phase and its development of a strategy to allow the company to grow from its then size of 3,000 employees to as many as ten times that.  The leadership structure – with CEO Schmidt “[making] the trains run,” Page and Brin in charge of vision and product development, and Intuit founder Bill Campbell providing informal coaching – has long been a controversial one in the eyes of Wall Street (which seeks an unambiguous leadership structure) and some employees (who feel it’s impossible for anything to get done without Page and Brin’s direct approval).

The company, its founders, and CEO, who had thrived in an informal environment with tight central control for so long, knew that they needed to change their style as the company grew exponentially larger, managing ever more employees, users, shareholders, partners, and competitors, and they hired an ex-McKinsey consultant to help them lead the charge in that regard.

            Batelle then launches into an analysis of the approach to search by Google and their main competitor, Yahoo.  He points out how Google, started as a technology-driven company focused on using mathematical algorithms to solve the internet search question, is much more reluctant to integrate commercial and editorial content into their search results, while Yahoo, started as a subjective collection of favorite links, openly touts the benefits of human involvement in tailoring search results to what are perceived as those most desired by their customers.  He argues that ultimately Google and Yahoo are both in what is primarily a media business and that search and commerce largely drive each other (since commerce involves people searching for products and services), implying that Google needs to become more comfortable with the commercial side of the search business.

            Batelle goes on to speculate on the future of Google’s innovation.  Based on conversations with CEO Schmidt, he believes the company is aiming to “provide a platform that mediates supply and demand for pretty much the entire world economy.”  In the pursuit of Google’s mission of “organizing the world’s information and making it universally accessible and useful,” he sees Google building on Microsoft’s accomplishment of “a computer on every desk,” the connection of every computer to every other computer by way of the internet, the proliferation of information (by way of GPS technology, RFID tags, UPC data, mobile text and video, etc.), and their (Google’s) own ability “to deliver hugely scaled services over the Web platform,” offering seemingly unlimited growth potential.

            The last piece of the puzzle will be a historical record of the internet, so that when web pages are modified or deleted, previous versions won’t be lost.

            In March, Tom Simpson discussed a current effort aimed directly at this theoretical goal of perfect search in his blog Theconvergenceofeverything.com – Dr. Stephen Wolfram’s Wolfram Alpha.  The system’s unveiling at Harvard last week was discussed in The Independent on May 3.

           Various fundamental premises underlying the wildly successful growth of Google, search, the internet and the web in general are highlighted in the articles “The Cathedral and the Bazaar” by Eric Raymond and “What is Web 2.0” by Tim O’Reilly.  Raymond discusses the lessons of Linux and similar open-source software development projects (the “bazaar”), which have shown that evolutionary programming, rapid prototyping (releasing early and often), delegating as much as possible, and being extremely open can actually work very effectively, as opposed to the more traditional hierarchical development model, with structured control of the process and planned release dates (the “cathedral”).  Many of these lessons have direct parallels to the successful growth of online media; for example:

  •  Every good work of software starts by scratching a developer’s personal itch.  /  To solve an interesting problem, start by finding a problem that is interesting to you.  Likewise, the best online applications usually address a direct need (e.g., Google itself).
  • Good programmers know what to write. Great ones know what to rewrite (and reuse).  /  Treating your users as co-developers is your least-hassle route to rapid code improvement and effective debugging.  /  Release early. Release often. And listen to your customers.  /  Given a large enough beta-tester and co-developer base, almost every problem will be characterized quickly and the fix obvious to someone.  /  If you treat your beta-testers as if they’re your most valuable resource, they will respond by becoming your most valuable resource.  /   The next best thing to having good ideas is recognizing good ideas from your users. Sometimes the latter is better.  Lots of parallels in APIs, the iterative nature of blogs, tweets, etc.
  • Often, the most striking and innovative solutions come from realizing that your concept of the problem was wrong.  Consider the demise of DEC as it pursued its hardware focus despite having the successful AltaVista search engine in the palm of its hand.
  • “Perfection (in design) is achieved not when there is nothing more to add, but rather when there is nothing more to take away.”  Like Google’s bare-bones search home page.
  • Any tool should be useful in the expected way, but a truly great tool lends itself to uses you never expected.  Again, APIs are a perfect example, like the Housingmaps.com mashup of Google and Craigslist.

 It’s interesting to note that parallels and analogies to Raymond’s analysis of the cathedral and the bazaar have been drawn by members of and commentators on various industries, from tech to healthcare to pharmaceuticals to the newspaper industry.

            O’Reilly’s article proposes 7 major principles that define Web 2.0, again with lots of parallels to the success of Google, search, and the broader internet and web: 

  • The web as platform.  E.g., Google is a service which exploits the web as its platform regardless of hardware, as opposed to Netscape, a product (a desktop application) which exploited the web too, but as a means rather than a platform.  Other examples given were Overture and AdSense vs. DoubleClick and BitTorrent vs. Akamai.
  • Harnessing collective intelligence.  E.g., hyperlinking, PageRank, Amazon reviews and suggestions, Wikipedia, tagging sites, blogs, RSS, etc.
  • Data is the next “Intel Inside.”  E.g., the valuable data sets owned by digital mapping companies, Amazon, etc.
  • End of the software release cycle.  Replaced by a need for constant iteration and a reliance on users as co-developers (as do Google, Flickr, etc.)
  • Lightweight programming models.  Syndication, “hackability,” and remixability over coordination and control (as used by Amazon, Google Maps, etc.).
  • Software above the level of a single device.  Much like iTunes and the iPod, mobile applications, etc.
  • Rich user experiences.  E.g., Gmail’s combination of e-mail and related applications based around database competencies, searchability, and universal accessibility. 

(These principles and the Web 2.0 concept have become ubiquitous – see the following Web 2.0 blog to see how the 7 principles can be restated in 50 different ways!)

            In short, we start to see a common theme emerging from multiple sources about what has brought Google and other successful web and new media companies the success they’ve had as well as what traits will help such companies to remain successful in the Web 2.0 era and beyond.  It all seems to come back to some of our most basic needs as human beings – to acquire knowledge, to master our domain, and to interconnect with our fellow man – and the companies that most effectively use new technology to do so will be the ultimate winners.

AdWords campaign results

May 6, 2009

My AdWords campaign ran the following ad:

     What Gnau?

     Thoughts on Media, Power

     and Politics

     whatgnau.wordpress.com

            As I suspected would be the case, the ad did not generate very much traffic, resulting in the following recommendation from Google: “Your ads are showing too rarely to spend your full budget. You may need to add keywords or raise your click price limit.”

            There were some interesting trends in the responses, however.  From 2/25/09 through 5/4/09, the overwhelming majority of impressions were associated with the words “politics” (39,440) and, oddly, “what” (41,224).  Next came “blog” with 1/100th as many impressions (312), then another order of magnitude fewer impressions for “power” (33), “harvard kennedy school” and “media” (18 each), “nicco mele” (15), “kennedy school” (7), and “gnau” (4), with none for “kennedy school of government.”  There were also 3,737 impressions on Google partner sites.

            Only the top two categories for impressions converted to any clicks.  Here the distribution made sense contextually, however, as 40 clicks stemmed from the impressions connected to “politics,” while only 3 were related to the word “what” (with 2 clicks coming from impressions on partner sites).

            Also of interest, the clicks related to the word “what” and from the partner sites were much more expensive ($0.14 each) than those related to the word “politics” ($0.056 each).
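Converting these raw counts to clickthrough rates makes the contrast even clearer.  Here is a quick back-of-the-envelope calculation using the impression and click figures reported above (a simple illustration, nothing more):

```python
# Clickthrough rate (CTR) = clicks / impressions, using the reported figures.
keywords = {
    "politics": {"impressions": 39440, "clicks": 40},
    "what": {"impressions": 41224, "clicks": 3},
}

ctr = {word: s["clicks"] / s["impressions"] for word, s in keywords.items()}
for word, rate in ctr.items():
    print(f"{word}: CTR = {rate:.4%}")

# "politics" turned impressions into clicks roughly 14 times as often as
# "what" -- consistent with the cheaper cost per click it earned.
ratio = ctr["politics"] / ctr["what"]
```

Both CTRs are well under 1%, which fits Google’s warning that the ads were “showing too rarely to spend your full budget.”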

            The other trend of note was that the impressions were heavily concentrated within the first 10 days, plateaued at a lower level over the next month or so, and then dropped to negligible levels after that, reflecting that the algorithm for generating impressions seems to incorporate the low response rate over time, causing a downward spiral.

John Batelle’s “The Search” – Ch. 4-9

May 6, 2009

            Batelle spends the next six chapters covering Google from its inception (stemming from Larry Page’s interest in the linking structure of the Web and the difficulty of tracing links backwards – a problem solved with PageRank, which ranks search results based on the number of links into a site as well as the number of links into the linking sites, based on the academic community’s concepts of peer review, citation, annotation, and rank) to its public offering and the dizzying growth process along the way.  Google outdid numerous competitors (in a field where people were determining on the fly what defined success) by being the first company to capture all three necessary ingredients “to profit from search and control its own destiny”: i) high-quality organic search results, ii) a paid search network (originally anathema to Google’s founders), and iii) its own traffic.
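The PageRank idea Batelle describes – a page’s rank derives from the number of links pointing to it, weighted by the rank of the linking pages themselves – can be sketched in a few lines of power iteration.  (This is a simplified illustration, not Google’s actual implementation; the 0.85 damping factor is the value used in Page and Brin’s original paper.)

```python
# Simplified PageRank via power iteration.
# `links` maps each page to the pages it links out to.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with uniform rank
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            # Each page passes its importance on, split among its outlinks.
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# A page with more (and more important) inbound links ranks higher:
# "b" is linked from both "a" and "c", so it ends up on top.
web = {"a": ["b"], "b": ["c"], "c": ["a", "b"]}
ranks = pagerank(web)
```

The “links into the linking sites” part of the summary is exactly what the iteration captures: a page’s rank depends recursively on the rank of whoever links to it.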

            Despite Google’s long-time aversion to advertising, it felt compelled to compromise, initially offering text-only ads (as opposed to large banner ads) on a cost-per-thousand (CPM) or “impressions” basis, based on the searcher’s keywords.  As that approach proved to be unsustainable and the economy soured, they switched to a “self-service” model called “AdWords,” where advertisers could pay for keyword-based ads directly online, and where they were charged based on a more successful cost-per-click (CPC) approach (as GoTo.com had been using).

            As their growth continued, especially following their replacement of Inktomi as Yahoo’s search engine, they wrestled with their approach to marketing.  Constrained by their budget, with lots of buzz being generated (thanks to the effectiveness of their search), they opted for an ultimately successful PR-centric approach over expensive paid media.  (Besides quality search results, they also offered a reliable system that seemed to tolerate any level of customer demand.  This was due to their distributed computing platform, a “massively parallel formation of cheap processing and storage” whose proprietary design became one of Google’s main assets.)

            2001 was a year of rapid growth en route to a turning point for the company in 2002.  There was the selection of a CEO (Eric Schmidt), a challenging task given the requirements of the position – strong engineering skills, a tolerance of Page and Brin’s strong control, and strong leadership and management abilities.  Then there was the need to cope with the company’s accelerating growth and maintain the focus that Page and Brin had always had.  They were acquiring companies (Blogger, Picasa, etc.) and adding more data, tools, and services (Google News, phone book information, image search, mobile applications, etc.).  They also developed a company motto (“Don’t be evil” – a motto which became harder to follow as the business grew and confronted social conflicts such as how to do business with China) and a mission statement (ambitiously, “to organize the world’s information and make it universally accessible and useful”) and trimmed middle management in order to maintain more direct control and involvement with their engineers’ initiatives.

            In 2002, Google updated AdWords to use auction and pay-per-click features similar to Overture’s more successful approach, but with a twist: the ad’s popularity (clickthrough rate) was included in determining its overall ranking (as opposed to simply ranking them by the amount each company pays per click).  Soon thereafter, they made a critically important deal with AOL, whereby AOL would use Google’s search as well as Google’s paid listings (essentially “syndicating” AdWords).  (Recently, it has become apparent that this deal did not work out as well as Google had hoped, as they have decided to seek the return of what’s left of their AOL investment, though some speculate that this may be a tactic to yield a renegotiation of the terms or simply an effort to break with AOL before any deals with Yahoo and/or Microsoft are made).
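The ranking twist described above – ordering ads by bid multiplied by clickthrough rate rather than by bid alone – can be illustrated with a toy example.  (The advertisers and numbers here are hypothetical, and the real formula has since grown to include additional quality factors.)

```python
# Rank ads by expected revenue per impression (bid x CTR), not by bid
# alone: a high bid on a rarely-clicked ad loses to a cheaper ad that
# users actually click.
ads = [
    {"advertiser": "A", "bid": 2.00, "ctr": 0.010},
    {"advertiser": "B", "bid": 0.80, "ctr": 0.040},
    {"advertiser": "C", "bid": 1.50, "ctr": 0.015},
]

def ad_rank(ad):
    return ad["bid"] * ad["ctr"]  # expected value per impression

ranked = sorted(ads, key=ad_rank, reverse=True)
order = [ad["advertiser"] for ad in ranked]
# B (0.80 x 0.040 = 0.032) outranks C (0.0225) and A (0.020)
# despite having the lowest bid.
```

The design choice matters because the search engine is paid per click: an ad nobody clicks earns nothing no matter how high the bid, so ranking by expected value aligns advertiser relevance with Google’s own revenue.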

           As Google’s success grew, so did a backlash against them and a fear that they were becoming the next Microsoft-like monopolistic behemoth.  But they pressed on into 2003, adding Froogle and an acquisition of Blogger and then a new chapter to their business and advertising model: AdSense, a service which placed contextually-relevant ads on any participating publishers’ websites, allowing them to monetize their web traffic and grow their advertiser network (considered to be their second-most important asset after their core infrastructure).

            Google constantly refines its search ranking methodology in an ongoing struggle to outwit unscrupulous search engine optimizers; the issue of how they do it gained closer attention with one of their search algorithm updates in late 2003.  Such changes can and do severely impact the livelihoods of small internet retailers affected by them, and Google was being no more open about the rationale behind these decisions than they were about their financials.  There was serious speculation that some of their tweaks were being made in order to prevent advertisers from gaining top search rankings based solely on relevance and to pressure them into paying for AdWords.  Nonetheless, their paid search has evolved into a very successful business model, shifting marketing focus from “content attachment” to a much more efficient and potentially profitable “intent attachment.”  Batelle goes on to discuss how search has impacted various businesses and industries, from music (e.g., Napster) to news (e.g., declining subscriptions as news becomes a searchable commodity that is more easily sharable and interactive online) to local search/point-of-purchase applications.  He also discusses hazards of the search world that Google has faced and continues to grapple with, from fair handling of trademarked terms and other search term disputes to dealing with click fraud to privacy concerns (e.g., contextual ads placed in Gmail e-mails) to doing business in China without violating the company motto. 

             2004 was a true turning point for Google as it finally went public, in typical (non-traditional) Google fashion, using a Dutch auction IPO format and a dual-class shareholding structure (like many family-owned media companies) that would maintain strong company control in the hands of Page and Brin.  Despite a series of missteps in the months leading up to it, the IPO was resoundingly successful.

            The story of Page and Brin grappling with the growth of Google seems like a microcosm of society as a whole grappling with the growth of the internet and related technology innovations since the 1990s.  The growth has had fantastic benefits for everyone, but it’s moving along so rapidly that it’s been hard to keep up and to manage its implications – good and bad.  For Page and Brin, their university project has turned into perhaps the fastest-growing business of all time, making them fabulously wealthy along the way; yet the growth has been so rapid that they’ve struggled to manage the complexity of the company, the competing demands of customers and advertisers, and the balance between loyalty to their principles and political-economic demands (e.g., doing business in China).  Many companies have tried and failed to grasp the internet wave (consider the DEC example mentioned in a previous chapter).  Consumers (Dan Gillmor’s “former audience”) have also benefited greatly through an exponential increase in access to information, yet have wrestled with how to handle its downside – privacy concerns, internet scams, online pornography and predators, etc.  The resolution to this story, both for Google and for society, is probably many years and chapters away.

“The Search” by John Batelle — Chapters 1-3

April 20, 2009

In The Search, John Batelle does a great job of putting the concept of search into perspective.  He explains it largely against the backdrop of Google, but emphasizes that it is a bigger and more important concept than any one company.  He explains the importance of search through his self-coined metaphor of the “Database of Intentions,” which he defines as “the aggregate results of every search ever entered, every result list ever tendered, and every path taken as a result.”

            Indeed, I think that the biggest thrill for everyone the first time they used the internet was the ability to search for any topic of interest and instantly find information about it that previously would have required a trip to the library, a call to a subject expert, or some other far more laborious, time-consuming, and potentially expensive method of inquiry.  Ordinary people might never otherwise have gained access to the endless sources of information that are now available with just a few keystrokes’ worth of effort.

            In introducing these ideas and the promise of search, Batelle also foreshadows some of search’s potential complications and downsides – namely, that the ability to track and record our personal “clickstreams,” which we once took for granted to be private and fleeting (along with the commercial incentive to do so, by providing desired products and services based on preferences inferred from those clickstreams), raises significant privacy concerns.  Interestingly, a company called Yauba is now marketing a privacy-sensitive search engine to people concerned with such privacy issues. 

            He then briefly surveys the basic mechanics of search – who does it; where; when; why (i.e., for both “recovery” of information we know and “discovery” of information that we do not); how it works; and how much money is at stake (which turns out to be quite substantial).  He observes that, although most of the money generated by search so far has revolved around “matching text ads with the intent of a search query,” other methods are being developed that could accelerate search-generated spending, from local search (which Google is now using even for generic [non-local] searches) to behavioral targeting to search personalization.

            Batelle recounts the development of the first effective search engine and the irony that it was developed by a hardware company (DEC) in an effort focused on selling more hardware, not on developing search capability.  Company management never recognized the larger potential of its AltaVista search engine, even as it became #1 in search in 1997 and was competing with AOL and Yahoo to be the most important Web destination.  Following DEC’s acquisition by Compaq, the focus shifted to portal capability (competing with Excite and Yahoo) rather than search capability.  A few ill-timed and aborted IPO attempts later, AltaVista missed its moment and was overshadowed by Google.

            Other companies made similarly notable contributions to the development of search, along with similar corporate missteps that led them off the path of search dominance – from Lycos (the first to use the number of links pointing to a website as a measure of its relevance, and to include not just links but also web page summaries in search results) to Excite (the first to offer web page personalization and customization along with free e-mail).
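The link-counting idea Lycos pioneered – treating pages that more other pages link to as more relevant – can be illustrated with a minimal sketch.  (The tiny three-page “web” and the function name here are invented for illustration; real engines did far more, and Google’s later PageRank weighted links rather than just counting them.)

```python
# Sketch of link-based relevance: among pages matching a query,
# rank the ones with more inbound links higher.

def rank_by_inlinks(pages, links, query):
    """Rank pages whose text mentions `query` by inbound-link count."""
    # Count inbound links for each known page.
    inlinks = {url: 0 for url in pages}
    for src, dst in links:
        if dst in inlinks:
            inlinks[dst] += 1
    # Keep only pages whose text mentions the query term,
    # then sort by inbound-link count, highest first.
    matches = [url for url, text in pages.items() if query in text.lower()]
    return sorted(matches, key=lambda url: inlinks[url], reverse=True)

pages = {
    "a.com": "search engines and relevance",
    "b.com": "search engine history",
    "c.com": "cooking recipes",
}
links = [("b.com", "a.com"), ("c.com", "a.com"), ("a.com", "b.com")]
print(rank_by_inlinks(pages, links, "search"))
# ['a.com', 'b.com'] – a.com ranks first with two inbound links
```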

            On the other hand, Yahoo gained its foothold with a hierarchical approach to organizing information on the web – sorting it into categories and subcategories, a passive approach that made sense before people were fluent with the web and came to it proactively seeking information.  Like many others, Yahoo focused its efforts on being a portal for web traffic, only gradually incorporating search capabilities, leaving the way clear for someone – ultimately Google – to come along and focus on search.

            Ironically, what all of these companies have done and continue to do is in itself a search – a search for how best to search.  Google and its competitors are trying to make our search for information better, and to do so they are trying to mimic the way we think.  Understanding how we think will allow them to anticipate how we search and what we’re searching for, which in the end is what the promise of the internet is all about.  This has been the internet’s appeal since the beginning – the ability to search, to find the information we need, when we need it, without ever leaving the comfort of our own home or office.

We the Media – Part II

February 25, 2009

            While I am hopeful about the power of the internet and its promise to level the playing field of knowledge, I am also a bit skeptical about it in some regards.  My skepticism lies in the tremendous amount of useless information sharing and time-wasting that blogs, messaging, and other such technologies have spawned.  (These, at least, are relatively benign negatives.  There are of course other more harmful and malicious by-products as well, from the prevalence of internet porn to the facilitation of child predation to the incredible breach of privacy that impacts pretty much everyone – internet users and non-users alike.  But that’s a whole other topic.)

            Of course the benefits of productive information sharing and of empowering citizen journalism probably far outweigh the costs, and I am encouraged to see the positive and productive effects that are possible.  Politics is an obvious example.  George Allen may not be as impressed by the power of internet technology as some others, but his opponents were certainly happy to capitalize on it after his “macaca” comment during a Senate campaign stop in August 2006, in what Rolling Stone called “the first YouTube election.”  His comments to a Webb staffer of Indian descent who was videotaping him were quickly posted on YouTube (http://www.youtube.com/watch?v=r90z0PMnKwI), and the topic spread like wildfire through the blog world (http://news.cnet.com/8301-10784_3-6107987-7.html, http://www.politicstv.com/blog/?p=1239, http://www.brendan-nyhan.com/blog/2006/08/new_postmacaca_.html), forcing an initially reluctant big media to pick up the story as well.  Thus ended Allen’s hopes not only for the Virginia Senate race, but for a possible presidential run in 2008.  On the flipside, of course, an understanding of new web and media technologies and the effective use of them were a key factor in Barack Obama’s success in his 2008 presidential run.  Though relatively unknown at the beginning of the race, in a field of heavy-hitter candidates – among whom Hillary Clinton was presumed to be the automatic front-runner – Obama managed to connect with voters, get out his message, and raise funds in a way that his opponents simply couldn’t match.

            Perhaps the most impressive grassroots effort I’m aware of that owes its success to the use of digital media was the “No Más FARC” movement in early 2008.  Four young Colombians came up with the idea to stage a protest march against the terrorist guerrilla group FARC to show, despite FARC claims of representing the people, that the people of Colombia do NOT support the FARC.  Within a month of creating their Facebook page for “Un Millón de Voces Contra Las FARC” (“A Million Voices Against FARC”), they had signed up hundreds of thousands of members around the world, leading to successful public rallies in cities across the planet.  Estimates of turnout ranged as high as 10,000,000 in Colombia alone (roughly a quarter of the country’s total population!).

            One of the more useful topics that Gillmor discusses in We the Media is the wiki phenomenon.  Of course most regular internet users are familiar with Wikipedia, but the WikiTravel site was a new one to me and one that I found to be surprisingly useful.  I checked its content on traveling to Colombia (where I lived and worked for two years and thus had some inside perspective) and found it to be much more useful and insightful than the average travel guide one might find in a bookstore.  It truly reflected the knowledge of people who are familiar with the country more deeply than simply as tourists.

            As for the issue of sifting through the glut of useless information on the web to get to what’s really valuable, I think the most promising potential lies in the refinement of tools like RSS, aggregators, and newsreaders, along with the development of the recommendation and reputation tools Gillmor discusses for judging the value of any particular piece of web content – tools that will help (impatient) people (like me) separate the wheat from the chaff.
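The core of what an aggregator or newsreader does – pull in a feed and filter it down to what the reader actually cares about – can be sketched in a few lines.  (The sample RSS snippet, the keywords, and the function name below are invented for illustration; real newsreaders layer recommendation and reputation signals on top of simple matching like this.)

```python
# Minimal sketch of aggregator-style filtering: parse an RSS feed
# and keep only the items whose titles match the reader's interests.
import xml.etree.ElementTree as ET

RSS = """<rss version="2.0"><channel>
  <item><title>New search engine launches</title></item>
  <item><title>Celebrity gossip roundup</title></item>
  <item><title>Guide to RSS readers</title></item>
</channel></rss>"""

def filter_items(rss_text, keywords):
    """Return item titles containing any of the given keywords."""
    root = ET.fromstring(rss_text)
    titles = [item.findtext("title", "") for item in root.iter("item")]
    return [t for t in titles
            if any(k.lower() in t.lower() for k in keywords)]

print(filter_items(RSS, ["search", "RSS"]))
# ['New search engine launches', 'Guide to RSS readers']
```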

“We The Media”

February 23, 2009

          Dan Gillmor’s “We the Media” attempts to describe broadly how technology is impacting journalism, not only for journalists themselves but also for the subjects they write about and for their “former audience,” which he says has now become part audience and part journalist.  He highlights the changes caused by the internet and all of its related technologies – mail lists, forums, blogs, wikis, SMS, RSS, P2P sharing, etc. – and discusses how the internet has broadened the traditional media communication modes of “one-to-many” and “one-to-one” to include “many-to-many” and “few-to-few.”

            Gillmor feels that the best way for companies to take advantage of the new technologies that some of them view as a threat to their established business model and way of life is to embrace them.  He discusses the effectiveness of companies that scan the blog world for criticisms but then capitalize on those criticisms by addressing them and responding to them, recognizing that they can improve as a company by incorporating the good ideas of their users, supporters, and even detractors.

            Likewise for anyone else in the public eye, not least politicians.  He shows how the use of the internet in politics, after some initial successes by John McCain in 2000 and various lower-level politicians, took off with the Howard Dean campaign in 2004, carrying a Washington outsider and virtual unknown (at least on the national stage) to what seemed certain to be his party’s nomination, before some political missteps stopped his momentum.  Gillmor accurately predicted that the internet would play a much more central role in the 2008 election cycle, though even he would probably have been surprised at just how big a role it actually played.

            I must say that I was hoping for a more conclusive “a-ha” kind of takeaway from the book and haven’t really found it, but that is not necessarily a criticism – I think the book simply presents the good and the bad of these technologies and leaves it to the reader to draw their own conclusions.  The author is clearly on the side of technology and the interests of the “former audience” over the interests of a handful of powerful media conglomerates, as, probably, are most people not inside the world of those media giants.

            While I certainly believe that the internet does redefine the nature of journalism and news, I think it is most definitely a supplement to the contributions of professional journalists and media organizations, not a replacement for them.  It will be difficult for individuals to muster the resources of a major media organization when it comes to covering big investigative stories (like Watergate, Iran-Contra, etc.), and reporting of this nature is a full-time occupation (i.e., one that requires a full-time commitment and probably a salary, unless there are a lot of independently wealthy bloggers out there willing to pick up the slack).  I think the key to media company success in this area will be finding the right mix of old-fashioned big media and cutting-edge interactive journalism.  It’s too early to tell what this mix will be, however, and most of the efforts so far seem like knee-jerk reactions to incorporate the ramblings of every Tom, Dick, and Harry with cell phone messaging capability or a Twitter account (case in point, http://twitter.com/ricksanchezcnn or http://www.cnn.com/ireport ).  These are just new-technology versions of man-on-the-street interviews, most of which are equally unenlightening.  This is not to say there’s no place for these contributions – they can be great for eyewitness reports of breaking news stories, for example – but to make them a routine and mandatory part of a news broadcast simply for the sake of paying homage to the medium is just excessive.