Case Study Questions

1. Should managers monitor employee e-mail and Internet usage? Why or why not?

2. Describe an effective e-mail and Web use policy for a company.

3. Should managers inform employees that their Web behavior is being monitored? Or should managers monitor secretly? Why or why not?

READING:

Interactive Session: Organizations
The Battle Over Net Neutrality

What kind of Internet user are you? Do you primarily use the Net to do a little e-mail and online banking? Or are you online all day, watching YouTube videos, downloading music files, or playing online games? Do you use your iPhone to stream TV shows and movies on a regular basis? If you're a power Internet or smartphone user, you are consuming a great deal of bandwidth. Could hundreds of millions of people like you start to slow the Internet down?

Video streaming on Netflix accounts for 32 percent of all bandwidth use in the United States, and Google's YouTube for 19 percent of Web traffic at peak hours. If user demand overwhelms network capacity, the Internet might not come to a screeching halt, but users could face sluggish download speeds and video transmission. Heavy use of iPhones in urban areas such as New York and San Francisco has already degraded service on the AT&T wireless network. AT&T reported that 3 percent of its subscriber base accounted for 40 percent of its data traffic.

Internet service providers (ISPs) assert that network congestion is a serious problem and that expanding their networks would require passing on burdensome costs to consumers. These companies believe differential pricing methods, which include data caps and metered use (charging based on the amount of bandwidth consumed), are the fairest way to finance necessary investments in their network infrastructures. But metering Internet use is not widely accepted, because of an ongoing debate about net neutrality.

Net neutrality is the idea that Internet service providers must allow customers equal access to content and applications, regardless of the source or nature of the content. Presently, the Internet is neutral: all Internet traffic is treated equally on a first-come, first-served basis by Internet backbone owners. However, this arrangement prevents telecommunications and cable companies from charging differentiated prices based on the amount of bandwidth consumed by the content being delivered over the Internet.

The strange alliance of net neutrality advocates includes MoveOn.org; the Christian Coalition; the American Library Association; data-intensive Web businesses such as Netflix, Amazon, and Google; major consumer groups; and a host of bloggers and small businesses. Net neutrality advocates argue that differentiated pricing would impose heavy costs on heavy bandwidth users such as YouTube, Skype, and other innovative services, preventing high-bandwidth startup companies from gaining traction. Net neutrality supporters also argue that without net neutrality, ISPs that are also cable companies, such as Comcast, might block online streaming video from Netflix or Hulu in order to force customers to use the cable company's on-demand movie rental services.

Network owners believe regulation to enforce net neutrality will impede U.S. competitiveness by discouraging capital expenditure for new networks and curbing their networks' ability to cope with the exploding demand for Internet and wireless traffic. U.S. Internet service lags behind many other nations in overall speed, cost, and quality of service, adding credibility to this argument. And with enough options for Internet access, dissatisfied consumers could simply switch to providers who enforce net neutrality and allow unlimited Internet use. The wireless industry had been largely exempted from net neutrality rules, because the government determined it was a less mature network and companies should be allowed more freedom to manage traffic.
Wireless providers already have tiered plans that charge heavy bandwidth users larger service fees.

A December 2012 report by the New America Foundation (NAF), a nonprofit, nonpartisan public policy institute, disputes these claims. Like personal computers, the routers and switches in wired broadband networks have vastly expanded in processing capacity while declining in price. Although total U.S. Internet data consumption rose 120 percent in 2012, the cost to transport the data decreased at a faster pace. The net cost to carriers was at worst flat and for the most part down. The NAF report further asserts that lack of competition has enabled wired broadband carriers to charge higher rates, institute data caps, and spend less on the capital expenditures needed to upgrade and maintain their networks than they have in the past.

The courts have maintained that the Federal Communications Commission (FCC) has no authority to dictate how the Internet operates. The Telecommunications Act of 1996 forbids the agency from managing the Internet as a "common carrier," the regulatory approach the commission took toward telephones, and the FCC itself decided not to classify broadband as a telecommunications service. On January 14, 2014, the U.S. Court of Appeals for the District of Columbia struck down the FCC's "Open Internet" rules, which required equal treatment of Internet traffic and prevented broadband providers from blocking traffic, favoring certain sites, or charging special fees to the companies that account for the most traffic. The court said the FCC had saddled broadband providers with the same sorts of obligations as traditional "common carrier" telecommunications services, such as landline phone systems, even though the commission had explicitly decided not to classify broadband as a telecommunications service.

On April 24, 2014, the FCC announced that it would propose new rules allowing companies like Disney, Google, or Netflix to pay Internet service providers like Comcast and Verizon for special, faster lanes to send video and other content to their customers. Broadband providers would have to disclose how they treat all Internet traffic and on what terms they offer more rapid lanes, and would be required to act in a "commercially reasonable manner." Providers would not be allowed to block Web sites. The proposed rules would also require Internet service providers to disclose whether, in assigning faster lanes, they had favored their affiliated companies that provide content. Nevertheless, the FCC continues to push for an open Internet. On April 30, 2014, FCC chairman Tom Wheeler announced that lack of competition has hurt consumers and that the FCC planned to write tough new rules to enforce net neutrality.

Sources: "Should the U.S. Regulate Broadband Internet Access as a Utility?" Wall Street Journal, May 11, 2014; Edward Wyatt, "Stern Talk From Chief of F.C.C. on Open Net," New York Times, April 30, 2014, and "F.C.C., in a Shift, Backs Fast Lane for Web Traffic," New York Times, April 24, 2014; Amol Sharma, "Netflix, YouTube Could Feel Effects of 'Open Internet' Ruling," Wall Street Journal, January 14, 2014; Gautham Nagesh, "FCC to Propose New 'Net Neutrality' Rules," Wall Street Journal, April 23, 2014; Shira Ovide, "Moving Beyond the Net Neutrality Debate," Wall Street Journal, January 14, 2014; Gautham Nagesh and Amol Sharma, "Court Tosses Rules of Road for Internet," Wall Street Journal, January 4, 2014; Alina Selyukh, "U.S. Court to Hear Oral Arguments in Net Neutrality Case on September 9," Reuters, June 25, 2013; and Hibah Hussain, Danielle Kehl, Benjamin Lennett, and Patrick Lucey, "Capping the Nation's Broadband Future? Dwindling Competition Is Fueling the Rise of Increasingly Costly and Restrictive Internet Usage Caps," New America Foundation, December 17, 2012.

Case Study Questions
1. What is net neutrality? Why has the Internet operated under net neutrality up to this point in time?
2. Who's in favor of net neutrality? Who's opposed? Why?
3. What would be the impact on individual users, businesses, and government if Internet providers switched to a tiered service model for transmission over land lines as well as wireless?
4. It has been said that net neutrality is the most important issue facing the Internet since the advent of the Internet. Discuss the implications of this statement.
5. Are you in favor of legislation enforcing network neutrality? Why or why not?

The Future Internet: IPv6 and Internet2
The Internet was not originally designed to handle the transmission of massive quantities of data and billions of users. Because of sheer Internet population growth, the world is about to run out of available IP addresses under the old addressing convention. The old addressing system is being replaced by a new version of the IP addressing schema called IPv6 (Internet Protocol version 6), which contains 128-bit addresses (2 to the power of 128, or about 3.4 × 10^38 possible unique addresses). IPv6 is not compatible with the existing Internet addressing system, so the transition to the new standard will take years.

Internet2 is an advanced networking consortium representing over 350 U.S. universities, private businesses, and government agencies working with 66,000 institutions across the United States and international networking partners from more than 100 countries. To connect these communities, Internet2 developed a high-capacity 100 Gbps network that serves as a testbed for leading-edge technologies that may eventually migrate to the public Internet, including telemedicine, distance learning, and other advanced applications not possible with consumer-grade Internet services. The fourth generation of this network is being rolled out to provide 8.8 terabits of capacity.

Internet Services and Communication Tools
The Internet is based on client/server technology. Individuals using the Internet control what they do through client applications on their computers, such as Web browser software. The data, including e-mail messages and Web pages, are stored on servers. A client uses the Internet to request information from a particular Web server on a distant computer, and the server sends the requested information back to the client over the Internet. Chapters 5 and 6 describe how Web servers work with application servers and database servers to access information from an organization's internal information systems applications and their associated databases. Client platforms today include not only PCs and other computers but also smartphones and tablets.

Internet Services
A client computer connecting to the Internet has access to a variety of services. These services include e-mail, chatting and instant messaging, electronic discussion groups, Telnet, File Transfer Protocol (FTP), and the Web. Table 7.3 provides a brief description of these services. Each Internet service is implemented by one or more software programs. All of the services may run on a single server computer, or different services may be allocated to different machines. Figure 7.8 illustrates one way that these services can be arranged in a multitiered client/server architecture.
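The request/response cycle at the heart of this client/server arrangement can be seen in a few lines of code. The sketch below uses only Python's standard library; the local server address, port, and index.html file are illustrative assumptions, not details from the text.

```python
# Minimal sketch of the client/server exchange described above.
# Assumption: a Web server is already running locally, e.g., started with
#   python -m http.server 8000
# in a directory that contains an index.html file.
from urllib.request import urlopen

# The client sends an HTTP request and reads the server's response.
with urlopen("http://localhost:8000/index.html") as response:
    print(response.status)     # e.g., 200 if the page was found
    print(response.read(200))  # the first 200 bytes of the returned page
```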
E-mail enables messages to be exchanged from computer to computer, with capabilities for routing messages to multiple recipients, forwarding messages, and attaching text documents or multimedia files to messages. Most e-mail today is sent through the Internet. The cost of e-mail is far lower than equivalent voice, postal, or overnight delivery costs, and e-mail messages arrive anywhere in the world in a matter of seconds.

Nearly 90 percent of U.S. workplaces have employees communicating interactively using chat or instant messaging tools. Chatting enables two or more people who are simultaneously connected to the Internet to hold live, interactive conversations. Chat systems now support voice and video chat as well as written conversations. Many online retail businesses offer chat services on their Web sites to attract visitors, to encourage repeat purchases, and to improve customer service.

Table 7.3 Major Internet Services

Capability | Functions Supported
E-mail | Person-to-person messaging; document sharing
Chatting and instant messaging | Interactive conversations
Newsgroups | Discussion groups on electronic bulletin boards
Telnet | Logging on to one computer system and doing work on another
File Transfer Protocol (FTP) | Transferring files from computer to computer
World Wide Web | Retrieving, formatting, and displaying information (including text, audio, graphics, and video) using hypertext links

Instant messaging is a type of chat service that enables participants to create their own private chat channels. The instant messaging system alerts the user whenever someone on his or her private list is online so that the user can initiate a chat session with other individuals. Instant messaging systems for consumers include Yahoo! Messenger, Google Talk, AOL Instant Messenger, and Facebook Chat. Companies concerned with security use proprietary communications and messaging systems such as IBM Sametime.

Newsgroups are worldwide discussion groups posted on Internet electronic bulletin boards on which people share information and ideas on a defined topic, such as radiology or rock bands. Anyone can post messages on these bulletin boards for others to read. Many thousands of groups exist that discuss almost all conceivable topics.

Figure 7.8 Client/Server Computing on the Internet
Client computers running Web browsers and other software can access an array of services on servers over the Internet. These services may all run on a single server or on multiple specialized servers.

Employee use of e-mail, instant messaging, and the Internet is supposed to increase worker productivity, but the accompanying Interactive Session on Management shows that this may not always be the case. Many company managers now believe they need to monitor and even regulate their employees' online activity. But is this ethical? Although there are some strong business reasons why companies may need to monitor their employees' e-mail and Web activities, what does this mean for employee privacy?

Voice over IP
The Internet has also become a popular platform for voice transmission and corporate networking. Voice over IP (VoIP) technology delivers voice information in digital form using packet switching, avoiding the tolls charged by local and long-distance telephone networks (see Figure 7.9). Calls that would ordinarily be transmitted over public telephone networks travel over the corporate network based on the Internet Protocol, or the public Internet. Voice calls can be made and received with a computer equipped with a microphone and speakers or with a VoIP-enabled telephone.

Figure 7.9 How Voice over IP Works
A VoIP phone call digitizes and breaks up a voice message into data packets that may travel along different routes before being reassembled at the final destination. A processor nearest the call's destination, called a gateway, arranges the packets in the proper order and directs them to the telephone number of the receiver or the IP address of the receiving computer.
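The packet journey that Figure 7.9 describes can be mimicked with a toy sketch: the digitized voice stream is split into sequence-numbered packets that may arrive out of order, and the receiving side sorts them back into order before playback. This illustrates the packet-switching idea only; real VoIP systems use dedicated protocols such as RTP.

```python
# Toy illustration of VoIP-style packet switching: a message is split into
# sequence-numbered packets, which may arrive out of order and are
# reassembled at the destination. Real VoIP uses protocols such as RTP.
import random

message = b"Hello, this is a digitized voice stream."

# Sender: split the payload into small, numbered packets.
packets = [(seq, message[i:i + 8])
           for seq, i in enumerate(range(0, len(message), 8))]

random.shuffle(packets)  # packets may take different routes and arrive out of order

# Receiver (gateway): sort by sequence number and reassemble.
reassembled = b"".join(chunk for _, chunk in sorted(packets))
assert reassembled == message
print(reassembled.decode())
```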
Cable firms such as Time Warner and Cablevision provide VoIP service bundled with their high-speed Internet and cable offerings. Skype offers free VoIP worldwide using a peer-to-peer network, and Google has its own free VoIP service.

Although an IP phone system requires up-front investments, VoIP can reduce communication and network management costs by 20 to 30 percent. For example, VoIP saves Virgin Entertainment Group $700,000 per year in long-distance bills. In addition to lowering long-distance costs and eliminating monthly fees for private lines, an IP network provides a single voice-data infrastructure for both telecommunications and computing services. Companies no longer have to maintain separate networks or provide support services and personnel for each different type of network.

Unified Communications
In the past, each of the firm's networks for wired and wireless data, voice communications, and videoconferencing operated independently of the others and had to be managed separately by the information systems department. Now, however, firms are able to merge disparate communications modes into a single universally accessible service using unified communications technology. Unified communications integrates disparate channels for voice communications, data communications, instant messaging, e-mail, and electronic conferencing into a single experience in which users can seamlessly switch back and forth between different communication modes. Presence technology shows whether a person is available to receive a call. Companies will need to examine how work flows and business processes will be altered by this technology in order to gauge its value.

Interactive Session: Management
Monitoring Employees on Networks: Unethical or Good Business?

The Internet has become an extremely valuable business tool, but it's also a huge distraction for workers on the job. Employees are wasting valuable company time by surfing inappropriate Web sites (Facebook, shopping, sports, etc.), sending and receiving personal e-mail, talking to friends via online chat, and downloading videos and music. According to IT research firm Gartner Inc., non-work-related Internet surfing results in an estimated 40 percent productivity loss each year for American businesses. A recent Gallup Poll found that the average employee spends over 75 minutes per day using office computers for non-business-related activity. That translates into a loss of $6,250 per employee per year. An average mid-size company of 500 employees could be expected to lose $3.25 million in productivity annually due to Internet misuse.

Many companies have begun monitoring employee use of e-mail and the Internet, sometimes without their knowledge.
Many tools are now available for this purpose, including SONAR, Spector CNE Investigator, iSafe, OsMonitor, IMonitor, Work Examiner, Net Spy, Activity Monitor, Mobistealth, and Spytech. These products enable companies to record online searches, monitor file downloads and uploads, record keystrokes, keep tabs on e-mails, create transcripts of chats, and capture screenshots of what is displayed on computer screens. Monitoring of instant messaging, text messaging, and social media is also increasing.

Although U.S. companies have the legal right to monitor employee Internet and e-mail activity while employees are at work, is such monitoring unethical, or is it simply good business? Managers worry about the loss of time and employee productivity when employees are focusing on personal rather than company business. Too much time on personal business translates into lost revenue. Some employees may even be billing time they spend pursuing personal interests online to clients, thus overcharging them.

If personal traffic on company networks is too high, it can also clog the company's network so that legitimate business work cannot be performed. Procter & Gamble (P&G) found that on an average day, employees were listening to 4,000 hours of music on Pandora and viewing 50,000 five-minute YouTube videos. These activities involved streaming huge quantities of data, which slowed down P&G's Internet connection.

When employees use e-mail or the Web (including social networks) at employer facilities or with employer equipment, anything they do, including anything illegal, carries the company's name. The employer can therefore be traced and held liable. Management in many firms fears that racist, sexually explicit, or other potentially offensive material accessed or traded by their employees could result in adverse publicity and even lawsuits for the firm. An estimated 27 percent of Fortune 500 organizations have had to defend themselves against claims of sexual harassment stemming from inappropriate e-mail. Even if the company is found not liable, responding to lawsuits could run up huge legal bills. Symantec's 2011 Social Media Protection Flash Poll found that the average litigation cost for companies with social media incidents ran over $650,000. Companies also fear leakage of confidential information and trade secrets through e-mail or social networks. Another survey, conducted by the American Management Association and the ePolicy Institute, found that 14 percent of the employees polled admitted they had sent confidential or potentially embarrassing company e-mails to outsiders.

U.S. companies have the legal right to monitor what employees are doing with company equipment during business hours. The question is whether electronic surveillance is an appropriate tool for maintaining an efficient and positive workplace. Some companies try to ban all personal activities on corporate networks: zero tolerance. Others block employee access to specific Web sites or social sites, closely monitor e-mail messages, or limit personal time on the Web. For example, P&G blocks Netflix and has asked employees to limit their use of Pandora. It still allows some YouTube viewing and is not blocking access to social networking sites because staff use them for digital marketing campaigns. Ajax Boiler in Santa Ana, California, uses software from SpectorSoft Corporation that records all the Web sites employees visit, time spent at each site, and all e-mails sent.
Financial services and investment firm Wedbush Securities monitors the daily e-mails, instant messaging, and social networking activity of its 1,000-plus employees. The firm's e-mail monitoring software flags certain types of messages and keywords within messages for further investigation.

A number of firms have fired employees who have stepped out of bounds. A Proofpoint survey found that one in five large U.S. companies had fired an employee for violating e-mail policies in the past year. Among managers who fired employees for Internet misuse, the majority did so because the employees' e-mail contained sensitive, confidential, or embarrassing information.

No solution is problem-free, but many consultants believe companies should write corporate policies on employee e-mail, social media, and Web use. The policies should include explicit ground rules that state, by position or level, under what circumstances employees can use company facilities for e-mail, blogging, or Web surfing. The policies should also inform employees whether these activities are monitored and explain why. IBM now has "social computing guidelines" that cover employee activity on sites such as Facebook and Twitter. The guidelines urge employees not to conceal their identities, to remember that they are personally responsible for what they publish, and to refrain from discussing controversial topics that are not related to their IBM role.

The rules should be tailored to specific business needs and organizational cultures. For example, investment firms will need to allow many of their employees access to other investment sites. A company dependent on widespread information sharing, innovation, and independence could very well find that monitoring creates more problems than it solves.

Sources: "Should Companies Monitor Their Employees' Social Media?" Wall Street Journal, May 11, 2014; Rhodri Marsden, "Workplace Monitoring Mania May Be Risky Business," Brisbane Times, March 30, 2014; Donna Iadipaolo, "Invading Your Privacy Is Now the Norm in the Workplace," Philly.com, April 28, 2014; "Office Slacker Stats," www.staffmonitoring.com, accessed May 1, 2014; "Office Productivity Loss," www.staffmonitoring.com, accessed May 1, 2014; "Workplace Privacy and Employee Monitoring," Privacy Rights Clearinghouse, June 2013; Samuel Greengard, "How Smartphone Addiction Hurts Productivity," CIO Insight, March 11, 2013; Emily Glazer, "P&G Curbs Employees' Internet Use," Wall Street Journal, April 4, 2012; and David L. Barron, "Social Media: Frontier for Employee Disputes," Baseline, January 19, 2012.

Case Study Questions
1. Should managers monitor employee e-mail and Internet usage? Why or why not?
2. Describe an effective e-mail and Web use policy for a company.
3. Should managers inform employees that their Web behavior is being monitored? Or should managers monitor secretly? Why or why not?

CenterPoint Properties, a major Chicago area industrial real estate company, used unified communications technology to create collaborative Web sites for each of its real estate deals. Each Web site provides a single point for accessing structured and unstructured data. Integrated presence technology lets team members e-mail, instant message, call, or videoconference with one click.

Virtual Private Networks
What if you had a marketing group charged with developing new products and services for your firm, with members spread across the United States?
You would want them to be able to e-mail each other and communicate with the home office without any chance that outsiders could intercept the communications. In the past, one answer to this problem was to work with large private networking firms that offered secure, private, dedicated networks to customers. But this was an expensive solution. A much less expensive solution is to create a virtual private network within the public Internet.

A virtual private network (VPN) is a secure, encrypted, private network that has been configured within a public network to take advantage of the economies of scale and management facilities of large networks, such as the Internet (see Figure 7.10). A VPN provides your firm with secure, encrypted communications at a much lower cost than the same capabilities offered by traditional non-Internet providers that use their private networks to secure communications. VPNs also provide a network infrastructure for combining voice and data networks.

Figure 7.10 A Virtual Private Network Using the Internet
This VPN is a private network of computers linked using a secure "tunnel" connection over the Internet. It protects data transmitted over the public Internet by encoding the data and "wrapping" them within the Internet Protocol (IP).

Several competing protocols are used to protect data transmitted over the public Internet, including Point-to-Point Tunneling Protocol (PPTP). In a process called tunneling, packets of data are encrypted and wrapped inside IP packets. By adding this wrapper around a network message to hide its content, business firms create a private connection that travels through the public Internet.

The Web
The Web is the most popular Internet service. It's a system with universally accepted standards for storing, retrieving, formatting, and displaying information using a client/server architecture. Web pages are formatted using hypertext with embedded links that connect documents to one another and that also link pages to other objects, such as sound, video, or animation files. When you click a graphic and a video clip plays, you have clicked a hyperlink. A typical Web site is a collection of Web pages linked to a home page.

Hypertext
Web pages are based on a standard Hypertext Markup Language (HTML), which formats documents and incorporates dynamic links to other documents and pictures stored in the same or remote computers (see Chapter 5). Web pages are accessible through the Internet because Web browser software operating on your computer can request Web pages stored on an Internet host server using the Hypertext Transfer Protocol (HTTP). HTTP is the communications standard used to transfer pages on the Web. For example, when you type a Web address in your browser, such as http://www.sec.gov, your browser sends an HTTP request to the sec.gov server requesting the home page of sec.gov.

HTTP is the first set of letters at the start of every Web address, followed by the domain name, which specifies the organization's server computer that is storing the document. Most companies have a domain name that is the same as or closely related to their official corporate name. The directory path and document name are two more pieces of information within the Web address that help the browser track down the requested page. Together, the address is called a uniform resource locator (URL). When typed into a browser, a URL tells the browser software exactly where to look for the information. For example, in the URL http://www.megacorp.com/content/features/082610.html, http names the protocol used to display Web pages, www.megacorp.com is the domain name, content/features is the directory path that identifies where on the domain Web server the page is stored, and 082610.html is the document name (its .html extension identifies it as an HTML page).
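Those components can also be pulled apart programmatically. Here is a quick sketch using Python's standard urllib.parse module on the hypothetical megacorp.com address from the text:

```python
# Splitting a URL into the components described in the text:
# protocol, domain name, directory path, and document name.
from urllib.parse import urlparse
from posixpath import split as path_split

url = "http://www.megacorp.com/content/features/082610.html"  # hypothetical URL from the text
parts = urlparse(url)

print(parts.scheme)   # 'http'             -> the protocol
print(parts.netloc)   # 'www.megacorp.com' -> the domain name
directory, document = path_split(parts.path)
print(directory)      # '/content/features' -> the directory path
print(document)       # '082610.html'       -> the document name
```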
Web Servers
A Web server is software for locating and managing stored Web pages. It locates the Web pages requested by a user on the computer where they are stored and delivers the Web pages to the user's computer. Server applications usually run on dedicated computers, although in small organizations they can all reside on a single computer. The most common Web server in use today is Apache HTTP Server, followed by Microsoft Internet Information Services (IIS). Apache is an open source product that is free of charge and can be downloaded from the Web.

Searching for Information on the Web
No one knows for sure how many Web pages there really are. The surface Web is the part of the Web that search engines visit and about which information is recorded. For instance, Google visited an estimated 600 billion pages in 2013, and this reflects a large portion of the publicly accessible Web page population. But there is a "deep Web" that contains an estimated 1 trillion additional pages, many of them proprietary (such as the pages of the Wall Street Journal Online, which cannot be visited without a subscription or access code) or stored in protected corporate databases.

Searching for information on Facebook is another matter. With an estimated 1.3 billion members, each with pages of text, photos, and media, the population of Web pages is larger than many estimates acknowledge. But Facebook is a "closed" Web, and its pages are not searchable by Google or other search engines.

Search Engines
Obviously, with so many Web pages, finding specific Web pages that can help you or your business, nearly instantly, is an important problem. The question is, how can you find the one or two pages you really want and need out of billions of indexed Web pages? Search engines attempt to solve the problem of finding useful information on the Web nearly instantly and, arguably, they are the "killer app" of the Internet era. Today's search engines can sift through HTML files, files of Microsoft Office applications, and PDF files, as well as audio, video, and image files. There are hundreds of different search engines in the world, but the vast majority of search results are supplied by Google, Yahoo!, and Microsoft's Bing (see Figure 7.11).

Figure 7.11 Top Web Search Engines in the United States
Google is the most popular search engine, handling nearly 70 percent of Web searches in the United States and around 90 percent in Europe.
Sources: Based on data from comScore Inc., February 2014.

Web search engines started out in the early 1990s as relatively simple software programs that roamed the nascent Web, visiting pages and gathering information about the content of each page. The first search engines were simple keyword indexes of all the pages they visited, leaving the user with lists of pages that may not have been truly relevant to their search. In 1994, Stanford University computer science students David Filo and Jerry Yang created a hand-selected list of their favorite Web pages and called it "Yet Another Hierarchical Officious Oracle," or Yahoo.
Yahoo was not initially a search engine but rather an edited selection of Web sites organized by categories the editors found useful. Currently Yahoo relies on Microsoft's Bing for search results.

In 1998, Larry Page and Sergey Brin, two other Stanford computer science students, released their first version of Google. This search engine was different: not only did it index each Web page's words, but it also ranked search results based on the relevance of each page. Page patented the idea of a page ranking system (the PageRank System), which essentially measures the popularity of a Web page by calculating the number of sites that link to that page as well as the number of pages to which it links. The premise is that really popular Web pages are more "relevant" to users. Brin contributed a unique Web crawler program that indexed not only keywords on a page but also combinations of words (such as authors and the titles of their articles). These two ideas became the foundation for the Google search engine. Figure 7.12 illustrates how Google works.

Mobile Search
With the growth of mobile smartphones and tablet computers, and with about 167 million Americans accessing the Internet via mobile devices, the nature of e-commerce and search is changing. Mobile search from smartphones and tablets made up about 50 percent of all searches in 2014 and, according to Google, will expand rapidly in the next few years. Both Google and Yahoo have developed new search interfaces to make searching and shopping from smartphones more convenient. Amazon, for instance, sold over $1 billion in goods in 2013 through mobile searches of its store (Search Agency, 2013). While smartphones are widely used to shop, actual purchases typically take place on laptops or desktops, followed by tablets.

Figure 7.12 How Google Works
The Google search engine is continuously crawling the Web, indexing the content of each page, calculating its popularity, and storing the pages so that it can respond quickly to user requests to see a page. The entire process takes about one-half second.

Search Engine Marketing
Search engines have become major advertising platforms and shopping tools by offering what is now called search engine marketing. Searching for information is one of the Web's most popular activities: 60 percent of American adult Internet users use a search engine at least once a day, generating about 90 billion queries a month. With this huge audience, search engines are the foundation for the most lucrative form of online marketing and advertising, search engine marketing. When users enter a search term at Google, Bing, Yahoo, or any of the other sites serviced by these search engines, they receive two types of listings: sponsored links, for which advertisers have paid to be listed (usually at the top of the search results page), and unsponsored "organic" search results. In addition, advertisers can purchase small text boxes on the side of search results pages. The paid, sponsored advertisements are the fastest growing form of Internet advertising and are powerful new marketing tools that precisely match consumer interests with advertising messages at the right moment. Search engine marketing monetizes the value of the search process. In 2014, search engine marketing is expected to generate $22.8 billion in revenue, nearly half of all online advertising ($51 billion). Google alone will account for over 38 percent of all online advertising in 2014.
About 97 percent of Google's $60 billion in 2013 revenue came from online advertising, and 95 percent of the ad revenue came from search engine marketing (Google, 2014). Because search engine marketing is so effective (it has the highest click-through rate and the highest return on ad investment), companies seek to optimize their Web sites for search engine recognition. The better optimized the page is, the higher a ranking it will achieve in search engine result listings.

Search engine optimization (SEO) is the process of improving the quality and volume of Web traffic to a Web site by employing a series of techniques that help a Web site achieve a higher ranking with the major search engines when certain keywords and phrases are put into the search field. One technique is to make sure that the keywords used in the Web site description match the keywords likely to be used as search terms by prospective customers. For example, your Web site is more likely to be among the first ranked by search engines if it uses the keyword "lighting" rather than "lamps" when most prospective customers are searching for "lighting." It is also advantageous to link your Web site to as many other Web sites as possible, because search engines evaluate such links to determine the popularity of a Web page and how it is linked to other content on the Web.

Search engines can be gamed by scammers who create thousands of phony Web site pages and link them all together, or link them to a single retailer's site, in an attempt to fool Google's search engine. Firms can also pay so-called "link farms" to link to their site. Google changed its search algorithm in 2012. Code-named "Penguin," the new algorithm examines the quality of links more carefully, with the intent of down-ranking sites that have a suspicious pattern of sites linking to them. Penguin is updated and published annually.

Google and other search engine firms are attempting to refine search engine algorithms to capture more of what the user intended, and more of the "meaning" of a search. Google introduced Hummingbird, its new search algorithm, in September 2013. Rather than evaluate each word separately in a search, Google's semantically informed Hummingbird tries to evaluate an entire sentence. So, if your search is a long sentence like "Google annual report selected financial data 2013," Hummingbird should be able to figure out that you really want the SEC Form 10-K report filed with the Securities and Exchange Commission on March 31, 2014. How about "Italian restaurant Brooklyn Bridge"? This will return the name and location of a number of Italian restaurants in the vicinity of the Brooklyn Bridge. Semantic search more closely follows conversational search, or search as you would ordinarily speak it to another human being.

Google's predictive search is now a part of most search results: this part of the search algorithm guesses what you are looking for and suggests search terms as you enter your search. Google searches also take advantage of Knowledge Graph, an effort by the search algorithm to anticipate what you might want to know more about as you search on a topic. Knowledge Graph results appear on the right of the screen and contain more information about the topic or person you are searching on.
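The link-analysis idea that runs through this discussion, from PageRank to SEO's link counting, can be made concrete with a toy computation. The sketch below uses an invented four-page link graph and the commonly cited 0.85 damping factor; Google's production algorithm is vastly more elaborate.

```python
# Toy PageRank: a page's score depends on the scores of the pages linking
# to it, with each linking page's score divided among its outbound links.
links = {                      # hypothetical four-page Web
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}
damping = 0.85                 # commonly cited damping factor

for _ in range(50):            # iterate until the scores settle
    new_rank = {}
    for p in pages:
        incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new_rank[p] = (1 - damping) / len(pages) + damping * incoming
    rank = new_rank

for p, score in sorted(rank.items(), key=lambda item: -item[1]):
    print(p, round(score, 3))
```

Page C ends up ranked highest because three of the four pages link to it, which is exactly the popularity intuition described earlier.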
In general, search engines have been very helpful to small businesses that cannot afford large marketing campaigns. Because shoppers are looking for a specific product or service when they use search engines, they are what marketers call "hot prospects": people who are looking for information and often intending to buy. Moreover, search engines charge only for click-throughs to a site. Merchants do not have to pay for ads that don't work, only for ads that receive a click. Consumers benefit from search engine marketing because ads for merchants appear only when consumers are looking for a specific product. There are no pop-ups, Flash animations, videos, interstitials, e-mails, or other irrelevant communications to deal with. Thus, search engine marketing saves consumers cognitive energy and reduces search costs (including the cost of transportation needed to physically search for products). One study estimated the global value of search to both merchants and consumers to be more than $800 billion, with about 65 percent of the benefit going to consumers in the form of lower search costs and lower prices (McKinsey, 2011).

Google and Microsoft face challenges ahead as desktop PC search growth slows and revenues decline because the price of search engine ads is declining slightly. The growth in mobile search does not make up for the loss of desktop search revenue because mobile ads generally sell for half as much as desktop search ads.

Social Search
One problem with Google and other mechanical search engines is that they are so thorough: enter a search for "ultra computers" and in 0.2 seconds you will receive over 300 million responses! Search engines are not very discriminating. Social search is an effort to provide fewer, more relevant, and more trustworthy search results based on a person's network of social contacts. In contrast to the top search engines, which use a mathematical algorithm to find pages that satisfy your query, a social search Web site reviews your friends' recommendations (and their friends'), their past Web visits, and their use of "Like" buttons.

In January 2013 Facebook launched Graph Search, a social network search engine that responds to user search queries with information from the user's social network of friends and connections. Graph Search relies on the huge amount of data on Facebook that is, or can be, linked to individuals and organizations. You might use Graph Search to search for Boston restaurants that your friends like, alumni from the University of South Carolina who like Lady Gaga, or pictures of your friends taken before 2010. Google has developed Google +1 as a social layer on top of its existing search engine. Users can place a +1 next to Web sites they found helpful, and their friends are notified automatically. Subsequent searches by their friends would list the +1 sites recommended by friends higher up on the page.

One problem with social search is that your close friends may not have intimate knowledge of the topics you are exploring, or they may have tastes you don't appreciate. It's also possible your close friends don't have any knowledge about what you are searching for.

Semantic Search
Another way for search engines to become more discriminating and helpful is to understand what it is we are really looking for. Called "semantic search," the goal is to build a search engine that can really understand human language and behavior. For instance, in 2012 Google's search engine began delivering more than just millions of links.
It started to give users more facts and direct answers and to provide more relevant links to sites based on the search engine's estimation of what the user intended, and even on the user's past search behavior. Google's search engine is trying to understand what people are most likely thinking about when they search for something. Google hopes to use its massive database of objects (people, places, things) and smart software to give users better results than just millions of hits. For instance, do a search on "Lake Tahoe" and the search engine will return basic facts about Tahoe (altitude, average temperature, and local fish), a map, and hotel accommodations (Efrati, 2012).

Although search engines were originally designed to search text documents, the explosion of photos and videos on the Internet created a demand for searching and classifying these visual objects. Facial recognition software can create a digital version of a human face. In 2012 Facebook introduced its facial recognition software and combined it with tagging to create a new feature called Tag Suggest. The software creates a digital facial print, similar to a fingerprint. Users can put their own tagged photo on their timeline and on their friends' timelines. Once a person's photo is tagged, Facebook can pick that person out of a group photo and identify for others who is in the photo. You can also search for people on Facebook using their digital image to find and identify them.

Intelligent Agent Shopping Bots
Chapter 11 describes the capabilities of software agents with built-in intelligence that can gather or filter information and perform other tasks to assist users. Shopping bots use intelligent agent software to search the Internet for shopping information. Shopping bots such as MySimon or PriceGrabber can help people interested in making a purchase filter and retrieve information about products of interest, evaluate competing products according to criteria the users have established, and negotiate with vendors for price and delivery terms. Many of these shopping agents search the Web for the pricing and availability of products specified by the user and return a list of sites that sell the item, along with pricing information and a purchase link.
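The core filter-and-rank behavior of such an agent is easy to sketch. The products, prices, and selection criteria below are invented for illustration; a real shopping bot would query or crawl live retail sites.

```python
# Toy "shopping bot": filter products by user-established criteria, then
# rank by price, echoing how comparison agents present results.
# All product data here is invented for illustration.
products = [
    {"name": "Laptop A", "price": 899, "ram_gb": 16, "site": "shop1.example.com"},
    {"name": "Laptop B", "price": 649, "ram_gb": 8,  "site": "shop2.example.com"},
    {"name": "Laptop C", "price": 749, "ram_gb": 16, "site": "shop3.example.com"},
]

# Criteria established by the user: at least 16 GB of RAM, under $900.
matches = [p for p in products if p["ram_gb"] >= 16 and p["price"] < 900]

for p in sorted(matches, key=lambda p: p["price"]):   # cheapest first
    print(f'{p["name"]}: ${p["price"]} at {p["site"]}')
```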
Web 2.0
Today's Web sites don't just contain static content; they enable people to collaborate, share information, and create new services and content online. These second-generation interactive Internet-based services are referred to as Web 2.0. If you have shared photos over the Internet at Flickr or another photo site, pinned a photo on Pinterest, posted a video to YouTube, created a blog, or added an app to your Facebook page, you've used some of these Web 2.0 services.

Web 2.0 has four defining features: interactivity, real-time user control, social participation (sharing), and user-generated content. The technologies and services behind these features include cloud computing, software mashups and apps, blogs, RSS, wikis, and social networks. Mashups, which we introduced in Chapter 5, are software services that enable users and system developers to mix and match content or software components to create something entirely new. For example, Yahoo's photo storage and sharing site Flickr combines photos with other information about the images provided by users and tools to make it usable within other programming environments.

Web 2.0 tools and services have fueled the creation of social networks and other online communities where people can interact with one another in the manner of their choosing.

A blog, the popular term for a Weblog, is a personal Web site that typically contains a series of chronological entries (newest to oldest) by its author and links to related Web pages. Blogging is a major activity for U.S. Internet users: 74 million read blogs, and 22 million write blogs or post to blogs. A blog may include a blogroll (a collection of links to other blogs) and trackbacks (a list of entries in other blogs that refer to a post on the first blog). Most blogs also allow readers to post comments on the blog entries. The act of creating a blog is often referred to as "blogging." Blogs can be hosted by a third-party service such as Blogger.com, TypePad.com, and Xanga.com, and blogging features have been incorporated into social networks such as Facebook and collaboration platforms such as Lotus Notes. WordPress is a leading open source blogging tool and content management system. Microblogging, used in Twitter, is a type of blogging that features short posts of 140 characters or fewer.

Blog pages are usually variations on templates provided by the blogging service or software. Therefore, millions of people without HTML skills of any kind can post their own Web pages and share content with others. The totality of blog-related Web sites is often referred to as the blogosphere. Although blogs have become popular personal publishing tools, they also have business uses (see Chapters 2 and 10).

If you're an avid blog reader, you might use RSS to keep up with your favorite blogs without constantly checking them for updates. RSS, which stands for Really Simple Syndication or Rich Site Summary, pulls specified content from Web sites and feeds it automatically to users' computers. RSS reader software gathers material from the Web sites or blogs that you tell it to scan and brings new information from those sites to you. RSS readers are available through Web sites such as Google and Yahoo, and they have been incorporated into the major Web browsers and e-mail programs.

Blogs allow visitors to add comments to the original content, but they do not allow visitors to change the original posted material. Wikis, in contrast, are collaborative Web sites where visitors can add, delete, or modify content on the site, including the work of previous authors. Wiki comes from the Hawaiian word for "quick." Wiki software typically provides a template that defines layout and elements common to all pages, displays user-editable software program code, and then renders the content into an HTML-based page for display in a Web browser. Some wiki software allows only basic text formatting, whereas other tools allow the use of tables, images, or even interactive elements, such as polls or games. Most wikis provide capabilities for monitoring the work of other users and correcting mistakes.

Because wikis make information sharing so easy, they have many business uses. The U.S. Department of Homeland Security's National Cyber Security Center (NCSC) deployed a wiki to facilitate collaboration among federal agencies on cybersecurity. NCSC and other agencies use the wiki for real-time information sharing on threats, attacks, and responses and as a repository for technical and standards information. Pixar Wiki is a collaborative community wiki for publicizing the work of Pixar Animation Studios.
The wiki format allows anyone to create or edit an article about a Pixar film.

Social networking sites enable users to build communities of friends and professional colleagues. Members typically create a "profile," a Web page for posting photos, videos, MP3 files, and text, and then share these profiles with others on the service identified as their "friends" or contacts. Social networking sites are highly interactive, offer real-time user control, rely on user-generated content, and are broadly based on social participation and sharing of content and opinions. Leading social networking sites include Facebook and Twitter (with 1.3 billion and 270 million monthly active users, respectively, in 2014) and LinkedIn (for professional contacts). For many, social networking sites are the defining Web 2.0 application, and one that has radically changed how people spend their time online; how people communicate and with whom; how business people stay in touch with customers, suppliers, and employees; how providers of goods and services learn about their customers; and how advertisers reach potential customers. The large social networking sites are also morphing into application development platforms where members can create and sell software applications to other members of the community. Facebook alone has over 1 million developers, who have created over 550,000 applications for gaming, video sharing, and communicating with friends and family. We talk more about business applications of social networking in Chapters 2 and 10, and you can find social networking discussions in many other chapters of this book. You can also find a more detailed discussion of Web 2.0 in our Learning Tracks.

Web 3.0 and the Future Web
Americans conducted about 19 billion searches in January 2014 (comScore, 2014). How many of these produced a meaningful result (a useful answer in the first three listings)? Arguably, fewer than half. Google, Yahoo, Microsoft, and Amazon are all trying to increase the odds of people finding meaningful answers to search engine queries. But with over 500 billion Web pages indexed, the means available for finding the information you really want are quite primitive, based on the words used on the pages and the relative popularity of the page among people who use those same search terms. In other words, it's hit or miss.

To a large extent, the future of the Web involves developing techniques to make searching the 500 billion public Web pages more productive and meaningful for ordinary people. Web 1.0 solved the problem of obtaining access to information. Web 2.0 solved the problem of sharing that information with others and building new Web experiences. Web 3.0 is the promise of a future Web where all this digital information, all these contacts, can be woven together into a single meaningful experience. Sometimes this is referred to as the Semantic Web. "Semantic" refers to meaning. Most of the Web's content today is designed for humans to read and for computers to display, not for computer programs to analyze and manipulate. Semantic search, described above, is a subset of a larger effort to make the Web more intelligent, more humanlike (W3C, 2012). Search engines can discover when a particular term or keyword appears in a Web document, but they do not really understand its meaning or how it relates to other information on the Web. You can check this out on Google by entering two searches. First, enter "Paris Hilton". Next, enter "Hilton in Paris".
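A toy bag-of-words comparison shows why a purely keyword-based engine cannot tell these two queries apart (a deliberate simplification; real engines use far richer signals):

```python
# A naive keyword engine reduces a query to an unordered set of words,
# so "Paris Hilton" and "Hilton in Paris" look identical to it.
STOPWORDS = {"in", "the", "a", "of"}

def bag_of_words(query):
    return {w for w in query.lower().split() if w not in STOPWORDS}

q1 = bag_of_words("Paris Hilton")
q2 = bag_of_words("Hilton in Paris")
print(q1)         # {'paris', 'hilton'} (set order may vary)
print(q1 == q2)   # True: word order, and with it intent, is lost
```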
Because Google does not understand ordinary English, it has no idea that in the second search you are interested in the Hilton Hotel in Paris. Because it cannot understand the meaning of the pages it has indexed, Google's search engine simply returns the most popular pages for queries where "Hilton" and "Paris" appear on the pages.

First described in a 2001 Scientific American article, the Semantic Web is a collaborative effort led by the World Wide Web Consortium to add a layer of meaning atop the existing Web to reduce the amount of human involvement in searching for and processing Web information. For instance, the New York Times launched a semantic application called Longitude, which provides a graphical interface to the Times content. You can ask for stories about Germany in the last 24 hours, or about a city in the United States, and retrieve all recent stories in the Times.

Views on the future of the Web vary, but they generally focus on ways to make the Web more "intelligent," with machine-facilitated understanding of information promoting a more intuitive and effective user experience. For instance, let's say you want to set up a party with your tennis buddies at a local restaurant Friday night after work. One problem is that you are already scheduled to go to a movie with another friend. In a Semantic Web 3.0 environment, you would be able to coordinate this change in plans with the schedules of your tennis buddies and the schedule of your movie friend, and make a reservation at the restaurant, all with a single set of commands issued as text or voice to your handheld smartphone. Right now, this capability is beyond our grasp. Work proceeds slowly on making the Web a more intelligent experience, in large part because it is difficult to make machines, including software programs, that are truly intelligent like humans.

But there are other views of the future Web. Some see a 3-D Web where you can walk through pages in a 3-D environment. Others point to the idea of a pervasive Web that controls everything from a city's traffic lights and water usage, to the lights in your living room, to your car's rearview mirror, not to mention managing your calendar and appointments. This is referred to as the "Internet of Things," and it depends on the widespread use and distribution of sensors. Firms like IBM, HP, and Oracle are exploring how to build smart machines, factories, and cities through extensive use of remote sensors and fast cloud computing. We provide more detail on this topic in the following section.

The "App Internet" is another element in the future Web. The growth of apps within the mobile platform is astounding: over 80 percent of mobile minutes in the United States are generated through apps, and only 20 percent through browsers. Apps give users direct access to content and are much faster than loading a browser and searching for content.

The visual Web is another part of the future Web. The "visual Web" refers to Web sites like Pinterest, where pictures replace text documents, where users search on pictures, and where pictures of products replace display ads for products. Pinterest is a social networking site that provides users (as well as brands) with an online board to which they can "pin" interesting pictures. Looking for a blue dress or a black dress shirt? Google will deliver thousands of links to sites that sell these items. Pinterest will deliver a much smaller collection of magazine-quality photos linked subtly to vendor Web sites.
Considered the fastest growing Web site in history, Pinterest has 70 million monthly users and was the 35th most popular Web destination in 2014. The Instagram app is another example of the visual Web. Instagram is a photo- and video-sharing site that allows users to take pictures, enhance them, and share them with friends on other social sites like Facebook, Twitter, Tumblr, and Google+. In 2014 Instagram had 220 million monthly active users.

Other complementary trends leading toward a future Web 3.0 include more widespread use of cloud computing and software as a service (SaaS) business models, ubiquitous connectivity among mobile platforms and Internet access devices, and the transformation of the Web from a network of separate, siloed applications and content into a more seamless and interoperable whole. These more modest visions of the future Web 3.0 are more likely to be realized in the near term.


Solution Preview

Internet in the Workplace

Question One

The increasing dependence on the Internet and technology in the workplace has opened up new communication challenges for employees within organizational settings. Management teams must now be wary not only of the potential loss of information to workplace disasters but also of unauthorized access to company databases through the Internet (Jamaluddin, Ahmad, Alias, & Simun, 2015). Given the need to secure some of the information stored in the organization's database, some employers go to the extent of monitoring employees' Internet activities as a means of protecting company data from potentially dangerous sites.

(615 words)
