television serves couch potatoes

Most television watching is best modeled as a two-stage decision process. First, a person decides to watch television. That means the person sits on a couch and stares vacantly at a large screen a few yards away. Then the person decides what to watch. That means choosing among current, salient video programming offerings. These two decisions are very loosely connected.

The behavior of persons who own a digital video recorder (DVR) is consistent with this decision model. In the U.S., households with a DVR use it for at most 25% of their television viewing time.[1] In the UK, households with a DVR use it even less — about 14% of television viewing time.[2] Most of the time, persons can’t be bothered to record and watch programs pre-selected from the huge universe of programs available to be recorded.

Persons don’t even bother to record programs so that they can skip advertising. When UK DVR owners were asked how they use their DVR, 40% reported regularly fast-forwarding through adverts, while 42% reported never fast-forwarding through adverts. Yet when asked specifically, 78% claimed to always or almost always fast-forward through adverts when using the DVR.[3] Evidently persons don’t remember their own viewing behavior with respect to adverts very well. More significantly, persons who aren’t using their DVR surely aren’t fast-forwarding through adverts.

Average time spent watching television is unlikely to change quickly or by a large amount in response to changes in the relative value of media use opportunities. Differences in video programming have little effect on aggregate television viewing time. New services offered on computer screens and mobile screens — video sharing, social networking, community news and information, in-depth learning opportunities — are similarly likely to have little effect on aggregate television viewing time. The amount of leisure time available (total working hours, weekday versus weekend) and socio-economic characteristics affecting broad patterns of life — educational attainment, employment status, presence of children at home — largely determine television viewing time.

A recent IBM-sponsored survey has media pundits discussing the decline or explosion of television, but the survey actually provides rather weak evidence. The survey was Internet-based, not a random sample of any relevant population. Persons who respond to an Internet-based survey are likely to use the Internet more than average adults do. U.S. respondents to the survey were 71% women and 27% persons ages 18-24, while U.S. adults (persons 18 and over) are 51% women and 13% ages 18-24.[4] Thus the survey demographics greatly over-represent women and young adults.

Most significantly, persons who have commented on the results of the survey generally don’t seem to understand what was reported. The press release for the survey reported that “personal Internet time rivals TV time.” In the survey, “personal Internet time” meant Internet use at home and on “personal time at work.”[5] A survey in 2002 of a representative sample of U.S. adults found that employees with web access spent 3.7 hours per week in personal use of the Internet at work, and 5.9 hours per week using the Internet for work-related purposes at home. Both these time uses apparently count as “personal Internet use” in the IBM survey. Television isn’t a feasible alternative for either of those time uses. Most workers in the cushy private sector don’t have televisions in their offices, and watching television is almost never a work-related activity at home.

The challenge for traditional television isn’t that television viewing time will decline rapidly. The challenge is that traditional television advertising, compared to personalized, action-oriented, performance-measurable advertising, will decline rapidly in market value.

Notes:

[1] Reporting on a telephone survey, conducted June-July 2007, of a random sample of 1,800 adults in households with a TV (and a telephone), Leichtman Research Group stated that “over one in every five households” had a DVR and estimated that “95% of all TV viewing in the U.S. is still of live TV.” Assuming DVR households watch about as much television as other households, these data imply that no less than 75% of DVR owners’ TV viewing time is live viewing, i.e. distributor-scheduled programming. To the extent that persons record and watch television programs on analog videocassette recorders, the estimated share of DVR owners’ viewing time that is live rises. It also rises to the extent that DVR ownership exceeds 20%. An IBM-sponsored Internet survey found that 24% of persons in the U.S. owned a DVR in April 2007. See U.S. findings, p. 9. As discussed above, that sample isn’t representative of the U.S. adult population.
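
A minimal back-of-envelope sketch of that bound, under the assumptions stated above: DVR households account for roughly 20% of all TV viewing time, and essentially all non-live viewing occurs in those households (assumptions, not reported figures):

    # Back-of-envelope check of the "no less than 75% live" bound in note [1].
    # Assumed (not reported): DVR households account for ~20% of all TV viewing
    # time, matching their household share, and all non-live viewing happens there.
    dvr_viewing_share = 0.20      # "over one in every five households"
    live_share_overall = 0.95     # Leichtman estimate: 95% of all viewing is live

    non_live_overall = 1 - live_share_overall                    # ~5% of all viewing time
    non_live_within_dvr = non_live_overall / dvr_viewing_share   # ~25% of DVR owners' viewing
    live_within_dvr = 1 - non_live_within_dvr                    # ~75% of DVR owners' viewing

    print(f"non-live share of DVR owners' viewing: {non_live_within_dvr:.0%}")  # 25%
    print(f"live share of DVR owners' viewing: {live_within_dvr:.0%}")          # 75%

If DVR ownership (and hence DVR households' share of viewing time) is higher than 20%, or if some non-live viewing is VCR playback in non-DVR households, the computed live share for DVR owners only rises.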

[2] Spring, 2006 BARB measurements in households with Sky+ DVR.

[3] Ofcom, The Communications Market 2007, Section 1 Converging communications markets, p. 85. In Q1 2007, 15% of UK homes had DVRs, almost double the 2006 figure. See id. p. 69.

[4] See U.S. study findings, p. 4, compared to U.S. census data.

[5] U.S. study findings, p. 7, comparing “Daily Personal Internet Usage; Home and Personal Time at Work” to “Daily Television Viewing.”
