13 Compelling Facts About Work In The 21st Century

We’re two decades into the 21st century, and the way we do work has changed substantially from previous centuries. In the past, limited technology meant that what we did was more directly tied into a wage-per-item concept of work. What you made or did determined your wage in a direct way. Manufacturing, agriculture, and the office commute were the mainstays.

But no more.

Changes in technology, a shift in the culture, and financial challenges have shifted both the kind of work we do and the way we do it. What does work in the 21st century look like, for the worker, for the bank account, and for society?

Workers are working longer than before.

Americans are working longer than ever, with more of their week claimed by work and less by free time.

If you thought technology would make life easier and allow us to spend less time working, it looks like you were wrong. Since around the turn of the 21st century, Americans have been working, on average, about 47 hours a week. That figure has held steady for the past 14 years, and it’s almost a full workday more than a typical 40-hour work week would entail.

But this isn’t all bad news. How we work now has changed (telecommuting, mobile devices, etc.), and that has made that extra time each week seem like much less.

Workers are retiring at a later age.

Not only are Americans working longer each week, but they are also retiring a bit later.

Gallup has been polling workers on this retirement question since 1991, and they’ve learned a few interesting contrasts about retirement expectations and retirement realities.

Most Americans expect to retire at age 66 but actually retire earlier, at age 62 on average. In the early 1990s, however, people were retiring at around age 57 despite similar expectations of a later retirement. So while our expectations haven’t changed much, the actual retirement age has increased as we’ve moved into the 21st century.

Expectations vary based on whether you’re a young worker or an older worker.

Younger Americans expect to retire around age 55, but as they get older (and perhaps come to understand the financial realities of retirement), their expectations shift closer to reality, with retirement expected around age 65. These retirement expectations have remained fairly steady into the 21st century.

What’s behind this slow but steady rise in the retirement age? Part of it is a troubled economy that forces workers to stay employed longer out of financial need, but part of it is that some baby boomers just don’t want to retire yet, choosing instead to work longer.

We aren’t doing a great job preparing for retirement.

Whether workers think they’ll retire at 55 or 65, they tend to view retirement favorably. Most imagine it as a time of leisure and travel. The problem is the disconnect between what they think they’ll have and what they are actually doing to prepare for it.

A third of Americans aren’t doing much to prepare for the retirement they imagine, and 60 percent of workers have less than $25,000 tucked away. These are fairly dangerous numbers, considering that responsibility for retirement falls more and more on the individual every day. Companies offering pensions and the retirement plans of the 20th century are becoming few and far between. It is falling to the worker to prepare for his or her own retirement in this new age.

There’s a new job: the data scientist.

The end of the 20th century marked the start of the information age. As the 21st century picked up steam, all of that information (some might say information glut) meant the rise of big data on everything from customer behavior on social sites to geo-location habits. With that data came a rapidly growing need for data scientists.

This is such new territory that the term “data scientist” itself is relatively new, coined by leading social media team members in 2008. You’ll find data scientists in startups and established businesses alike, and demand for them is so high that there is a shortage. Why?

Businesses are drowning in massive amounts of data and they need people who know how to interpret it correctly.

Jobs have left the farm.

Jobs are changing from what they used to be. For starters, the workforce has become increasingly urban, with agricultural jobs declining steadily into the 21st century. Technology has brought about the slow end of the small family farm, making it possible for farms to grow large while requiring fewer workers.

Urban centers, on the other hand, have grown dramatically. That’s where rural workers are heading.

Along with this shift, manufacturing jobs have declined while the service industry (particularly healthcare, which we’ll cover in a minute) has picked up the slack. Although the total number of people working in manufacturing has fallen, manufacturing itself still accounts for nearly 30 percent of the United States’ gross domestic product, the same as in the past thirty years.

Clearly, technology has reduced the workers needed for manufacturing in the 21st century, but not harmed the output.

The top industries have changed.

In the U.S., the top industries have changed from the end of the 20th century to now. According to the Bureau of Labor Statistics, the top industries were once manufacturing and retail. By 2013, healthcare (which had barely registered in the 1990s) was by far the top industry. As our population ages, the healthcare industry will continue to grow.

What industries will face significant decline?

According to the Bureau of Labor Statistics, manufacturing, postal work, and the publishing of newspapers, books, and periodicals are going to dwindle. That’s quite a flip from one century to the next.

Workers aren’t in it for the long haul.

Despite the shift in the kinds of jobs that are now available, not everyone is ready for this 21st century workforce. There is a disconnect between the jobs that are in demand and the training that is available.

Gone are the days of the worker who dedicates 40 years to one company and then retires. The 21st century is the era of the pivot and the pursuit of doing what you love. Workers are creating careers that look a bit like a buffet, sticking with a job for two to five years on average before moving on. Workers today are constantly thinking about the next move, because their career is what they make of it. It’s up to them, not a single company, to build it.

This is a challenge. 21st century workers are still being trained with a 20th century model, learning skills for specific careers instead of for a buffet.

Workers are living on a financial edge.

The 21st century, with its easy credit and rough economy, has created a breed of worker who lives on the financial edge. According to Gallup, about one third of workers believe they are financially stable and could live for up to a year without a job, but that perception may not be entirely accurate. Fifty percent of American workers think it unlikely that they will lose their job, and that affects how positively they view their own financial situation.

Lower-income and younger workers are more likely to lose their jobs, and the older you are, the further from that financial edge you get. Gallup discovered that 60 percent of workers under the age of 35 could last only a month or less before “experiencing financial hardship.” Past age 35, that figure drops to 40 percent.

But 40 percent is still a lot, and it is perhaps explained by two factors: 1) the habit of saving money is fading in newer generations, and 2) younger workers are making less money, which affects their ability to save.

Wages are up, but buying power hasn’t increased.

Surprised to hear that today’s worker is making less money? It might not seem that way until you take other factors into consideration. According to the U.S. Bureau of Labor Statistics, the average hourly wage for non-management private-sector workers in September 2014 was $20.67. That’s an increase in hourly wage from the previous century, but what does it really mean?

Once you take inflation into account, buying power has remained basically the same as it was in…1979. In fact, in terms of buying power, wages peaked for workers in 1973. That doesn’t sound very 21st century.

Working from home is on the increase.

Despite Yahoo! putting the kibosh on telecommuting, the trend is for workers to work from home in ever-increasing numbers. Right now, 20 percent of people work from home in some form, and in the next five years that number is expected to grow by 63 percent.

The great news is that 47 percent of those who have the option are very happy with it, compared to 27 percent of those who have to work at the office. Most like the option because they want to avoid the commute, but greater flexibility, increased productivity, and money saved (transportation, lunch, day care) also play into it.

Mobile connectivity at work leads to stress.

The 21st century has certainly started off as the age of the smart mobile device.

Workers in the U.S. who spend their “off time” connected to email or working remotely experience more stress than workers who don’t. Almost 50 percent of workers who frequently respond to work email on their own time report experiencing stress; among those who don’t, only 36 percent do.

Oddly, those same high-stress workers rate their highly connected lives as better.

What’s the driving force behind this stressful connectivity? Employers in the 21st century expect it now that it’s possible: 62 percent of workers who say their employers expect them to be available on their mobile devices outside of work also say they spend a lot of time emailing during those off hours.

The workforce is growing more diverse.

By 2050, the United States will not have any clear racial or ethnic majority, and this trend toward diversity has clearly changed how the 21st-century workforce looks. Minorities and women have continued to grow in the workforce, both in their numbers and in the kinds of jobs they now hold.

Between 1980 and 2020, the end of one century and the beginning of the next, the white working-age population will decline. This is already happening, and it’s easy to see why: younger generations include more minorities, while the older workers who are retiring are mostly white. Over this 40-year period, the minority share of the workforce will double, from about 18 percent to 37 percent.

The concept of what work is has changed.

What we think work is, both culturally and in how it fits into our lives, has changed in the 21st century. Entrepreneurship, including micro-entrepreneurship, has led the way, with 21st-century workers wanting to work for themselves instead of for someone else.

Coworking, a tough economy and job market, and technology that allows low-cost entry into business ownership have helped push younger workers (and some older ones) into starting their own businesses.

This drive towards entrepreneurship and making a success of a business plays into the steady increase in more hours per week being dedicated to work.

Technology has made significant changes in how and what work is done. Fewer people are required to generate the same manufactured output thanks to technology, allowing (or forcing) people to shift to urban centers and find other kinds of work. Technology has allowed human workers to be unshackled from the office, giving them greater freedom.

Work in the 21st century is about being flexible and mobile, ready for change both in skills and financial savings.

About The Author: Julie R. Neidlinger is a writer, artist, and pilot from North Dakota. She has been blogging since 2002 at her Lone Prairie blog, and works as a freelance writer and visual artist.