
How Emergencies Like Coronavirus Expose Nonsense Business Practices

Early in the coronavirus outbreak in the United States, telecom company AT&T graciously opted to remove its broadband caps once it was clear that a great many Americans would be stuck at home for the foreseeable future. However, for some, the fact that AT&T and other internet providers had broadband caps at all was pretty ungracious in the first place. In that sense, it’s part and parcel of an age-old process wherein a business practice that’s utterly absurd goes largely unnoticed until a disaster calls attention to it.

Throughout American history, sometimes the only silver lining to a tragedy is that it helps to expose a fault in the system. As Warren Buffett once said: “Only when the tide goes out do you discover who’s been swimming naked.”

So, as the economy is rocked by the extended shutdown intended to slow the spread of the coronavirus, it’s worth taking a look at how other major crises jarred society into making important changes. These are occasions when certain business practices were exposed as unethical or even deadly.

Here’s how some of those past business practices were altered after a serious emergency exposed them as nonsense.

Last updated: April 6, 2020

The Triangle Shirtwaist Fire: What It Was

Labor laws in the late 19th century were, well, largely nonexistent. This was long before things such as a minimum wage or the eight-hour workday were established, so low-skill, low-wage workers routinely were forced into menial positions for 12 or more hours a day, six days a week, with no vacation, sick leave or other benefits — all for wages that left most of them barely surviving in crowded tenements. And if the grinding hours for low pay alone weren’t enough, safety standards in industrial workplaces were similarly absent, leaving millions of workers to put their lives on the line, day after day.

The Triangle Shirtwaist Company was one such employer, cramming more than 100 seamstresses into a workspace on the eighth, ninth and 10th floors of a building in lower Manhattan. The women there struggled to organize for better working conditions, hours and wages, and ultimately never got the chance.

Read: These Billion-Dollar Companies Changed the Way We Live

What Happened

On March 25, 1911, a fire broke out in the workshop for the Triangle Shirtwaist Company. In the dirty conditions, with rags and other materials strewn about, the fire spread rapidly. What’s more, the doors to the shop had been locked as part of an effort to combat theft.

What followed was a horrifying tragedy in which 146 workers, most of them young women, died. Some burned to death, while others perished after jumping to escape the flames. Pictures of the women who plunged to their deaths rather than be burned alive shocked the conscience of New York City and the nation, leaving many to question whether the laissez-faire approach to government regulation of business was morally defensible.

What Changed

While it took decades, the Triangle Shirtwaist fire played a crucial role in calling attention to the way millions of Americans were being exploited for their labor. It galvanized Progressive Era reformers, who pushed for significant changes, including child labor laws and new regulations setting standards for workplace safety. That push culminated in a raft of New Deal legislation that enshrined many essential features of today’s labor market, including the 40-hour workweek and a minimum wage, in law.

Johnstown Flood: What It Was

Pennsylvania’s South Fork Dam was completed in 1852 to create the Western Reservoir, which supplied water to the canal between Johnstown and Pittsburgh, but the canal system went out of use soon after. The dam sat for years — without undergoing important maintenance — until it was purchased by the South Fork Fishing and Hunting Club. The club’s membership included a number of wealthy, prominent Pittsburgh elites, such as Andrew Carnegie, Andrew Mellon and Henry Clay Frick, who were looking for a country playground where they could escape city life. The lake created by the dam proved to be just what they were looking for. Johnstown, a steel town with a largely working-class population, was a world apart from these wealthy industrialists.

Meanwhile, the South Fork Fishing and Hunting Club made a series of modifications to the dam that made it less stable — such as lowering it so that a road could be built on top — and made no efforts to further repair or upgrade the structure.

What Happened

When a severe rainstorm hit the region on May 31, 1889, it poured an enormous amount of excess water into the reservoir. The dam, long neglected and weakened by the club’s modifications, gave way, sending 20 million tons of water into the valley below. The flash flood consumed Johnstown, killing 2,209 people.

The grieving survivors ultimately continued to suffer as they tried to find some recourse in the courts for what had happened. Despite the astonishing negligence on the part of the members of the club that led to the disaster, the structure of their organization meant it was impossible for the survivors to hold the elites liable for their fatal mistakes.

Check Out: The Best and Worst Things About Working From Home

What Changed

The fact that the incredibly rich men responsible for so much death and destruction got off without so much as a slap on the wrist prompted some serious rethinking of the legal system. What followed was a fundamental shift in how liability is handled. Since then, generations of Americans have been able to seek redress in the courts when they’ve been seriously wronged.

The 1920s Stock Market: What It Was

As volatile as the world of stock trading is today, it was far more chaotic in its early days. Before modern-day regulations on Wall Street, the market essentially operated like the Wild West. The biggest players routinely engaged in a variety of bad acts. Wall Street insiders banded together to perpetrate schemes such as the pump and dump (buying up a certain stock, generating buzz to drive up the price and then, as other buyers poured in, selling at a profit) or outright price manipulation.

On the whole, the basic structure of the stock market was tilted against the average investor in the worst way. The major brokers and financiers essentially could abuse the system however they liked, and as long as the roaring market kept rising, not many people saw much reason to do anything about it. If anything, plenty of average investors started running up debt to play the market — what’s known as “buying on margin” — thinking that they might someday achieve the riches of the men at the top of the pyramid, even as those men were exploiting them for those profits.
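For readers who want to see why buying on margin made the eventual losses so devastating, here’s a rough sketch of the arithmetic. It’s an illustration only, with made-up numbers rather than anything drawn from 1929; the point is simply that borrowing magnifies both gains and losses.

```python
# Hypothetical illustration of "buying on margin": borrowing to buy stock
# magnifies both gains and losses. All numbers are made up; loan interest
# and broker fees are ignored for simplicity.

def margin_return(own_cash: float, borrowed: float, price_change: float) -> float:
    """Percentage gain or loss on the investor's own cash after a price move."""
    position = own_cash + borrowed                # total stock purchased
    new_value = position * (1 + price_change)     # value after the price move
    equity = new_value - borrowed                 # the loan still must be repaid
    return (equity - own_cash) / own_cash * 100

# $1,000 of the investor's own cash plus $1,000 borrowed from a broker:
print(margin_return(1_000, 1_000, 0.10))   # +10% market move -> +20% return
print(margin_return(1_000, 1_000, -0.50))  # -50% market move -> -100%, wiped out
```

The more an investor borrows relative to their own cash, the smaller the downturn it takes to wipe them out entirely — a big part of why the eventual crash hit margin buyers so hard.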

What Happened

The Roaring ’20s were a time of excess, free spending and easy credit that helped paper over the worst abuses. However, when the fundamentally weak economy finally caught up with America, markets crashed hard, leading to a massive deep freeze known as the Great Depression. Suddenly, the conduct of wealthy investors — or rather, con artists who called themselves “investors” — was under the microscope and Americans didn’t like what they saw.

What Changed

The market crash of 1929 helped make one fact abundantly clear: A stock market that protects the average investor and makes it relatively safe to put money into stocks does far more to support a growing economy. While the Hoover administration steadfastly refused to take serious action, the incoming Roosevelt administration wasted little time in making major reforms. The Securities Act of 1933 included major legal changes to prevent the worst operators from continuing to exploit the broader public, including stricter regulation of how securities are issued and tighter rules on stock trading to prevent price manipulation.

Overharvesting in the Plains States: What It Was

It’s long been said that when the early pioneers started plowing America’s Great Plains, the plow sounded like a zipper as it cut through the dense networks of grass roots in land that had never before been cultivated. For decades, it was assumed that the fertile land of America’s interior was a virtually unlimited resource. That included the period during World War I when Herbert Hoover, the future president, pushed hard to get farmers to boost yields to help feed American allies overseas. It worked in the short term, as farmers responded and brought in a series of bumper crops. However, their approach to getting the most out of their land would prove catastrophically shortsighted.

What Happened

What farmers didn’t realize in the 1920s was that the topsoil that made the plains so fertile had been held in place over the centuries by those wild grasses and their network of roots. Once the land was being regularly tilled, that fertile topsoil started to blow away in the wind. It culminated in a massive, ongoing natural disaster known as the Dust Bowl — a tragedy made all the worse by striking in the midst of the Great Depression.

An untimely drought set off a massive disaster, including crop failures, farm foreclosures and dust storms so bad they could reduce visibility to a foot or less in broad daylight. A generation of farmers was left moving around the country desperately searching for work. Many were known as “Okies” because they came from Oklahoma, and their plight was detailed in the classic novel “The Grapes of Wrath.”

What Changed

The wet period that eventually ended the Dust Bowl didn’t last long, and another drought hit in the 1950s, producing similar problems. However, farmers and the government had made important changes to help mitigate the issue and keep the soil healthy and where it belonged. That included rotating crops and planting “cover” crops that aren’t harvested but protect the land. What’s more, the four million or so acres of land claimed by the U.S. government during the Dust Bowl so it could be returned to its natural grassland ecology played a significant role in mitigating the problem. And finally, new technology helped farmers tap into the enormous Ogallala Aquifer so that they weren’t entirely dependent on rainfall to water their crops. In the decades that followed, American agriculture learned enough of a lesson to avoid repeating the mistakes that led to the disaster.

Enron: What It Was

Since the 1930s and the passage of the Securities Act, average investors have been able to make a basic assumption about public companies: If a company wants to make its shares available for the general public to purchase, it must engage in a number of practices that make its financials transparent. Companies submit quarterly accounting updates to the Securities and Exchange Commission, and anyone — whether an investor or not — can access those public filings to see whether a company is financially healthy and structurally sound.

In turn, the largest American public companies — the blue-chip stocks — came to be seen as institutions that brought with them a promise of reliability. These are the sort of companies that financial advisors confidently recommend as part of a safe retirement plan.

Related: Enron and the 24 Other Most Epic Corporate Downfalls of All Time

What Happened

Unfortunately, just because a company submits quarterly reports about its finances doesn’t mean the updates are truthful. In the 1990s, Enron and its accounting firm, Arthur Andersen, engaged in wholesale criminal fraud that hid risky business practices and gave the impression of a very stable energy company experiencing rapid growth. This allowed it to become a mainstay among the retirement accounts of a great many Americans, people who thought they were investing in the sort of solid, large-cap company that would experience the occasional hiccup — like any other stock — but not the sort of drastic drops you see from smaller, riskier stocks.

When the massive fraud was discovered and made public in late 2001, the ripple effect was dramatic and swift. The sudden collapse of the company meant a massive hit to investor portfolios and to the market as a whole, as people’s trust in public markets was severely shaken.

What Changed

You might never have heard of the Sarbanes-Oxley Act of 2002, but that bill ended up being one of the most impactful pieces of securities legislation since the 1930s. It led to a number of major changes to the way public companies are audited and regulated, to ensure that the sort of accounting shenanigans that allowed Enron to grow as large as it did wouldn’t happen again. The new standards were onerous, but they made defrauding the public much more difficult, helped restore confidence in markets and protected investors.

The Housing Crisis: What It Was

Mortgage-backed securities actually had a long, prosperous history — the first was issued in 1968. It was a win-win. By pooling together thousands of mortgages into a single investment product, investors had an easy way to invest in the U.S. housing market. The additional demand created by investors helped make mortgages cheaper for everyone, and the investors got an excellent, safe investment with a long track record of success. Housing prices were on a steady, decades-long climb, and Americans typically were responsible about paying their mortgages.

That led to a spike in demand for these mortgage-backed securities, as investors clamored for a chance at what appeared to be safe returns. Investment banks borrowed massive sums, “leveraging” themselves so that they could invest even more. Dangerous as that might seem, the same investors and banks felt confident — both because the housing market was traditionally so stable and because they had used various complex instruments such as derivatives and credit default swaps to hedge their positions.
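To make the mechanics a little more concrete, here’s a rough sketch of how pooling and leverage interact. All of the numbers are hypothetical; the point is simply that a thin slice of capital behind a large, borrowed position leaves little room for defaults to run above expectations.

```python
# Hypothetical illustration: a pool of mortgages bundled into one security,
# held by a leveraged buyer. All figures are made up for the example.

mortgages = 1_000                  # number of loans pooled together
balance_per_loan = 200_000         # average loan balance (hypothetical)
pool_value = mortgages * balance_per_loan

def pool_loss(default_rate: float, recovery_rate: float = 0.6) -> float:
    """Dollar loss on the pool if a given share of loans default,
    assuming part of each defaulted balance is recovered in foreclosure."""
    return pool_value * default_rate * (1 - recovery_rate)

# A leveraged buyer: $20 million of its own capital, the rest borrowed.
capital = 20e6
for default_rate in (0.02, 0.10, 0.25):
    loss = pool_loss(default_rate)
    print(f"{default_rate:.0%} defaults -> ${loss / 1e6:.0f}M loss, "
          f"{loss / capital:.0%} of the buyer's own capital")
```

At low default rates the pooling works as advertised, but once defaults climb, losses quickly consume the buyer’s entire capital — roughly the dynamic that played out across the financial system in 2008.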

What Happened

The demand for more mortgage-backed securities meant investment banks needed more mortgages. They pushed lenders to supply them, and lenders in turn started lowering their standards and taking on more subprime borrowers. That meant the newer securities were far riskier than those of the past, but the banks packaging them also pressured credit rating agencies not to reflect that risk. And those derivatives and swaps? With no regulation ensuring that the firms issuing them had the money to back them up, there often wasn’t any actual money backing them up.

It all came to a head — as you probably remember well — with the financial crisis of 2008. It slowly became clear that the entire system was a house of cards, prompting a collapse that was global in scope and required a massive bailout of both the banks and insurer AIG.

What Changed

The biggest change made to protect the economy from the sort of abuses that led to the financial crisis was the passage of the Dodd-Frank Act of 2010. The bill required the largest banks to maintain bigger capital cushions to absorb sudden losses. It also included a rule, known as the Volcker Rule, intended to keep the biggest banks from speculating with their own funds. And while it’s still unclear just how successful these changes will be in the long term, they at least have forced banks to limit how much risk they take, given the important role they play in the broader economy.

Look: 21 Smartest Ways To Invest Your Money Right Now

Instacart Worker Relations: What It Is

The “gig economy” has changed a lot about how many Americans make a living, with part-time or on-demand work connected to an app on a smartphone becoming a way of life for a lot of people. That level of flexibility has been a boon for many, allowing them to fit their livelihood around other aspects of their life — be that a budding career as an actress or a need to care for a sick relative.

However, with that flexibility also comes a lack of the stability and consistency that many typically associate with a job. The legal wrangling over just what these companies owe their workers beyond their wages has been bouncing from state to state for years as many gig workers increasingly find themselves working harder for less — usually without benefits or sick leave. That can leave many in vulnerable situations where having to miss work means losing income they can’t afford to lose.

One such company is the grocery delivery company Instacart, which pairs users with a “shopper” who will fill their order at the grocery store of their choice and deliver it to their door.

What Is Happening

The new coronavirus pandemic is tailor-made to help customers see the value of a service such as Instacart, and it can be a lifesaver — literally — for many people who are especially vulnerable and can’t chance a trip to the store. However, the crisis has also exposed just how little support the company appears to offer its front-line workers.

Instacart shoppers organized a strike on March 30 in an effort to make the company shell out for some basic protections for its workers, with thousands of its 200,000 shoppers refusing to show up for work. Their objections? They say the two weeks of sick pay promised to any shopper who contracts the virus isn’t being honored and that the qualification requirements aren’t realistic. They also claim the company hasn’t provided them with basic safety and cleaning equipment to stay healthy, isn’t offering hazard pay in spite of the critical state of affairs and won’t give sick leave to shoppers with pre-existing conditions that make them especially vulnerable.

What Might Change

This one is necessarily a speculative exercise. There’s no knowing yet just how Instacart’s leaders will respond to the strike, but it remains possible that the coronavirus pandemic ultimately will shine a light on the great many employers whose business models rely heavily on low-wage contract workers. The sudden freezing of much of the American economy has hit hourly workers the hardest. They often live paycheck to paycheck and can scarcely afford to take time off. They also are among the least likely to have been able to build up an emergency fund before the pandemic.

While it’s going to be months — if not longer — before things return to normal, the odds are good that there will be more support for redefining that “normal” to include basic protections for gig workers. Not to mention, support for a long-overdue increase in the minimum wage likely will continue to build, as the vulnerability of low-wage workers and their families has been on clear display in recent weeks.

Data Caps: What They Are

It’s hard to remember, given how thoroughly they’ve come to dominate our everyday lives, but things like smartphones and even the internet are still relatively young; just 30 years ago, a majority of Americans had access to neither. The rapid growth in these services over the years has meant that internet service providers have had to work hard to expand their infrastructure to support ever-increasing demand from streaming video and other data-heavy activities.

To help them cope with that rapid increase, some ISPs put caps on data to limit how much users could stream or download in a month. The companies contend the caps were necessary so they could offer everyone quality service at affordable prices without overtaxing the infrastructure they need to deliver it.

What Is Happening

Some ISPs, such as AT&T, have lifted their data caps during the coronavirus pandemic, acknowledging that people are going to use significantly more data while they’re stuck at home and staying safe. However, that’s also calling attention to whether those data caps are necessary — or defensible — in this day and age. While the cost of providing data used to be relatively high, advances in technology and a more robust underlying infrastructure mean that data caps are now largely arbitrary limits that don’t serve any particular purpose. With the basic cost of providing data reduced to almost nothing, ISPs are making enormous profits on these services — all while keeping those caps in place.

And while it’s clearly the right move to relax them during this crisis, it’s also drawing some unwanted attention for the ISPs to their existence in the first place.

What Might Change

Clearly, a cap on internet data is a far less serious matter than the human tragedies that claimed thousands of lives, and it’s an especially easy issue for the ISPs to address. At the very least, it’s not hard to see how enough public pressure could lead these consumer-facing companies to do away with this outdated policy. After all, the additional cost to them would be negligible, if anything.

The better question is whether or not this issue — not to mention, having so many Americans at home and using their internet so frequently — will lead to bigger questions about how ISPs operate. Namely, the nature of the marketplace that allows ISPs to operate as monopolies. As long as each company is the only game in town, there’s very little pressure for them to cater to customers’ wishes. Will Americans emerge from their coronavirus-induced semi-hibernation to demand that they have some actual choice in their ISP? Only time will tell.


This article originally appeared on GOBankingRates.com: How Emergencies Like Coronavirus Expose Nonsense Business Practices