The US Funded Universal Childcare During World War II—Then Stopped

When the United States started recruiting women for World War II factory jobs, there was a reluctance to call stay-at-home mothers with young children into the workforce. That changed when the government realized it needed more wartime laborers in its factories. To allow more women to work, the government began subsidizing childcare for the first (and only) time in the nation’s history.

An estimated 550,000 to 600,000 children received care through these facilities, which cost parents around 50 to 75 cents per child, per day (in 2021, that’s less than $12). But like women’s employment in factories, the day care centers were always meant to be a temporary wartime measure. When the war ended, the government encouraged women to leave the factories and care for their children at home. Despite receiving letters and petitions urging the continuation of the childcare programs, the U.S. government stopped funding them in 1946.

WWII Highlights Need for Childcare

Before World War II, organized “day care” didn’t really exist in the United States. The children of middle- and upper-class families might go to private nursery schools for a few hours a day, says Sonya Michel, a professor emerita of history, women’s studies and American studies at the University of Maryland-College Park and author of Children’s Interests/Mothers’ Rights: The Shaping of America’s Child Care Policy. (In German communities, five- and six-year-olds went to half-day Kindergartens.)

For children from poor families whose father had died or couldn’t work, there were day nurseries funded by charitable donations, Michel says. But there were no affordable all-day childcare centers for families in which both parents worked—a situation that was common for low-income families, particularly Black families, and less common for middle- and upper-class families.

The war temporarily changed that. In 1940, the United States passed the Defense Housing and Community Facilities and Services Act, known as the Lanham Act, which gave the Federal Works Agency the authority to fund the construction of houses, schools and other infrastructure for laborers in the growing defense industry. It was not specifically meant to fund childcare, but in late 1942, the government used it to fund temporary day care centers for the children of mothers working wartime jobs.

Communities had to apply for funding to set up day care centers; once they did, there was very little federal involvement. Local organizers structured childcare centers around a community’s needs. Many offered care at odd hours to accommodate the schedules of women who had to work early in the morning or late at night. They also provided up to three meals a day for children, with some offering prepared meals for mothers to take with them when they picked up their kids.

“The ones that we often hear about were the ‘model’ day nurseries that were set up at airplane factories [on the West coast],” says Michel. “Those were ones where the federal funding came very quickly, and some of the leading voices in the early childhood education movement…became quickly involved in setting [them] up,” she says.

For these centers, organizers enlisted architects to build attractive buildings that would cater to the needs of childcare, specifically. “There was a lot of publicity about those, but those were unusual. Most of the childcare centers were kind of makeshift. They were set up in [places like] church basements.”

Though the quality of care varied by center, there hasn’t been much study of how this quality related to children’s race (in the Jim Crow South, where schools and recreational facilities were segregated, childcare centers were likely segregated too). At the same time the United States was debuting subsidized childcare, it was also incarcerating Japanese American families in internment camps. So although these childcare facilities were groundbreaking, they didn’t serve all children.

Subsidized Childcare Ends When War Ends

When the World War II childcare centers first opened, many women were reluctant to hand their children over to them. According to Chris M. Herbst, a professor of public affairs at Arizona State University who has written about these programs in the Journal of Labor Economics, a lot of these women ended up having positive experiences.

“A couple of childcare programs in California surveyed the mothers of the kids in childcare as they were leaving childcare programs,” he says. “Although they were initially skeptical of this government-run childcare program and were worried about the developmental effects on their kids, the exit interviews revealed very, very high levels of parental satisfaction with the childcare programs.”

As the war ended in August 1945, the Federal Works Agency announced it would stop funding childcare as soon as possible. Parents responded by sending the agency 1,155 letters, 318 wires, 794 postcards and petitions with 3,647 signatures urging the government to keep the centers open. In response, the U.S. government provided additional funding for childcare through February 1946. After that, it was over.

Lobbying for national childcare gained momentum in the 1960s and ’70s, a period when many of its advocates may have themselves gone to World War II day care as kids. In 1971, Congress passed the Comprehensive Child Development Act, which would have established nationally-funded, locally-administered childcare centers.

This was during the Cold War, a time when anti-childcare activists pointed to the fact that the Soviet Union funded childcare as an argument for why the United States shouldn’t. President Richard Nixon vetoed the bill, arguing that it would “commit the vast moral authority of the National Government to the side of communal approaches to child rearing over against the family-centered approach.”

In this case, “family-centered” meant the mother should care for the children at home while the father worked outside of it—regardless of whether this was something the parents could afford or desired to do. World War II remains the only time in U.S. history that the country came close to instituting universal childcare.


Universal childcare isn’t only possible in the U.S., we’ve done it before

As America emerges from the Dark Year of the COVID-19 pandemic, one thing is crystal clear: we have a childcare crisis in this country. While jobs are opening up nearly everywhere, employers struggle to find employees to fill their vacant positions. One reason is that the jobs people are being offered pay less than the ones they left due to the pandemic.

However, a bigger issue is that childcare is now more unaffordable for families than ever. A significant portion of the potential workforce is having to choose between staying home with their kids and taking a job that barely covers their childcare costs.

On top of that, despite the high costs of childcare, childcare providers aren’t being paid wages commensurate with their service to society. As one person put it to me recently: childcare isn’t too expensive, it’s unaffordable, and childcare workers deserve every single cent they earn and then some.

The result will be an ever-widening gulf between wealthy families and those of more modest means. It also means more children in poverty with worse developmental opportunities. It is a real problem that will require a federal response if the U.S. economy is ever going to return to its pre-pandemic robustness.

The answer: Universal childcare.

You might think that universal childcare is a pipe dream that could never happen in America. But, the truth is, we have done this before.

On June 29, 1943, the U.S. Senate passed legislation that created the first and, so far, ONLY national childcare program in American history. It was part of a larger program passed earlier, known as the Lanham Act, which funded infrastructure around the country to support the wartime effort. Two newly created agencies, War Public Works (WPW) and War Public Services (WPS), were responsible for executing the provisions of the Lanham Act, and the WPS focused almost exclusively on childcare.

There was an incredible need for this program because women, who had been a tiny proportion of the workforce prior to the start of World War II in 1939, were making up an increasing share. At the height of the war, in fact, nearly a quarter of all married women were working outside the home, and 36% of all women of working age were in the workforce. They worked in a number of industries, making things like munitions and, like the famous Rosie the Riveter, building aircraft. These women were desperate for reliable childcare, and so were the companies they worked for, which were struggling with absenteeism among women who had limited or no access to care for their children.

Congress dedicated $52 million to this effort, an amount equivalent to over $800 million in 2021 dollars. With contributions from state and local governments, a total of $78 million (over $1.2 billion today) was dedicated to making universal childcare a reality in America.
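As a rough check on those conversions, here is a minimal sketch; the multiplier is an assumption reverse-engineered from the article’s own figures ($52 million to “over $800 million”), not an official CPI statistic:

```python
# Back-of-the-envelope inflation adjustment: 1943 dollars -> 2021 dollars.
# The ~15.6x multiplier is an assumption inferred from the figures above,
# not an official CPI value.
CPI_MULTIPLIER_1943_TO_2021 = 15.6

def to_2021_dollars(amount_1943: float) -> float:
    """Convert a 1943 dollar amount to approximate 2021 dollars."""
    return amount_1943 * CPI_MULTIPLIER_1943_TO_2021

federal_funding = 52_000_000   # congressional appropriation
total_funding = 78_000_000     # including state and local contributions

print(f"Federal: ${to_2021_dollars(federal_funding):,.0f}")  # ~$811,200,000
print(f"Total:   ${to_2021_dollars(total_funding):,.0f}")    # ~$1,216,800,000
```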

Communities, mostly through user fees, contributed an additional $26 million. At its July 1944 peak, 3,102 federally subsidized child care centers, with 130,000 children enrolled, were located in all but one state and in D.C. By the end of the war, between 550,000 and 600,000 children are estimated to have received some care from Lanham Act programs.

As the war began to turn in the Allies’ favor, the funding began to dry up. However, another $7 million (almost $108 million today) was added after Americans around the country aggressively lobbied Congress.

[A]fter the drop in war production needs following the spring 1945 Allied victory in Europe, the FWA granted fewer project approvals or renewals. In mid-August 1945, once victory in Japan was assured, the agency announced all Lanham Act funding of childcare centers would cease as soon as possible, but in no case later than the end of October 1945. Approximately 1 month after this announcement, the FWA reported it had received 1,155 letters, 318 wires, 794 postcards, and petitions signed by 3,647 individuals urging continuation of the program. Principal reasons given were the need of servicemen’s wives to continue employment until their husbands returned, the ongoing need of mothers who were the sole support of the children, and inadequacy of other forms of care in the community.

The success of the Lanham Act in answering the country’s need for widespread, affordable childcare is a vivid demonstration of what we can do collectively when we are in a moment of crisis. What drove the need in the early 1940s was the large proportion of women in the workplace. And, in 2021, that proportion is even higher than it was at the height of World War II. Today, nearly 60% of women work outside the home, more than 1½ times the wartime share. But in 2021, families do not have access to the type of programs created by the Lanham Act.

If there was ever proof that childcare is infrastructure, the Lanham Act is it. We need a 21st Century Lanham Act for America to make sure our economy surges back and no family is left behind.


Study: 1940s-Era Universal Child Care Program Had Positive Effects on Children

By Christina A. Samuels — January 15, 2014

A World War II-era child care program that allowed mothers to enter the workforce in record numbers led to positive effects in children that lasted into their adulthood, says an analysis by Chris M. Herbst, a professor at Arizona State University. (Herbst has also condensed his report into slides.) The benefits were particularly strong among economically disadvantaged adults, said the paper, which was released last month. The program was also unique in that it both generated increases in employment for women and supported positive outcomes for children, said Herbst, who has also conducted research into the modern-era Child Care and Development Block Grant program.

During World War II, the federal government used the Lanham Act of 1940—legislation that provided federal grants or loans for war-related infrastructure projects—to pay for child-care facilities in areas where many women were employed in defense-related industries. The Lanham Act centers operated for just three years, from 1943 to 1946, eventually serving about 600,000 children ages 0 to 12, regardless of family income. At its peak in 1944, 130,000 children were enrolled in centers. Lanham Act spending was highest in California, Washington, Oregon, Florida and Arizona, where many defense-related industries were located.

Lanham Act centers varied in quality, Herbst said in his paper: Some centers operated on a 24-hour schedule to accommodate factory shifts, and in some cases, preschoolers spent 12 hours a day in care, where they were provided hot meals and time to play and nap. Older children who were enrolled in school received before- and after-school care.

In California, children in centers were given a medical exam, parents were asked about the developmental history of their children, and teachers received in-service training and college credit. By contrast, a Baltimore center reportedly housed 80 children in one room with one bathroom, prepared meals on a hot plate, and required children to cross a highway to get to a playground.

At the end of the war, the federal government abruptly cut funding, which allowed Herbst to make comparisons between the group of children who would have been eligible for enrollment because of their ages—those born from 1931 to 1946—and an otherwise similar cohort of children born from 1947 to 1951, who would not have been eligible. Using census data from 1970, 1980 and 1990, Herbst found that children in the group that would have been affected by the existence of the Lanham Centers had higher rates of graduation, marriage, and employment.
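For readers curious what that cohort comparison looks like mechanically, here is a minimal sketch; the DataFrame and column names are hypothetical stand-ins for a census extract, and Herbst’s actual analysis uses census microdata with many statistical controls, not a raw difference in means:

```python
# Minimal sketch of the birth-cohort comparison described above.
# The column names ("birth_year", outcome columns) are hypothetical;
# the real study applies many controls beyond this raw comparison.
import pandas as pd

def cohort_gap(census: pd.DataFrame, outcome: str) -> float:
    """Mean outcome for age-eligible cohorts (born 1931-1946) minus
    the mean for the ineligible comparison cohorts (born 1947-1951)."""
    eligible = census[census["birth_year"].between(1931, 1946)]
    ineligible = census[census["birth_year"].between(1947, 1951)]
    return eligible[outcome].mean() - ineligible[outcome].mean()

# Hypothetical usage with a pooled 1970/1980/1990 census extract:
# census = pd.read_csv("census_extract.csv")
# print(cohort_gap(census, "employed"))
```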

The census data does not track adults by whether they were enrolled in the centers. But Herbst hypothesized that even children who were not directly enrolled in Lanham Centers during their brief period of operation were affected by their presence, because the centers altered the views of families about institutionalized child care. Some of those mothers might have enrolled their children in other centers, for example.

The existence of the centers also helped more women enter the workforce, as was their intent. That increased employment continued for several years after the program was shuttered, and was seen among married and unmarried women and low- and high-skilled employees.

Herbst said there are a few takeaways from his report that are relevant for today’s policy discussions around child care: First, “the benefits appear to be particularly large for the most economically disadvantaged adults,” and that could be linked to the fact that children from varying economic backgrounds were cared for together. The higher-income children could have had positive spillover effects on their classmates, he said.

A second noteworthy element of the Lanham Centers is that mothers were not required to work to take advantage of the care, even though most did work. That makes them stand out against the child care block grants, where quality often takes a back seat to a parent’s need to find any available center in order to hold on to benefits, Herbst said. Without that pressure to find a job, mothers were able to look around for the best fit and for higher-quality centers, the paper hypothesizes.


The Lanham Act and Universal Childcare During World War II

On June 29, 1943, the U.S. Senate passed the first, and thus far only, national childcare program, voting $20,000,000 to provide for public care of children whose mothers were employed for the duration of World War II.

During the war, the federal government offered grants for child care services to authorized community groups that could demonstrate a war-related need for the service. The program was justified as a war expedient necessary to allow mothers to enter the labor force and increase war production.

Funding authorization came through the 1941 Defense Public Works law (Title II of the 1940 National Defense Housing Act), popularly known as the Lanham Act. The law was designed to assist communities with water, sewer, housing, schools, and other local facilities’ needs related to war and industry growth. This act was one of several Congress passed giving general defense mandates to the Federal Works Agency (FWA). The FWA, established in 1939, was created to oversee and coordinate the activities of five major New Deal alphabet agencies. With the outbreak of WWII in Europe in 1939, the FWA began to shift its mandate from these New Deal programs to defense mobilization, even before the attack on Pearl Harbor in December 1941.

Children at a childcare center sit for “story time.” (Gordon Parks / Library of Congress / The Crowley Company)

Both during the pre-Pearl Harbor mobilization and during the nation’s direct military involvement, the FWA was given three general defense mandates: (1) build and expand the system of strategic roads to better serve the needs of national defense, (2) construct sufficient defense worker housing, and (3) support the workers in such housing by providing public works assistance to cities that saw a large influx of workers which, in some cases, doubled or tripled the area’s population. It was because of this mandate to provide public works assistance that the 1941 Lanham Act was passed. In total, 4,000 projects were approved under the Lanham Act during the war at a cost of $457 million, $351 million of which was federally funded with the remainder funded by the affected communities.

The newly created War Public Works (WPW) and War Public Services (WPS) had the primary responsibility to carry out these duties. The WPS’s largest role, both in funds and number of projects, was providing day care for the children of women who took jobs in factories or worked in government offices to help the war effort. By the conclusion of the war, the WPS had created day care centers in 386 communities. Without this service, many of the six million women who participated in the war effort could not have done so.

The federal government granted $52 million for childcare under this Act from August 1943 through February 1946, which is equal to more than $1 billion today. Communities, mostly through user fees, contributed an additional $26 million. At its July 1944 peak, 3,102 federally subsidized child care centers, with 130,000 children enrolled, were located in all but one state and in D.C. By the end of the war, between 550,000 and 600,000 children are estimated to have received some care from Lanham Act programs.

Children having lunch with a trained assistant at a childcare center opened September 15, 1942, in New Britain, Connecticut. The center served thirty children, aged two to five, of mothers engaged in war industry. (Gordon Parks/Library of Congress)

An increase in female labor force participation was the primary impetus for World War II-era childcare funding. The employment upsurge coincided with the federal government’s campaign (headlined by Rosie the Riveter) that urged women to aid in the war effort by joining the workforce. Initially, the federal government was reluctant to encourage the employment of mothers with young children, but demands for new workers, especially when issued by aircraft, ship, and bomber manufacturers, proved powerful. These employers also cited absenteeism among women workers as proof of the need for childcare and other household services.

Although the Lanham Act did not explicitly include childcare facilities, the FWA sought and obtained recognition of this service as a necessity of war. The FWA informed nursery school operators that if they could make a war-connected case for their services, they could apply for funds under the Lanham Act. Wartime childcare facilities were also encouraged through short-lived and relatively small grants to state welfare and education agencies. Congress eliminated this funding source in mid-1943 but, at the same time, appropriated new childcare funding under the Lanham Act. It was this funding that was appropriated on June 29, 1943.

An image of Rosie the Riveter that appeared in a 1943 issue of the magazine Hygeia (American Medical Association)

Congressional approval of childcare funds explicitly tied these funds to wartime need, and the FWA repeatedly expressed its commitment to this limitation. Especially after the drop in war production needs following the spring 1945 Allied victory in Europe, the FWA granted fewer project approvals or renewals. In mid-August 1945, once victory in Japan was assured, the agency announced all Lanham Act funding of childcare centers would cease as soon as possible, but in no case later than the end of October 1945. Approximately 1 month after this announcement, the FWA reported it had received 1,155 letters, 318 wires, 794 postcards, and petitions signed by 3,647 individuals urging continuation of the program. Principal reasons given were the need of servicemen’s wives to continue employment until their husbands returned, the ongoing need of mothers who were the sole support of the children, and inadequacy of other forms of care in the community.

The extensive national protest, a concern for families of servicemen still overseas, and lobbying by officials (especially from California where multiple major war manufacturing sites were based) prompted an extended federal commitment of approximately $7 million. The new funds allowed programs to continue operation with federal subsidy until the end of February 1946.

Many consider the Lanham programs to be of landmark importance. Historically, the U.S. government has supported childcare primarily either to promote poor children’s education or to push poor women into the labor force. The Lanham program broke ground as the first and, to date, only time in American history when parents could send their children to federally-subsidized childcare, regardless of income, and do so affordably. By late 1944, a mother could send a child of two to five years of age to childcare for 50 cents per day (about $7 today). That included lunch and snacks in the morning and afternoon.

“Milk time” at the day care center in June of 1943. (Gordon Parks / Library of Congress / The Crowley Company)

The Lanham-funded centers also changed public sentiment about child-rearing. Previously, daycare had been considered a pitiful provision for poor mothers. But, the centers served families across the socioeconomic spectrum and, thus, familiarized the public with sending young children away from the home for part of the day.

Additionally, these centers are seen as historically important because they sought to address the needs of both children and mothers. Rather than simply functioning as holding pens for children while their mothers were at work, the Lanham child care centers were found to have a strong and persistent positive effect on the well-being of children.


The Real Reason the U.S. Has Employer-Sponsored Health Insurance

The basic structure of the American health care system, in which most people have private insurance through their jobs, might seem historically inevitable, consistent with the capitalistic, individualist ethos of the nation.

In truth, it was hardly preordained. In fact, the system is largely a result of one event, World War II, and the wage freezes and tax policy that emerged because of it. Unfortunately, what made sense then may not make as much sense now.

Well into the 20th century, there just wasn’t much need for health insurance. There wasn’t much health care to buy. But as doctors and hospitals learned how to do more, there was real money to be made. In 1929, a bunch of hospitals in Texas joined up and formed an insurance plan called Blue Cross to help people buy their services. Doctors didn’t like the idea of hospitals being in charge, so some in California created their own plan in 1939, which they called Blue Shield. As the plans spread, many would purchase Blue Cross for hospital services, and Blue Shield for physician services, until they merged to form Blue Cross and Blue Shield in 1982.

Most insurance in the first half of the 20th century was bought privately, but few people wanted it. Things changed during World War II.

In 1942, with so many eligible workers diverted to military service, the nation was facing a severe labor shortage. Economists feared that businesses would keep raising salaries to compete for workers, and that inflation would spiral out of control as the country came out of the Depression. To prevent this, President Roosevelt signed Executive Order 9250, establishing the Office of Economic Stabilization.

This froze wages. Businesses were not allowed to raise pay to attract workers.

Businesses were smart, though, and instead they began to use benefits to compete. Specifically, to offer more, and more generous, health care insurance.

Then, in 1943, the Internal Revenue Service decided that employer-based health insurance should be exempt from taxation. This made it cheaper to get health insurance through a job than by other means.

After World War II, Europe was devastated. As countries began to regroup and decide how they might provide health care to their citizens, often government was the only entity capable of doing so, with businesses and economies in ruin. The United States was in a completely different situation. Its economy was booming, and industry was more than happy to provide health care.

This didn’t stop President Truman from considering and promoting a national health care system in 1945. This idea had a fair amount of public support, but business, in the form of the Chamber of Commerce, opposed it. So did the American Hospital Association and American Medical Association. Even many unions did, having spent so much political capital fighting for insurance benefits for their members. Confronted by such opposition from all sides, national health insurance failed — for not the first or last time.

In 1940, about 9 percent of Americans had some form of health insurance. By 1950, more than 50 percent did. By 1960, more than two-thirds did.

One effect of this system is job lock. People become dependent on their employment for their health insurance, and they are loath to leave their jobs, even when doing so might make their lives better. They are afraid that market exchange coverage might not be as good as what they have (and they’re most likely right). They’re afraid if they retire, Medicare won’t be as good (they’re right, too). They’re afraid that if the Affordable Care Act is repealed, they might not be able to find affordable insurance at all.

This system is expensive. The single largest tax expenditure in the United States is for employer-based health insurance. It’s even more than the mortgage interest deduction. In 2017, this exclusion cost the federal government about $260 billion in lost income and payroll taxes. This is significantly more than the cost of the Affordable Care Act each year.

This system is regressive. The tax break for employer-sponsored health insurance is worth more to people making a lot of money than people making little. Let’s take a hypothetical married pediatrician with a couple of children living in Indiana who makes $125,000 (which is below average). Let’s also assume his family insurance plan costs $15,000 (which is below average as well).

The tax break the family would get for insurance is worth over $6,200. That’s far more than a similar-earning family would get in terms of a subsidy on the exchanges. The tax break alone could fund about two people on Medicaid. Moreover, the more one makes, the more one saves at the expense of more spending by the government. The less one makes, the less of a benefit one receives.
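To see roughly where a figure like that could come from, here is a minimal back-of-the-envelope sketch; the marginal rates are illustrative assumptions for a household at that income around 2017, not numbers given in the article:

```python
# Rough estimate of the tax value of excluding a $15,000 employer-sponsored
# premium from taxation. All rates are illustrative assumptions, not figures
# from the article.
PREMIUM = 15_000

FEDERAL_MARGINAL = 0.25    # assumed federal marginal income tax rate
PAYROLL_EMPLOYEE = 0.0765  # Social Security + Medicare, employee side
PAYROLL_EMPLOYER = 0.0765  # employer side (economists treat this as
                           # ultimately borne by the worker)
STATE_MARGINAL = 0.0323    # approximate Indiana flat income tax rate

combined_rate = (FEDERAL_MARGINAL + PAYROLL_EMPLOYEE
                 + PAYROLL_EMPLOYER + STATE_MARGINAL)
tax_break = PREMIUM * combined_rate
print(f"Approximate annual tax break: ${tax_break:,.0f}")  # ~$6,530
```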

The system also induces people to spend more money on health insurance than on other things, most likely increasing overall health care spending. It also means less employer spending on wages: as health insurance premiums have increased sharply in the last 15 years or so, wages have been rather flat. Many economists believe that employer-sponsored health insurance is hurting Americans’ paychecks.

There are other countries with private insurance systems, but none that rely so heavily on employer-sponsored insurance. There are almost no economists I can think of who wouldn’t favor decoupling insurance from employment. There are any number of ways to do so. One, beloved by wonks, was a bipartisan plan proposed by Senators Ron Wyden, a Democrat, and Robert Bennett, a Republican, in 2007. Known as the Healthy Americans Act, it would have transitioned everyone from employer-sponsored health insurance to insurance exchanges modeled on the Federal Employees Health Benefits Program.

Employers would not have provided insurance. They would have collected taxes from employees and passed these on to the government to pay for plans. Everyone, regardless of employment, would have qualified for standard deductions to help pay for insurance. Employers would have been required to increase wages over two years by an amount equal to what had been shunted into insurance. Those at the low end of the socio-economic spectrum would have qualified for further premium help.

This isn’t too different from the insurance exchanges we see now, writ large, for everyone. One can imagine that such a program could have also eventually replaced Medicaid and Medicare.

There was a time when such a plan, being universal, would have pleased progressives. Because it could potentially phase out government programs like Medicaid and Medicare, it would have pleased conservatives. When first introduced in 2007, it had the sponsorship of nine Republican senators, seven Democrats and one independent. Such bipartisan efforts seem a thing of the past.

We could also shift away from an employer-sponsored system by allowing people to buy into our single-payer system, Medicare. That comes with its own problems, as The Upshot’s Margot Sanger-Katz has written. She also has covered the issues of shifting to a single-payer system more quickly.

It’s important to point out that neither of these options has anything even close to bipartisan support.

Without much pressure for change, it’s likely the American employer-based system is here to stay. Even the Affordable Care Act did its best not to disrupt that market. While the system is far from ideal, Americans seem to prefer the devil they know to pretty much anything else.


Free childcare in the US: A forgotten dream?

It sounds like a progressive dream found only in Scandinavia - employers who offer on-site, affordable, education-based childcare 24 hours a day.

Healthy and affordable pre-cooked meals are sent home with the children so parents don't have to cook.

In fact, such a scenario was an American reality for workers during World War Two. President Obama used his State of the Union address this year to laud the programmes and urge the nation to "stop treating childcare as a side issue, or a women's issue" and to make it a national economic priority.

"During World War Two, when men like my grandfather went off to war, having women like my grandmother in the workforce was a national security priority - so this country provided universal childcare," President Obama said. "In today's economy, when having both parents in the workforce is an economic necessity for many families, we need affordable, high-quality childcare more than ever."

In the budget released this week, Mr Obama didn't deliver universal care - instead, he offered an increased tax credit for childcare costs.

World War Two marked the first and only time in US history that the government funded childcare for any working mother, regardless of her income. The programmes were created as women entered the workforce in droves to help with the war effort.

The Kaiser Company Shipyards in Oregon and California were at the vanguard of progressive childcare during World War Two. They hired skilled teachers and some offered pre-made meals for mums to take home and heat up. Some even offered personal shoppers to help women juggle home life and the building of combat ships.

First Lady Eleanor Roosevelt was so impressed that she visited several of the Kaiser centres and urged companies across the United States to emulate the model.

"I don't understand why it hasn't come back," says Natalie Fousekis, a historian and author of Demanding Childcare: Women's Activism and the Politics of Welfare, 1940-1971. "I understand the political forces have prevented it. The studies I have seen and the studies during and after World War Two and today show that investing in good childcare with an educational component to it pays off in all kinds of ways."

If you ask single mothers trying to make it on a low wage job, she says, their biggest concern is childcare. "It adds to the haves and have nots in this country," Fousekis says.

Paying for eggs, but not children

The debate over the lack of quality, affordable childcare in the United States has heated up since news emerged last year that tech giants Apple and Facebook would pay up to $20,000 for female employees to freeze their eggs if they want or need to delay motherhood.

While many applauded the companies for covering the expensive (and invasive) procedure, it left many uneasy and wondering - if you're going to pay to put a woman's fertility on ice while she helps you make a profit, shouldn't you reconsider your corporate culture to make it easier to be a working mother?

"We offer to freeze your eggs - it's saying how can we fix this now easily and quickly instead of being systematic," says Ariane Hegewisch, a study director at the Institute for Women's Policy Research, who says companies need to do more to help working parents have flexible hours and childcare options. "It's a very technology-centred, short-term approach."

Facebook does offer flexible hours and perks for new parents - but it does not have onsite childcare centres - nor does it have doggie day care, as was (widely and falsely) reported after the egg-freezing news emerged.

The Truman administration planned to yank all the federal funding for childcare after World War Two, which sparked protests across the country, mainly by women. The protest movement failed everywhere but in California where women successfully fought to keep the centres open. But over the years, funding has been chipped away and the California centres decimated.

At the site of the old Kaiser Shipyards in Richmond, California, there is an exhibit dedicated to the childcare centres at the Rosie the Riveter/WW2 Home Front National Historical Park. Women working in traditional men's jobs during the war were known as "Rosies", often depicted in propaganda posters as a powerful woman in a red bandana welding, driving a truck or flexing her muscles.

Lucien Sonder guides tours at the park (where a handful of Rosies still hold court on Fridays) and says people are often shocked to hear what good services the women had at the shipyards.

"They hired professionals. They were trying to create something that wouldn't just be childcare - they deliberately wanted to call it a Child Development Centre," Ms Sonder says. "They called them at the time eight-hour orphans because many believed these women were abandoning their children for eight hours."

Eleanor Roosevelt was a loud champion of the "perks" and urged companies across the nation to emulate Kaiser's model, with women building combat ships as their children learned ABCs nearby.

"If we do not pay for children in good schools, then we are going to pay for them in prisons and mental hospitals," Eleanor Roosevelt said.

But as women returned home when the war ended, the need for childcare seemed less pressing.

In 1975, more than half of all children had a stay-at-home parent, typically the mother. Today, fewer than one-in-three American children have a stay-at-home parent. And parents often cite childcare as their biggest expense and cause of stress. (And a recent rise in the number of stay-at-home mothers is likely partly due to the high cost of childcare for children under the age of five).

Some US employers offer on-site childcare - Amgen and Patagonia are the Kaiser Company Shipyards of today, setting a gold standard in progressive, on-site care, but advocates say that's an anomaly. Hollywood studios and universities and some US government offices also offer on-site childcare centres. But they are not typical - and the centres are often expensive and competitive to get into.

"In the US, we have the cowboy mentality and we're allergic to things that feel like social welfare and it's very counter productive," says Sandra Burud, an author who has studied the cost-effectiveness of on-site childcare. She has advised big companies to build centres to attract and retain employees.

Ms Burud says it's a mistake to frame corporate or government-funded childcare as a "perk" and that it should be a staple of the workforce.

"You shouldn't think of it as a benefit. It's like having a parking lot or having a cafeteria - not every one uses it - but it's a tool for the business," she says.


History shows that we can solve the child-care crisis — if we want to

As novel coronavirus cases rise, we are grappling with a growing child-care crisis. Heading into the fall, most public schools have announced their plans. In some places, students will only attend in-person classes intermittently. Elsewhere, schools will be entirely online. Others intend to fully resume in-person classes, despite growing alarm about the health risks. Meanwhile, day-care centers remain closed, are operating at a limited capacity or may permanently shut down due to the pandemic.

Parents who must leave their homes to work at grocery stores, construction sites, nursing homes, restaurants and hospitals face the greatest challenges finding care for their children. But even those who can telework from their living rooms are finding it nearly impossible to parent full-time and meet the demands of their employers — and this is especially true for women.

This is not the first time that the country has faced an extraordinary challenge that demands a fundamental redesigning of everyday life — including new arrangements for child care. When women mobilized during World War II and during President Lyndon Johnson’s “War on Poverty,” the U.S. recognized that expanding child care was essential to our collective well-being and the flourishing of the economy. The results were policies that supported families through dramatic extensions of child care.

These policies involved financial investments and a fundamental rethinking of the responsibility of the federal government. And since women shouldered the primary responsibility for child-rearing and were among the majority of the staff of child-care programs, the initiatives also helped foster greater gender equality. Today we should recall this history and recognize that the government can facilitate a child-care revolution in a matter of weeks. The question is whether we will learn from the past and create more durable solutions.

Until the 1930s, the government did little to help employed mothers. The privately funded day nurseries that had been established to serve the poor held the stigma of “charity” and were seen as a last resort for mothers unable to fulfill their “proper” roles as full-time caregivers. During the Great Depression, as part of the larger federal effort to create jobs for the unemployed, the government established an Emergency Nursery School program. But pressure to provide day care more broadly did not emerge until massive numbers of married women entered the workforce.

During World War II, many of the 6.5 million women who joined the labor force to assist the war effort struggled to make child-care arrangements. With mothers clamoring for help and the media running sensationalist stories of unattended children left in locked cars or roaming the streets, federal authorities acted. In 1943, they amended a 1940 law called the Lanham Act to allow for the establishment of federally subsidized child-care centers.

Working families were eligible for child care for up to six days a week. The federal government provided most of the funding, with centers typically making up the difference by charging $9 to $10 per day (in today’s dollars). Most programs provided children with meals and educationally enriching activities. Many offered 12 hours of coverage each day; a few even provided 24-hour care to serve women working night shifts.

To staff the program, the government hired women with experience teaching and caring for children. As demand for services grew, authorities established 10- to 12-week training programs to prepare new staff. Although racial discrimination was rampant in most war industries, some of these child-care centers hired black, Asian American and Native American women. Some centers were racially integrated, while others were segregated. About 259 centers served only black children.

Before the 1940s, day care had been stigmatized due to the widely held belief that white women’s proper role was in the home and raising their children. Yet the provision of federally subsidized child care helped chip away at these attitudes. Surveys revealed that parents were overwhelmingly satisfied with the centers and wanted them to continue.

Yet their opinions did not change the minds of federal policymakers, who considered child care a temporary emergency measure. At the end of the war, the government stopped funding the centers, unwilling to further encourage the employment of mothers. Many feared that if women remained in the labor force, men returning from the war would find it harder to secure employment.

News that the government centers would close incited vigorous protests from mothers and child-care workers. A few cities raised funds to try to keep their centers open, and in California, a coalition of parents, labor unions, early-childhood-education experts, African Americans, women’s groups and Communist Party members convinced state authorities to step in to provide financial support. But the idea of universal federally funded child care was off the table.

During the 1960s, the federal government created a new child-care program for very different reasons. Plans for Head Start emerged as part of Johnson’s War on Poverty, which was a response to civil rights activism, as well as the growing public awareness of the severity of poverty in the United States. Upon learning that 50 percent of people in poverty were children, authorities concluded that early intervention could make a difference.

The Head Start program offered children living in poverty healthy meals and high-quality education from a young age. It encouraged community and parental involvement and offered jobs to thousands of low-income mothers. To avoid controversies over the government’s responsibility for child care for working mothers, advocates described it as an anti-poverty measure that would prepare disadvantaged children to succeed at school.

Thanks to the wives of Cabinet members and members of Congress, Head Start got off the ground within 12 weeks. Applications from communities and sponsors soon came pouring in, filling the bathtubs at the D.C. hotel that served as the program’s makeshift headquarters. The government contracted with 140 universities to provide training programs for teachers, the majority of whom were low-income women. The program soon reached more than 3,000 communities, serving more than half-a-million children.

Claiming that Head Start would remedy the long-term effects of racism and economic inequality, the program’s champions tended to promise much more than it could deliver. Nevertheless, in the past 50 years, studies have shown Head Start’s positive effects on more than 32 million children and their parents, as well as the women who staff the programs.

In 1971, federal legislators built on Head Start with a bill establishing the foundations of a universal federally funded day-care system. The Comprehensive Child Development Act was passed by Congress in a bipartisan vote, with strong support from civil rights, labor and women’s groups. Yet President Richard Nixon vetoed the bill, insisting on a “family-centered” approach instead of a publicly funded solution to the growing child-care problem. This decision gave rise to our current system, in which the government subsidizes a portion of the expenses that parents incur, but only a fraction of the poorest families receive direct child-care subsidies.

Today, in nearly two-thirds of households with children, the parents are employed. In 3 out of 5 states, the cost of day care for one infant is more than tuition and fees at four-year public universities. And in the midst of a global pandemic, many centers have been forced to close or scale back their offerings. Many parents also can no longer count on the public schools to provide consistent care for children in grades K-8, who cannot be left home unattended. Mothers’ employment will be most affected by this crisis, as women remain disproportionately responsible for child care and will be the ones forced to cut back or leave the workforce entirely.

Past solutions will not perfectly suit our current needs. The ongoing public health crisis means that many families can’t safely send children into care away from home and others will require flexible care that can adjust quickly to changing work and medical conditions. If covid-19 cases continue to increase, the federal government may need to shut nearly everything down (including schools) and pay all but the most essential workers to stay home.


