Tuesday, April 12, 2016

Were they really invented at the 1904 World's Fair? Probably not


By Merrill Fabry

Here’s something to ponder if you happen to be waiting in line at Ben & Jerry’s Free Cone Day on Tuesday: Where do ice cream cones come from?

The exact origin of the combination of an edible wafer or waffle, the cone shape and the ice cream is unclear. Edible wafers date back centuries, and they were long served alongside ice cream as an accompaniment. The most popular cone invention stories locate ice cream cones at the 1904 World’s Fair, but they were almost certainly invented before that date, and the actual story of who first used a rolled wafer to serve ice cream remains murky.

An 1807 etching of the Parisian Café Frascati holds an early clue. The café was known for serving ice cream, and the lower right corner of the image shows a woman licking something out of a handheld container, which ice cream historian Robin Weir writes is the “first pictorial evidence for ice cream cones.”

Almost 40 years later, Charles Francatelli’s 1846 book Modern Cook described how to make “wafer-gauffres” filled with “filbert-cream-ice” to garnish a molded dessert called “Iced Pudding a la Duchess of Kent.” (His “Iced Pudding a la Chesterfield” had a similar finish of ice-cream-filled wafer cones.) The accompanying illustration clearly showed the cone shape, and his gauffres recipe described shaping still-warm wafers into “small cornucopiae” before they cooled and became brittle, but these still aren’t quite the handheld cones of your youth, as they were intended mostly as a garnish. Towards the end of the century, Agnes B. Marshall wrote numerous cookbooks, including ones specifically about iced desserts, and this time the filled cone took center plate. Her 1887 Cookery Book included a recipe for “Cornets with Cream” with the instruction that the cornets could “be filled with any cream or water ice,” and her 1894 book Fancy Ices detailed further recipes for cornets filled with ice creams—but, while they were a step closer to the current incarnation of the ice cream cone, they were still elegant desserts meant to be put on a plate and eaten with utensils.

In the second half of the 19th century, ice cream bought outside the home was served in a small glass container, called a “penny lick” for the price. You would lick the ice cream out of the glass and return it to the vendor to wash and reuse. This practice wasn’t necessarily sanitary, and it could cause delays if too many people wanted ice cream at the same time.



The solution: an edible container. Italo Marchiony, who later claimed he had been making edible cups to serve ice cream in New York City since 1896, filed a patent in 1903 for his own ice-cream-cup-making machine. The mold’s shape is something more akin to a cup than a cone, as was the “apparatus for baking biscuit-cups for ice-cream” that Antonio Valvona of Manchester, England had patented a year before.

Anne Funderburg’s history of ice cream describes seven origin myths for the ice cream cone, starting with the most common 1904 St. Louis World’s Fair story: that Ernest Hamwi, an immigrant from Syria, was making zalabia, a wafer dessert, next to an ice cream stand. Hamwi suggested the combination, which caught on. Hamwi continued to work in the business of ice cream cones, eventually opening the Missouri Cone Company. The International Association of Ice Cream Manufacturers named him the official creator of the ice cream cone in the 1950s. Still, other stories proliferated, including ones claiming Abe Doumar of Lebanon, Nick and Albert Kabbaz of Syria, David Avayou from Turkey, or Frank and Charles Menches of Ohio gave Hamwi the idea or invented the cone themselves. There’s no contemporary evidence to support any of the World’s Fair stories over another.

It remains unclear exactly which invention story is the true one. But, while the ice cream cone was probably not invented at the 1904 World’s Fair, the fair did serve to popularize the cone as a container, and the stories about the Fair do provide an early example of the ice cream cone recognizable in both its modern form and function.

Wednesday, March 30, 2016

It's not 'politically correct' to say Australia was invaded, it's history

So, the arbiters of political correctness gone mad have apparently decided we need a quick top-up lesson on Australian Indigenous history. Or something.
It’s not quite clear what, precisely, they think.
“University of NSW students told to refer to Australia as having been ‘invaded’”, screams today’s headline in Sydney’s Daily Telegraph about a guide at the university for “appropriate language use for the history, society, naming, culture and classifications of Indigenous Australian and Torres Strait Islander people”.
Have a look at the guide and judge for yourself.
You might agree with all of it, some of it or none of it. Or you might not care either way. I’m fine with most of it, and the points I’d contest if I could be bothered – such as “Dreamings” being more appropriate than “Dreamtime” – are neither here nor there.
But, horror, the Tele warns – “students are being told to refer to Australia as having been ‘invaded’ instead of settled in a highly controversial rewriting of official Australian history”.
They even use conservative historian Keith Windschuttle and (wait for it) the Institute of Public Affairs to help make their non-case.
Highly controversial? Really? Nah.
And here we were wondering if there had been a sudden re-ignition of the “history wars” (to which Windschuttle and the IPA were central) when debate over (warning, I’m about to do it) European invasion and dispossession centred on the National Museum of Australia and polarised historians between the “white blindfold” and “black armband” camps.
And over what? Some guide that might help naïve university students think before they speak about matters relating to Indigenous Australians. To my mind this would be a good thing, given the hand comparatively recent continental history has dealt Aboriginal and Torres Strait Islander people.
And we’d be right. Australia has largely moved on from the times under John Howard’s prime ministership when the museum was riven by acrimonious argument about how it ought to depict frontier history, and whether the murders of tens of thousands of Indigenous Australians by British soldiers and “settlers” constituted war on the colonial frontier.
An instructive starting point: Indigenous warriors who resisted invasion certainly regarded it as war, as did numerous colonial authorities including governors, not least Lachlan Macquarie – a vicious, calculated murderer of his colony’s Indigenous people.
While conservative estimates would put Indigenous deaths at the hands of soldiers, “native police”, militia, explorers, miners and farmers at 30,000, recent credible academic research indicates the figure in Queensland alone was 65,000. Although violence against Aboriginal and Torres Strait Islander people was most extreme in Queensland, a conservative national extrapolation potentially adds another deeply unsettling dimension to Australia’s malevolent recent history.
My starting point as a non-Indigenous person who writes about Aboriginal and Torres Strait Islander people and their stories, has always been to listen. To listen to the ways stories are told by Indigenous people themselves, to understand their meaning and to respect the way they view – and share – their histories.
Respect is the critical word here. And that has nothing to do with being politically correct. Respect, starting with capital-I for Indigenous (I have never met an Aboriginal or Torres Strait Islander person who did not want their people thus described). Neither have I come across too many Aboriginal and Torres Strait Islander people (not to mention a growing number of non-Indigenous Australians) who refer to the arrival of the first fleet in 1788, and all of the ensuing extreme violence and dispossession, as anything other than “invasion”.
The growing debate around the celebration of Australia Day each 26 January (Invasion Day to many Indigenous Australians), including in the pages of this country’s more reactionary journals, indicates just how much the argument has advanced since the history wars. Such change can never, of course, evolve too fast.
“They [students] are also told it is offensive to suggest James Cook ‘discovered’ Australia,” the Tele tells us.
Get out! Where to begin?
Maybe ask the Aboriginal and Torres Strait Islanders themselves or, indeed, the Macassans from Sulawesi with whom they traded for centuries before Cook anchored his Endeavour at Botany Bay in autumn 1770. Others, including the Dutch, might also have a view about first non-Asian contact and European “discovery”.
Yes, as the UNSW guide suggests, Cook mapped the east coast of this continent. But he hardly discovered it.
Instructively, that moment of first east coast British-Indigenous contact was signified with violence when Cook’s men shot at and wounded at least one Gweagal tribesman. Cook took their spears and a shield. The shield, part of the British Museum’s Indigenous collection (the spoils always go to the victors), was recently the centrepiece of a display at the national museum exhibition, Encounters. The shield has a notable hole in it.
The museum reckons it’s from a lance.
But the Gweagal, who want that stolen shield permanently returned, will tell you it’s from a musket round.
I know who I believe.

Tuesday, March 29, 2016

Invasion of the history rewriters

By CLARISSA BYE

STUDENTS at a leading NSW university are being told to refer to Australia as having been “invaded” instead of settled in a highly controversial rewriting of official Australian history.

They are also told it is offensive to suggest James Cook “discovered” Australia and inappropriate to say the indigenous people have lived here for 40,000 years.

Instead, they should say “since the beginning of the Dreamings”.

A so-called Diversity Toolkit on indigenous terminology for University of NSW undergraduates argues that Australian history should be broken up into categories, including “pre-invasion” and “post-invasion”.

It also claims the word settlement ignores the reality of indigenous lands “being stolen”.

“Australia was not settled peacefully, it was invaded, occupied and colonised,” according to the guidelines, which are prescribed reading for some undergraduate students.

“Describing the arrival of the Europeans as a ‘settlement’ attempts to view Australian history from the shores of England rather than the shores of Australia,” the document says. “Most Aboriginal people find the use of the word ‘discovery’ offensive."

Students are also being taught the terms “Aborigines” and “Aboriginal people” are inappropriate, and they should use the term “indigenous Australian people”.

The phrase “The Dreamings” is apparently more appropriate than “Dreamtime”, because the latter tended to indicate a time period that has finished.

The accepted historical period of 40,000 years is also rejected because it “puts a limit on the occupation of Australia and tends to lend support to migration theories and anthropological assumptions”.

But historian Keith Windschuttle said the term “invasion” was wrong. “Under international law, Australia has always been regarded as a settled country according to the leading judgments in international law, both here and around the world,” he said.

“Until the law changes, there is no sound basis on which to say invaded. That is wrong.”

Institute of Public Affairs research fellow Matthew Lesh criticised the guidelines, saying they suffocate “the free flow of ideas”.

Federal Education Minister Simon Birmingham said universities “enjoy autonomy when it comes to academic concepts”, however he stressed they should be a place where “ideas are contested and open to debate”.

A UNSW spokeswoman said the guides were “commonplace” across universities.

Libraries facing 'greatest crisis' in their history

The Guardian

Nearly 350 libraries have closed in Britain over the past six years, causing the loss of almost 8,000 jobs, according to new analysis.

In a controversial move that sparked protests by authors including Philip Pullman and Zadie Smith, councils across the country have shut their reading rooms in an effort to make deep savings.

Children’s author Alan Gibbons warned the public library service faced the “greatest crisis in its history”.

The figures, obtained by the BBC English Regions data journalism team, showed that 343 libraries have shut since 2010 and another 111 closures are planned this year.


A further 174 libraries have been transferred to community groups and are run by an army of volunteers, while 50 have been handed to external organisations.

Gibbons, who wrote Shadow Of The Minotaur, told the BBC: “Opening hours are slashed, book stocks reduced. Volunteers are no longer people who supplement full time staff but their replacements. This constitutes the hollowing out of the service. We are in dangerous territory.”

Librarian Ian Anstice, who runs the Public Libraries News website, said the cuts were “without precedent”. He said: “Councils learnt early on how unpopular simply closing libraries is, so they have had to cut the vital service in other, less obvious ways.

“It can come across in many forms – reduced opening hours, reduced book fund, reduced maintenance and reduced staffing. In all its incarnations, it is harmful to the service, creating the risk that once-loyal users of libraries will come away disappointed and stop using them.

“Our public library system used to be the envy of the world. Now it is used as a cautionary tale that librarians use worldwide to scare their colleagues.”

Four areas – Sefton in Merseyside, Brent in north-west London, Stoke-on-Trent and Sunderland – have lost more than half of their libraries since 2010, the BBC data team said.

A spokesman for the Department for Culture, Media and Sport said: “Libraries are cornerstones of their communities and are part of the fabric of our society, so it’s vital they continue to innovate in order to meet the changing demands of those they serve.

“Government is helping libraries to modernise by funding a Wi-Fi roll-out across England that has benefited more than 1,000 libraries and increasing access to digital services and e-lending.”

Wednesday, December 23, 2015

Birmingham's ancient Koran history revealed

When the University of Birmingham revealed that it had fragments from one of the world's oldest Korans, it made headlines around the world.

By Sean Coughlan

In terms of discoveries, it seemed as unlikely as it was remarkable.
But it raised even bigger questions about the origins of this ancient manuscript.

And there are now suggestions from the Middle East that the discovery could be even more spectacularly significant than had been initially realised.


There are claims that these could be fragments from the very first complete version of the Koran, commissioned by Abu Bakr, a companion of the Prophet Muhammad - and that it is "the most important discovery ever for the Muslim world".

This is a global jigsaw puzzle.
But some of the pieces have fallen into place.

It seems likely the fragments in Birmingham, at least 1,370 years old, were once held in Egypt's oldest mosque, the Mosque of Amr ibn al-As in Fustat.

Paris match


This is because academics are increasingly confident the Birmingham manuscript has an exact match in the National Library of France, the Bibliotheque Nationale de France.


The library points to the expertise of Francois Deroche, historian of the Koran and academic at the College de France, and he confirms the pages in Paris are part of the same Koran as Birmingham's.
Alba Fedeli, the researcher who first identified the manuscript in Birmingham, is also sure it is the same as the fragments in Paris.

The significance is that the origin of the manuscript in Paris is known to have been the Mosque of Amr ibn al-As in Fustat.

'Spirited away'



The French part of this manuscript was brought to Europe by Asselin de Cherville, who served as a vice consul in Egypt when the country was under the control of Napoleon's armies in the early 19th Century.

Prof Deroche says Asselin de Cherville's widow seemed to have tried to sell this and other ancient Islamic manuscripts to the British Library in the 1820s, but they ended up in the national library in Paris, where they have remained ever since.

But if some of this Koran went to Paris, what happened to the pages now in Birmingham?
Prof Deroche says later in the 19th Century manuscripts were transferred from the mosque in Fustat to the national library in Cairo.

Along the way, "some folios must have been spirited away" and entered the antiquities market.
These were presumably sold and re-sold, until in the 1920s they were acquired by Alphonse Mingana and brought to Birmingham.

Mingana was an Assyrian, from what is now modern-day Iraq, whose collecting trips to the Middle East were funded by the Cadbury family.

"Of course, no official traces of this episode were left, but it should explain how Mingana got some leaves from the Fustat trove," says Prof Deroche, who holds the legion of honour for his academic work.

And tantalisingly, he says other similar material, sold to western collectors, could still come to light.

Disputed date


But what remains much more contentious is the dating of the manuscript in Birmingham.
What was really startling about the Birmingham discovery was its early date, with radiocarbon testing putting it between 568 and 645.
The latest date in the range is 13 years after the death of the Prophet Muhammad in 632.


David Thomas, Birmingham University's professor of Christianity and Islam, explained how much this puts the manuscript into the earliest years of Islam: "The person who actually wrote it could well have known the Prophet Muhammad."

But the early date contradicts the findings of academics who have based their analysis on the style of the text.

Mustafa Shah, from the Islamic studies department at the School of Oriental and African Studies in London, says the "graphical evidence", such as how the verses are separated and the grammatical marks, show this is from a later date.

In this early form of Arabic, writing styles developed and grammatical rules changed, and Dr Shah says the Birmingham manuscript is simply inconsistent with such an early date.
Prof Deroche also says he has "reservations" about radiocarbon dating and there have been cases where manuscripts with known dates have been tested and the results have been incorrect.

'Confident' dates are accurate


But staff at Oxford University's Radiocarbon Accelerator Unit, which dated the parchment, are convinced their findings are correct, no matter how inconvenient.

Researcher David Chivall says the accuracy of dating has improved in recent years, with a much more reliable approach to removing contamination from samples.

In the case of the Birmingham Koran, Mr Chivall says the latter half of the age range is more likely, but the overall range is accurate to a probability of 95%.
It is the same level of confidence given to the dating of the bones of Richard III, also tested at the Oxford laboratory.

"We're as confident as we can be that the dates are accurate."

And academic opinions can change. Dr Shah says until the 1990s the dominant academic view in the West was that there was no complete written version of the Koran until the 8th Century.

But researchers have since overturned this consensus, proving it "completely wrong" and providing more support for the traditional Muslim account of the history of the Koran.
The corresponding manuscript in Paris, which could help to settle the argument about dates, has not been radiocarbon tested.

The first Koran?


But if the dating of the Birmingham manuscript is correct what does it mean?
There are only two leaves in Birmingham, but Prof Thomas says the complete collection would have been about 200 separate leaves.


"It would have been a monumental piece of work," he said.
And it raises questions about who would have commissioned the Koran and been able to mobilise the resources to produce it.

Jamal bin Huwareib, managing director of the Mohammed bin Rashid Al Maktoum Foundation, an education foundation set up by the ruler of the UAE, says the evidence points to an even more remarkable conclusion.

He believes the manuscript in Birmingham is part of the first comprehensive written version of the Koran assembled by Abu Bakr, the Muslim caliph who ruled between 632 and 634.


"It's the most important discovery ever for the Muslim world," says Mr bin Huwareib, who has visited Birmingham to examine the manuscript.
"I believe this is the Koran of Abu Bakr."

He says the high quality of the handwriting and the parchment show this was a prestigious work created for someone important - and the radiocarbon dating shows it is from the earliest days of Islam.

"This version, this collection, this manuscript is the root of Islam, it's the root of the Koran," says Mr bin Huwareib.
"This will be a revolution in studying Islam."
This would be an unprecedented find. Prof Thomas says the dating fits this theory but "it's a very big leap indeed".

'Priceless manuscript'


There are other possibilities. The radiocarbon dating is based on the death of the animal whose skin was used for the parchment, not when the writing was completed, which means the manuscript could be a few years later than the age range ending in 645, with Prof Thomas suggesting possible dates of 650 to 655.

This would overlap with the production of copies of the Koran during the rule of the caliph Uthman, between 644 and 656, which were intended to produce an accurate, standardised version to be sent to Muslim communities.

If the Birmingham manuscript was a fragment of one of these copies it would also be a spectacular outcome.
It's not possible to definitively prove or disprove such theories.

But Joseph Lumbard, professor in the department of Arabic and translation studies at the American University of Sharjah, says if the early dating is correct then nothing should be ruled out.
"I would not discount that it could be a fragment from the codex collected by Zayd ibn Thabit under Abu Bakr.

"I would not discount that it could be a copy of the Uthmanic codex.
"I would not discount Deroche's argument either, he is such a leader in this field," says Prof Lumbard.

He also warns of evidence being cherry-picked to support experts' preferred views.
BBC iWonder: The Quran

A timeline of how the Quran became part of British life
Prof Thomas says there could also have been copies made from copies and perhaps the Birmingham manuscript is from a copy made specially for the mosque in Fustat.

Jamal bin Huwaireb sees the discovery of such a "priceless manuscript" in the UK, rather than a Muslim country, as sending a message of mutual tolerance between religions.

"We need to respect each other, work together, we don't need conflict."
But don't expect any end to the arguments over this ancient document.

Friday, July 24, 2015

First Snake Crawled on Four Legs

The world's first known snake has recently been found in Brazil, according to new research that solves many mysteries about the slithering reptiles.

The snake (Tetrapodophis amplectus), described in the latest issue of the journal Science, is also the first known snake to have four limbs. This strongly suggests that snakes evolved from terrestrial reptiles, and not from water-dwelling species, as had previously been thought.

"The marine hypothesis is dead," senior author Nicholas Longrich of the University of Bath told Discovery News. "It's really been dead for some time now, but this is really hammering the nails in the coffin. Aquatic snakes evolved from terrestrial snakes - many, many times."

Tetrapodophis, also known as "Four Legs," was a meat-eating predator. It lived in what is now the Crato Formation of Ceará, Brazil, between about 146 and 100 million years ago.

If Four Legs could be brought back to life today, "You would be confused, because you would be thinking that this looks like a snake... but it's odd; it shouldn't have feet," lead author David Martill of the University of Portsmouth told Discovery News.

He, Longrich, and co-author Helmut Tischlinger believe that the bizarre reptile and its kin evolved ever-smaller limbs after their ancestors went through an underground phase. During this period of the Early Cretaceous, the animals burrowed underground.

"Limbs act as a hindrance if you are burrowing through soft sand," Martill explained. "Much better to 'swim' through leaf litter or sand. As legs got smaller, 'swimming' became more efficient."

The researchers further suspect that these undulating movements were pre-adaptations to true swimming in water.

Four Legs' front limbs were so small that Martill described them as puny and tiny.

While minuscule, the feet appear to have been specialized, as they were broader than those of lizards. The researchers therefore think the feet helped the snake to grasp prey and to hold onto a partner when mating.

Four Legs' head was slightly pointed and slender, its skull suggests. As for its overall appearance, "It looked, well, weird," Longrich said.

"It had the long, slender, serpentine body; it would have had a forked tongue," he continued. "It had the broad belly scales of a snake. These are unique to snakes, and remarkably the fossil actually preserves them."

Monday, July 20, 2015

1969: Armstrong walks on the moon

At 10:56 p.m. EDT, American astronaut Neil Armstrong, 240,000 miles from Earth, speaks these words to more than a billion people listening at home: "That's one small step for man, one giant leap for mankind." Stepping off the lunar landing module Eagle, Armstrong became the first human to walk on the surface of the moon.

The American effort to send astronauts to the moon has its origins in a famous appeal President John F. Kennedy made to a special joint session of Congress on May 25, 1961: "I believe that this nation should commit itself to achieving the goal, before this decade is out, of landing a man on the moon and returning him safely to the Earth." At the time, the United States was still trailing the Soviet Union in space developments, and Cold War-era America welcomed Kennedy's bold proposal.

In 1966, after five years of work by an international team of scientists and engineers, the National Aeronautics and Space Administration (NASA) conducted the first unmanned Apollo mission, testing the structural integrity of the proposed launch vehicle and spacecraft combination. Then, on January 27, 1967, tragedy struck at Kennedy Space Center in Cape Canaveral, Florida, when a fire broke out during a manned launch-pad test of the Apollo spacecraft and Saturn rocket. Three astronauts were killed in the fire.

Despite the setback, NASA and its thousands of employees moved forward, and in October 1968, Apollo 7, the first manned Apollo mission, orbited Earth and successfully tested many of the sophisticated systems needed to conduct a moon journey and landing. In December of that year, Apollo 8 took three astronauts to the far side of the moon and back, and in March 1969 Apollo 9 tested the lunar module for the first time while in Earth orbit. Then in May, the three astronauts of Apollo 10 took the first complete Apollo spacecraft around the moon in a dry run for the scheduled July landing mission.

At 9:32 a.m. on July 16, with the world watching, Apollo 11 took off from Kennedy Space Center with astronauts Neil Armstrong, Edwin Aldrin Jr., and Michael Collins on board. Armstrong, a 38-year-old civilian research pilot, was the commander of the mission. After traveling 240,000 miles in 76 hours, Apollo 11 entered a lunar orbit on July 19. The next day, at 1:46 p.m., the lunar module Eagle, manned by Armstrong and Aldrin, separated from the command module, where Collins remained. Two hours later, the Eagle began its descent to the lunar surface, and at 4:18 p.m. the craft touched down on the southwestern edge of the Sea of Tranquility. Armstrong immediately radioed to Mission Control in Houston, Texas, a famous message: "The Eagle has landed."

At 10:39 p.m., five hours ahead of the original schedule, Armstrong opened the hatch of the lunar module. As he made his way down the lunar module's ladder, a television camera attached to the craft recorded his progress and beamed the signal back to Earth, where hundreds of millions watched in great anticipation. At 10:56 p.m., Armstrong spoke his famous quote, which he later contended was slightly garbled by his microphone and meant to be "that's one small step for a man, one giant leap for mankind." He then planted his left foot on the gray, powdery surface, took a cautious step forward, and humanity had walked on the moon.

"Buzz" Aldrin joined him on the moon's surface at 11:11 p.m., and together they took photographs of the terrain, planted a U.S. flag, ran a few simple scientific tests, and spoke with President Richard M. Nixon via Houston. By 1:11 a.m. on July 21, both astronauts were back in the lunar module and the hatch was closed. The two men slept that night on the surface of the moon, and at 1:54 p.m. the Eagle began its ascent back to the command module. Among the items left on the surface of the moon was a plaque that read: "Here men from the planet Earth first set foot on the moon–July 1969 A.D–We came in peace for all mankind."

At 5:35 p.m., Armstrong and Aldrin successfully docked and rejoined Collins, and at 12:56 a.m. on July 22 Apollo 11 began its journey home, safely splashing down in the Pacific Ocean at 12:51 p.m. on July 24.

There would be five more successful lunar landing missions, and one unplanned lunar swing-by, Apollo 13. The last men to walk on the moon, astronauts Eugene Cernan and Harrison Schmitt of the Apollo 17 mission, left the lunar surface on December 14, 1972. The Apollo program was an expensive and labor-intensive endeavor, involving an estimated 400,000 engineers, technicians, and scientists, and costing $24 billion (close to $100 billion in today's dollars). The expense was justified by Kennedy's 1961 mandate to beat the Soviets to the moon, and after the feat was accomplished ongoing missions lost their practical