
How were 19th century American diplomats paid?


I just came across this line in H.W. Brands' Andrew Jackson biography, explaining why Jackson was initially uninterested in the presidency: "The pay was good, but the expenses of the office ate up the salary and more" (370).

Apparently, the president had no expense account for official duties until 1949. Presidents still pay for food, entertainment, and White House staff. Before Buchanan's presidency, presidents even had to pay their aides and private secretaries themselves. Presidential protection seems to have been a personal expense as well, with the president hiring bodyguards (e.g. William Crook). The Secret Service was created in 1865 but did not start protecting the president until the McKinley assassination. I assume that presidents paid for their own travel and lodging (which may be why Lincoln stayed at a private residence while visiting Gettysburg). This all explains the many impecunious ex-presidents.

This made me wonder how diplomats were paid. Diplomats also have significant expenses (international travel, rent, entertainment, often unfavorable exchange rates). The European solution was in part to rely on wealthy aristocrats for diplomatic service, but there were far fewer wealthy Americans in this period who could bear such an expense. So when the president would send some Americans over to negotiate a treaty, were these diplomats paid some kind of stipend? Could they submit an expense report and expect to be reimbursed? Were these arrangements regularized or ad hoc (i.e. subject to Congressional lawmaking and appropriations)? Or were diplomats just expected to be independently wealthy?

How shoestring was American diplomacy in the 19th century?


They were paid a regular salary and given an "expense account" of sorts. At least, the higher ranking representatives of the United States were. While this was probably not a very adequate amount, American ministers were definitely not expected to pay for everything out of their own pockets.

Early United States ambassadors were paid around $2,500, while consuls were unpaid appointees who received remuneration from fees. That was of course completely inadequate, and the latter eventually resulted in abuses all around. As the federal government became better organised by the early 19th century, diplomatic salaries had been set at:

  • Ministers Plenipotentiary: $9,000
  • Ministers Resident: $6,000
  • Chargés d'affaires: $4,500
  • Secretaries of Legation: $2,000

(Source: Sparks, Jared, Francis Bowen, and George Partridge Sanger. The American Almanac and Repository of Useful Knowledge for the Year 1843. Vol. 31. Boston, David H. Williams, 1860)

Additionally, ministers and chargés could receive an expense allowance ("outfit") of up to one year's salary, at Presidential discretion. It seems a practice developed whereby a diplomat received half a year's salary as outfit when first posted, and a quarter upon returning. This would presumably go towards the upfront expenses of moving, accommodations, etc.

These numbers did not change much for many subsequent decades; the same figures appeared in references up through the 1840s and 50s. At some point, however, they were raised, such that by 1872 the pay for top American diplomats was about:

  • Ministers Plenipotentiary: $17,500 [1] or $12,000 [2]
  • Ministers Resident: $7,500 [3]

Notes:

  1. To the United Kingdom, the German Empire, the French Republic
  2. To Austria, Mexico, Russia, Spain, Italy, China, Brazil
  3. With the exception of Liberia, $4,000

(Source: Turner, A.J., Legislative Manual of the State of Wisconsin, Madison, Wis.: Atwood & Culver, 1872)


While diplomats represented government interests, consuls were appointed to assist Americans locally. They remained unsalaried until the 1850s, when rules were introduced to curb the excesses. But by 1886:

A very careful investigation on [cost of living] was made by the British consuls, by order of their government, in 1873. Notwithstanding this advance of prices, the scale of salaries of American consuls has scarcely been changed in these thirty years, when men are still sent to Florence and Naples, expected to be competent to perform all the duties of the office, hold a respectable position in society, and pay proper attention to the numerous Americans visiting those places, on the sum of $1,500 a year, scarcely more than is paid to the subordinate employees of Congress.

(…)

[I]t may be presumed that $2,500 is considered by Congress high salary. The experience of every one who has ever been in the consular service, or who has lived abroad in a private capacity, shows that in most cases this is utterly insufficient.

- Schuyler, Eugene. American Diplomacy and the Furtherance of Commerce. New York, 1886.

During this period, the pay was about:

  • Consuls-General: $2,500 – $6,000
  • Salaried Consuls: $1,500 – $4,000, and could not engage in business
  • Consuls: $1,000, but could engage in business
  • Fee'd Consuls: paid by consulate fees up to $2,500, and could engage in business

(Except the Consuls to Liverpool and Hong Kong, who were paid $6,000 and $5,000, respectively.)


That salaries were inadequate and fixed for long periods despite rising costs was a continual issue. For example, in 1816 the American Minister to the Court of St James's, John Quincy Adams, wrote to then Secretary of State James Monroe that:

During the five years and a half of my establishment at St. Petersburg, my expenses fell little short of but did not exceed the salary and outfits allowed me. But I certainly could not disguise to myself… that it was impossible to proportion my establishment to that standard without a great sacrifice of that consideration which attends the character of a foreign minister.

I may state with perfect confidence that no minister of the United States at this court has ever found it profitable to limit his expenses within the public allowance of salary and outfit. And while it is notorious that a salary fixed twenty-five years ago was then inadequate to the necessities of the station, it is equally notorious that every expense of a domestic establishment in this country has doubled in that interval.

- Adams, John Q. "To the Secretary of State." Letter to James Monroe. 12 July 1816. London.


Europe

Not directly related to the question, but European states did not rely on the personal wealth of diplomats, either. Or at least, not exclusively. In the same letter to Monroe, Adams explained that in addition to a base salary:

[T]he allowances to them for contingent expenses are usually an additional expense equal in amount to the salary. They are also entitled after a few years of service to pensions for life, proportioned to the length of time they have been in service, and equal upon the average from one-third to one-half of the annual salary, and they are permitted to receive presents from the governments to which they are accredited, which in these treaty making times form no inconsiderable part of their emoluments… The Russian and Austrian governments pay their ministers abroad much upon the same scale as France and England. The Russian ambassador at this court has a salary of sixty thousand dollars and a house to live in rent free.

- Adams, John Q. "To the Secretary of State." Letter to James Monroe. 12 July 1816. London.

Compared to the Russian ambassador to Britain, the President of the United States at the time received a paltry $25,000. While Adams was advocating raises for diplomats like himself, there's little doubt that European governments supported their representatives well. For example, Britain's ambassadors enjoyed expense accounts of the following sizes:

English ministers also have an outfit in order to enable them to install themselves properly at their posts. This is calculated on a liberal scale, being, for example, $20,000 for Paris; $12,500 for Vienna, Berlin, and St. Petersburg; $10,000, for China, Japan, and Persia, Madrid and Washington, and never being less than $2,000.

- Schuyler, Eugene. American Diplomacy and the Furtherance of Commerce. New York, 1886.


Diplomacy

Diplomacy is the practice of influencing the decisions and conduct of foreign governments or intergovernmental organisations through dialogue, negotiation, and other nonviolent means. [1] Diplomacy usually refers to international relations carried out through the intercession of professional diplomats with regard to a variety of issues and topics. [2]

Diplomacy is the main instrument of foreign policy and global governance, representing the broader goals and strategies that guide a state's interactions with the rest of the world. International treaties, agreements, alliances, and other manifestations of foreign policy are usually the result of diplomatic negotiations and processes. Diplomats may also help shape a state's foreign policy by advising government officials.

Modern diplomatic methods, practices, and principles originated largely from 17th century European custom. Beginning in the early 20th century, diplomacy became professionalized; the 1961 Vienna Convention on Diplomatic Relations, ratified by most of the world's sovereign states, provides a framework for diplomatic procedures, methods, and conduct. Most diplomacy is now conducted by accredited officials, such as envoys and ambassadors, through a dedicated foreign affairs office. Diplomats operate through diplomatic missions, most commonly consulates and embassies, and rely on a number of support staff; the term diplomat is thus sometimes applied broadly to diplomatic and consular personnel and foreign ministry officials. [3]


Stories of Frontier Settlement Doctors

In the late 19th and early 20th century, medicine in the settlements of the Pacific Northwest was often carried out far from a doctor’s office. In order to obtain treatment, settlers living on isolated farms, ranches and in mining or logging camps undertook a long and sometimes arduous journey. A minor injury or a common complaint could become an emergency or result in death solely because of the lack of proximity to medical help.

If a settlement was lucky enough to have a doctor living within a day’s journey, settlers often expected the doctor to come to them. Nonetheless, someone still had to be sent to notify the doctor that help was needed. Doctors traveled long distances on foot, on horseback, in wagons, buggies, ferries, canoes and boats. Traveling to a settlement might be a cross country journey on nothing more than an unmarked trail. The doctor’s bag was designed to carry the tools of the trade and withstand travel in all sorts of weather. Bags of durable oiled canvas or leather stood up to extended travel, whatever the season and terrain.

Rural doctors were general practitioners by necessity. They delivered babies, set broken limbs, pulled teeth, and tended to all sorts of wounds and diseases. They often created their own medications, as well as many of the instruments they used.

The rural family doctor was well known in the community and was often considered the most valuable asset in the area. They most probably delivered every child in the community, and sat with the dying as they drew their last breath. They saw people into and out of this world and in the meantime tried to keep them alive and healthy.

Rural settlers often had nothing to pay with except the fruits of their labor. Doctors would commonly be paid in cord wood, produce, meat, eggs, blankets or other items of value. The doctor was a family friend and might know more than anyone about any given person or family in the region. When a settlement doctor passed away, it was cause for great mourning.

Bethenia Owens-Adair, a pioneer physician, moved to Oregon from Missouri in 1843, and lived both in Astoria and Roseburg. She earned her medical degree from the Eclectic Medical College in Philadelphia in 1874, and in 1880 she finished her M.D. at the University of Michigan Medical School.

She returned to Oregon in 1881 and set up a successful medical practice in Portland. Later, she moved to Astoria where she continued to practice medicine and help with the family farm.

Dr. Owens-Adair described in her book, Bethenia Owens-Adair: Some of Her Life Experiences, what it was like on one of her home visits in Astoria: “I carried on my professional work as best I could in that out-of-the-way place and at no time did I ever refuse a call, day or night, rain or shine. I was often compelled to go on foot, through trails so overhung with dense undergrowth, and obstructed with logs and roots, that a horse and rider could not get past and through muddy and flooded tide-lands in gum boots.

“One day a Mr. William Larsen came, saying, ‘My wife is sick. Come at once.’ There was a most terrific southwest storm raging, and we had a mile to go on foot over the tide land before reaching the Lewis and Clark River. The land was flooded, the mud and slush deep, and the swollen sloughs had to be crossed on logs and planks. Nearly the whole distance was overgrown with enormous bunches of wire-grass, many being three feet across. This long, intertwined grass was a great obstruction to walking and I fell prone, again and again, before reaching the river. My boots were filled with water, and I was drenched to the skin. The wind was howling, and dead ahead. Mr. Larsen was a powerful man, and a master-hand with oars. He sprang into the boat throwing off his hat, two coats, and began to remove his outer shirt saying: ‘You must excuse me, Doctor, but if I ever get you there, I shall have to strip to the skin.’ I understood the situation, and knew that the odds were against us and I fully expected that, notwithstanding his uncommon strength and skill, we would be compelled to land far below our starting point on the opposite side.” Thinking that she would have to cross the land to the Larsen house, they were saved by a small launch that came out to meet them in the raging storm to carry them the rest of the way. By the time they arrived, the wife had survived the crisis point, and so had Dr. Owens-Adair.

“I had most of my practice in that section,” she wrote, “and made many trips to that neighborhood.”

Carl Julius Hoffmann entered UOMS in 1902. A dedicated young student, he served as class president and graduated with the highest standing in his class. After an internship, he accepted an offer to take over the practice of a Dr. Longaker in Woodland, Washington. Within a week, he set up practice in the town with seven saloons and one church. Included in the deal he struck were a buggy and a team of horses, Trix and Pet.

Dr. Hoffmann bought a saddle horse to use when the roads became impassable. He had an office in the Bryant Building, where he practiced for 62 years. He lived next door to his upstairs office, carrying water and cutting his own wood for heating and cooking. Earle Bryant, who became a fast friend, had a pharmacy on the first floor. In the absence of a nurse, Earle would serve as anesthetist. Many of the injuries sustained in the mills and logging camps required immediate onsite surgery. Bryant remembered that there were some very tense moments in those days.

Hoffman’s fellow classmate and colleague, Dr. J. B. Blair of Vancouver, WA, wrote about Dr. Hoffman, “He has seen all of life’s panorama, from the cry of infancy to the parting sigh of old age. He has had no hour he could call his own. No room in his home has been exempt from the imperative call. The darker the night, the more howling the storm, the more likely was he to be needed and aroused from slumber to go to the bed of suffering. He has borne all temperatures, sweating in August suns, freezing in December blasts. Drowned with the rains and choked by dust, he has trudged here and there, hungry at noon, sleepy at midnight, while others, oblivious to care, were resting or being refreshed by food or sleep.

“To accumulate worldly goods is not and has never been Dr. Hoffman’s objective. No other person in the community has or could have sacrificed so much to charity as he. He has done all the good he could to all the people he could in all the ways he could, with little regard for remuneration, ever keeping foremost in his mind that which is just, that which is honest, that which is true. These precepts have governed his life.”

George Weirs King was born in 1852 [1845] in the state of New York. He was the ninth of the ten children of Revolutionary War veteran Cyrus King of Vermont and Louisa Duncan. He graduated from the medical department of the University of Michigan in 1877. He served as assistant surgeon at the university for one year, practiced in Kempton, Illinois for two and a half years, then in New York for one year, and then went to Chicago.

While in Chicago, Dr. King suffered a severe attack of pneumonia. After his recovery he was offered the position of surgeon at Marysville for the Montana Mining Company. He believed that the climate would prove beneficial to his health, so he accepted, arriving in 1883. He was committed to his position with the company, but he also maintained a general practice in Marysville and the surrounding country. In the 1880s and 90s it was a bustling town of three thousand residents, and was the center of gold mining in Montana.

Dr. King invented many appliances for his use in surgery and general practice. One was a device in which to set fractured limbs, and another was used to place injured men while raising them out of the mines. He took pride in surgery, and performed many difficult and important operations.

In 1892, he filed a patent with the U.S. Patent Office in Washington, D.C. for a fracture apparatus. “My invention is designed to obviate this difficulty by providing a portable apparatus by means of which the operator, without the aid of an assistant, can apply and maintain proper extension and counter-extension of, and support for, the broken limb, and, at the same time have free access to every part of it for the convenient application of the splints or bandages,” he wrote.

Dr. King served his patients faithfully until his death in 1929. His scrapbook is filled with vivid images of his inventions and the patients he treated: from a child who swallowed a penny whistle and men who suffered various injuries in the mines, to club feet and necrosis of the skull.

James W. Robinson and Ella Ford Robinson

Ella Ford was born in about 1857 to Colonel Nathaniel Ford, who had settled in Rickreall in Polk County in 1844. She and her younger sister, Angela, were the first women to graduate from the Willamette University Medical Department in Salem. Ella married fellow medical student James W. Robinson (1850-1938). When James graduated in 1878, the couple moved to Jacksonville, Oregon and opened a drugstore associated with his practice. Dr. Ford Robinson opened her own office, becoming the first woman physician to practice in southern Oregon. A notice in the Ashland Tidings read, “Dr. Ella Ford Robinson–diseases of women, a specialty. Office and residence at Judge Duncan’s, Jacksonville.” She barely had an opportunity to practice in Jacksonville; she died in childbirth not a year later. But James Robinson stayed in Jacksonville, where he carried on a busy practice.

Jacksonville became the cultural and commercial center of the region after gold was found in 1851, but conditions continued to be difficult for doctors and patients. In his memoirs, Dr. Robinson remembered that there were no phones; in the case of an illness or an injury, those in need of a physician would have to send a hired hand at the end of the work day for help. Many trips were fifty to 100 miles or more on bad roads through all sorts of weather.

Jacksonville was a pioneer town. Abigail Scott Duniway, a leading force in the suffragist movement, wrote of visiting Jacksonville: “I went to Southern Oregon in 1879, and while sojourning in Jacksonville was assailed with a show of eggs (since known in that section as "Jacksonville arguments") and was also burned in effigy on a principal street after the sun went down. Jacksonville is an old mining town, beautifully situated in the heart of the Southern Oregon mountains, and has no connection with the outside world except through the daily stagecoaches. Its would-be leading men are old miners or refugees from the bushwhacking district whence they were driven by the Civil War. The taint of slavery is yet upon them and the methods of border-ruffians are their heart's delight.” This is where Dr. Robinson carried out his career until his death.

Dr. Robinson was known as the last of the Jacksonville pioneer doctors. When he arrived in this small village, he felt he had found his paradise. He carried on his practice in Jacksonville until his death, having cared for families for miles in all directions.

Mary Purvine began her life as the daughter of New England Quakers. With one brother dead and another practicing law, Mary learned to do the farm work as well as take care of the household chores. “Thin and straight”, she said she was raised to know only that “it was a sin to lie, and that the worst thing in the world was a drunkard.” When her mother’s arm was fractured by a fall, a woman doctor came to set the bones. Mary announced immediately that she would become a doctor. She entered the Willamette University Medical Department in Salem, Oregon and graduated in 1899, the only woman in a class of four men.

After graduation, she set up a practice in Condon, Oregon, a town of 800 inhabitants. She tells this story of great danger and bravery: A man from a distant ranch located on the other side of the John Day River drove ten miles to the nearest phone to tell the doctor of the imminent birth of a child. Traveling five hours by horse and wagon in a blinding snow, Dr. Purvine and her driver descended into 30 Mile Canyon where they came to the river, traversing it by ferry. The ferryman gave directions to the ranch, located on a nonexistent road. Arriving at the “one room and lean-to,” she found the woman had already given birth. All was well with the mother and baby, but there was still the return trip to be made.

The ferryman was nowhere to be found when Dr. Purvine and her driver arrived at the river. A fourteen-year-old boy took them across, but landed with feet to go from the bank. Storm clouds were gathering, and more snow threatened. A swift slap of the reins sent the horses up the bank, leaving the wheels of the buggy in the river. After more tries, the wagon and horses lurched safely onto solid ground. The team needed to traverse an eleven-mile ascent before reaching the canyon. When they reached the flat, one of the horses gave out. The driver urged the horses on, finally finding the way to a house where they took an hour’s rest and some food. With little time left, they reached what they thought was the descent to the canyon, but they found only trackless snow. Using a lantern, the driver traipsed round and round until he finally found the way. Seventeen hours later, they made their way gratefully back to Condon.

The parents named their girl baby Mary in her honor. “She was cross-eyed and had a mean disposition,” said Dr. Purvine, “and she wasn’t paid for until after I was married, when we had installments consisting of a bushel of tomatoes weekly. No wonder I don’t care for sliced tomatoes.”

Esther Pohl Lovejoy and Emil Pohl

Dr. Esther Pohl Lovejoy was born in 1869 and grew up in a logging camp near Seabeck, Washington Territory, later living in Portland, Oregon. For a while, she could not decide between a career in theater and one in medicine, but she eventually chose medicine, saying it was “drama in its highest form.”

The woman doctor who delivered Lovejoy’s youngest sister became an inspiration for her to enter the University of Oregon's Medical School. Taking a year off to earn money, she finished in four years and graduated in 1894 with a medal for her strong academic achievement.

Esther married classmate Emil Pohl and followed him to the Gold Rush in Skagway, Alaska, where her brothers were suppliers to gold prospectors. They lived in a log cabin with a fireplace for warmth and candles for light. As the new doctors in town, their services were constantly in demand. Dressed in fur against the cold, Dr. Pohl made the rounds of her patients by dog sled. If she needed to drive far from the town, she engaged a native boy who knew the country.

The Pohls started a hospital in an old mule shack with the help of a renowned gambler who went by the name of Soapy, the self-proclaimed “King of the Klondike.” Soapy invited the new doctors to speak to a crowd of bushy-haired men in his saloon. After the speech, a hat was passed and over $3,000 in coins, dust and pledges was gathered. The Union Church of Skagway offered to sponsor the hospital, and with their help the old shack was restored, cleaned and painted. When it was finished it served only for surgery and serious cases. Most of the doctoring took place far from town.

On Christmas Day, Dr. Pohl’s brother Fred entered the town celebrations to demonstrate the usefulness of a bicycle in the snow. Two friends followed by dog-sled to make sure he could make it to the pass and back.

The three were never seen alive again; they were murdered on the Skagway-Dawson Trail. The Mounties found the bodies of Fred’s friends first. His body, buried in a shallow hole, was not found until the spring thaw. Dr. Pohl left Alaska for good, saying to her husband, “I won’t come back. I just don’t have the heart for any more Alaska.”

Dr. Emil Pohl stayed on continuing his practice in Skagway and the surrounding regions. While fighting an epidemic of encephalitis, he contracted the disease and succumbed. Esther returned once more to Alaska but only to retrieve her dead husband’s body.

Herbert Merton Greene , the oldest of nine children, was born in the Blue Ridge Mountains of North Carolina in 1878. It was there that a Mitchell County doctor, an army medic, inspired him to become a physician.

Greene moved out west and attended the University of Oregon Medical School, graduating in 1904. He completed post-graduate studies at the Vanderbilt Clinic in New York, and served internships at the Multnomah County Hospital, the Coffey Hospital and the North Pacific Sanitarium.

Greene's first year as a physician was spent making house calls by horse and buggy in LaCrosse, Washington. Located along the OR&N railway line, LaCrosse was a very small town of fewer than 500 inhabitants. He also worked as the proprietor's assistant in LaCrosse’s saloon and pharmacy.

After his experiences as a country doctor he decided that the rural life was not for him. He built a home on the basalt cliffs of the Willamette River and carried on a successful practice in Portland, Oregon until he passed in 1962.

Forbes Barclay was born in Scotland in 1812. He followed his father, John Barclay, into medicine. He studied at the University of Edinburgh, Scotland and received his diploma from the Royal College of Surgeons in London in 1838. He was appointed Surgeon to the Honorable Hudson’s Bay Company in 1839. Traveling around Cape Horn, he arrived at Fort Vancouver in 1840. Dr. Barclay was immediately put in charge of the hospital at the fort, which was described as an old shed outside the stockade. He also attended the settlers and the Indians in the region.

After 10 years, Barclay moved to the young settlement of Oregon City, a town established by the Hudson’s Bay Company. He had decided to “cast his lot with the Americans.” His practice ranged as far as “St. Helens on the Columbia, the Waldo Hills to the South, and Foster’s in the Cascade Mountains.” He traveled on horseback and by canoe navigated by an Indian guide. When he died at the age of 61 in 1873, the community mourned their loss. He was thought of as a “kindly, skillful and devoted physician among our midst.”

Developments in transportation and technology have made immeasurable improvements to rural medical care in the Pacific Northwest, but rural practice today still has similarities to that of the 19th century. As the OPB production of The Oregon Story: Country Doctors, Rural Medicine explains, “Like their predecessors, today's country doctors do things a little differently, but they also defy the stereotypes. They are some of the finest health care providers in the state and still tend to practice a sort of whole-person care rarely seen in urban medicine. Most studied medicine in a big city — probably even grew up in one — and have chosen a rural practice because they appreciate the lifestyle and community. In many cases, the doc might not be an actual "doctor," but a skilled professional with different credentials. And although most patients rely on insurance to cover health care costs, a rural doctor might still be persuaded to accept a payment of a cord of firewood or a side of beef.”

Oregon Health & Science University is dedicated to improving the health and quality of life for all Oregonians through excellence, innovation and leadership in health care, education and research.

© 2001-2020 Oregon Health & Science University. OHSU is an equal opportunity affirmative action institution.


Robert Jeffrey


Robert Jeffrey was a British seaman who was pressed into service on a Royal Navy sloop called the Recruit commanded by Captain Lake during the Napoleonic Wars. Impressment was a form of conscription practiced by the Royal Navy at the time, so naturally, Jeffrey was resistant to serving on the Recruit.

Jeffrey was caught stealing beer from the ship’s store. As punishment, Captain Lake decided to maroon Jeffrey on a desert island without any supplies. When Lake’s superiors found out what had happened, they ordered him to go back and retrieve the stranded seaman. Upon returning to the island, they found no trace of him and assumed he was dead. An inquiry was held, and Captain Lake was dismissed from his post.

Jeffrey, in fact, had managed to survive by eating limpets and drinking rainwater. After nine days and some unsuccessful attempts to flag down passing ships, Jeffrey was rescued by an American ship. The ship took him back to Massachusetts, where he lived for a number of years. Only later did the British Government find out that Jeffrey was still alive.


Salary in Victorian era

Milk-woman’s wage – 9s a week

Dentist’s charge for 2 fillings – 10s 6d

Top wage of a woman operating a sewing machine – 16s

Average salary of a coffee-stall keeper, general labourer, or female copy clerk in the City – £1 per week

Min. cost of a funeral – £4

Live-in maid’s earnings were £6 a year. General servant – £16 annually.

A full set of false teeth cost £21

A butler – £42 per annum while Post Office clerk – £90 a year.

Anglican parson – £140 a year

The Governor of the Bank of England – £400 p.a.

1. According to Porter (Porter, Dale H., The Thames Embankment: Environment, Technology, and Society in Victorian London), in the mid-1860s workers in London received the wages below for a 10-hour day and six-day week:

  • common laborers 3s. 9d.
  • excavators wearing their own “long water boots” 4s. 6d.
  • bricklayers, carpenters, masons, smiths 6s. 6d.
  • engineers 7/6 (about £110 a year)
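
The "7/6 (about £110 a year)" annotation above can be sanity-checked with pre-decimal currency arithmetic (12 pence = 1 shilling, 20 shillings = £1). A minimal sketch: the six-day week is from the source, while the figure of roughly 49 paid weeks per year is an assumption chosen here purely to show how a daily wage of 7s 6d lands near £110 a year.

```python
# Pre-decimal British currency: 12 pence (d) = 1 shilling (s), 240d = 20s = £1.
# Assumption (not from the source): about 49 paid weeks per year.

def daily_wage_to_pounds_per_year(shillings, pence, days_per_week=6, weeks_per_year=49):
    """Convert a daily wage in shillings/pence to pounds per year."""
    pence_per_day = shillings * 12 + pence
    pence_per_year = pence_per_day * days_per_week * weeks_per_year
    return pence_per_year / 240  # 240 pence to the pound

print(daily_wage_to_pounds_per_year(7, 6))                     # 110.25 - near the quoted £110
print(daily_wage_to_pounds_per_year(7, 6, weeks_per_year=52))  # 117.0 at a full 52 weeks
```

At a full 52-week year the same daily wage comes to £117, so the quoted £110 presumably allows for some unpaid weeks.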

2. These wages reflect weekly pay in the mid- to late ’60s (various sources listed below)

  • Mail Coach Guard … 10/0 + tips
  • Female telegraph clerk … 8/0
  • London artisans … 36/0
  • London laborers … 20/0
  • Farm hands … 14/0
  • Sailors … 15/0
  • Seaman on steamers … 16/4

3. In better earning professions, salaries were mentioned in annual amounts.

A box in The Royal Opera House – £8,000. Lord Derby’s income was £150,000. The Duke of Westminster’s annual income topped them all at a cool £250,000.

There is more information available here: Victorian Black People Jobs. For salary information, see: Bowley, A. L., Wages in the United Kingdom in the 19th Century. Cambridge: University Press, 1900; Burnett, John, A History of the Cost of Living. Harmondsworth: Penguin, 1969; Hayward, Arthur, The Days of Dickens. New York: E. P. Dutton & Company, 1926.


American Gravestone Evolution - Part 1

Having preserved monuments for the last 15 years in the country’s oldest cemeteries, I have been intrigued by the history I uncover when taking the time to look for clues in the landscape around me, and in the stones themselves. A historical graveyard, and all that goes into it, is a kind of ancient puzzle that I hope will intrigue you as it does me.

By understanding the craftsmanship of early stone carvers and tracing the sources for the different kinds of stone they used, I gain appreciation for the resourcefulness of the people of the time. Noticing the progression of fads and styles in the religious and iconic symbolism of gravestones as I work on them puts another piece of the puzzle together, giving indications about the fears and hopes of our colonial ancestors. Taking time to appreciate the aesthetic thinking that went into the early ‘planned’ cemeteries, and recognizing the ingenuity of these early ‘landscape architects’ has provided a source of endless fascination for me.

By looking at cemeteries with a mason’s eye, an artist’s heart and a historian’s curiosity, I have gained admiration for the artistry of our colonial ancestors, and an appreciation for the challenges they faced and the vision they possessed to create these living outdoor museums. It would be impossible to relate all that I have learned, but I will attempt to give an overview of the historical progression of graveyards in Colonial and Victorian America and point the reader to particular examples of places to visit, that hold our history not only in story, but in stone.

Historic graveyards can be found in nearly every part of America. They range greatly in size, shape, and style, depending on the region, landscape, and religious influences upon which they were built. Burying grounds were an important aspect of nearly every colonial American town, and were most often located directly adjacent to a church, meeting house, or beside the town’s green.

In the earliest colonial period, the local landscape and availability of land had a great influence on the exact location of graveyards. Sometimes hilly terrain was selected for the site of a burying ground, as the land was difficult to farm or build on. Rocky locations were traditionally known to be difficult to farm, much to the future grave-digger’s dismay.

Unfortunately, very few original records remain from the very early colonial era, so a great deal must be determined from what remains: the gravestones themselves.

Early colonial graveyards tended to be used, or filled, in order of need, not sold in lots to families. Gravediggers may have purposely left spaces for relicts or consorts (the spouse who was still living), but the earliest graveyards show evidence of people making do with what was available. Little planning, but much care, went into these burying places, which indicates a more rustic, less moneyed population.

The very earliest European settlers had no professional stone workers to hire when their loved ones died. They would create simple wooden markers or wooden crosses to mark recent burials. Oftentimes fieldstones and crudely carved small rocks were employed, sometimes with names or initials scratched in. By the middle of the 1600s, skilled stone workers began to migrate to America, bringing more artistry and craftsmanship, but using the materials at hand.

During the colonial era gravestones tended to be of a smaller size, and most often created from softer types of stone such as sandstone and slate, which were easier to quarry, cut and carve. The primary style of gravestone was called a tablet stone, meaning a single piece of cut stone, placed vertically and upright. An average tablet stone had about one third of its mass underground.

Boston was the epicenter of gravestone carving in colonial America, and a place where the trade and skills had been directly imported from overseas. In most parts of America, throughout the 17th and 18th centuries, gravestone carving was not a full-time occupation, as the work was too sporadic for a carver to fully earn a living. Many of the early carvers worked part-time, and may also have worked as masons, carpenters, or farmers.

By the late 1600s in Boston, however, the population became large enough to support a few full-time gravestone carvers. The Boston area also held a wealth of extremely high-quality slate, which was both easy to carve and very resistant to weathering. Due to the size of Boston’s population and the quality of the stone, Boston slate colonial tablet stones were carved in large numbers and shipped to distant locations along the entire eastern seaboard. I have personally observed Boston slate gravestones as far away as Charleston, SC, and Savannah, GA.

The colonial gravestones’ shapes, imagery, and symbolism were at first transported from much older influences in Europe. But once in America, they quickly adopted many varied and regional styles. By the middle of the 1700s, depending on the religious influences, the materials available, and the gravestone carver’s own background, the once simple, stoic stone inscriptions had flourished into elaborate, ornately shaped and carved headstones.

By the early 1700s, Newport, Rhode Island, had two of its own full-time gravestone carvers, John Bull and John Stevens. Founded in 1705, John Stevens’s business became an important influence on other carvers, as his stones were created in large numbers and installed across a wide area. Although now run by another family, NAME of Company -- is known today as the oldest continually operating gravestone shop in America.

The Newport slate, however, is not quite as enduring as the Boston slates and often eroded over time, causing the carvings and inscriptions to fade, sometimes to the point of being very difficult to read. While Boston and Rhode Island exported headstones to other parts of colonial America, other regions remained more local and relied upon their own resources to honor their departed family members.

Before the railroads connected the New England towns, it was very difficult and time-consuming to move stone, which weighs around 150 pounds per cubic foot, so many other kinds of stone were employed regionally during the colonial era. With the exception of the Boston slates, which were shipped down the coast, most gravestone carvers worked with whatever stone material was locally available. Stones needed to be soft enough to split and carve with hand tools, but durable enough to resist erosion.

Connecticut, my home state, has its own long colonial history that can be traced through gravestone study. There was almost no slate at all for gravestones in Connecticut; however, a huge amount of sandstone was available for use. Sandstones are formed when bodies of fresh water dry up and the sand grains are mixed with varying minerals to become cemented together into a matrix. Given enough pressure underground, and after a long stretch of geological time, this sand mix will become stone. The higher the clay content, the weaker and less durable the stone; the higher the silicate content, the stronger and more durable the stone will be.

In Connecticut there are abundant sandstone veins running from the shore in the south to Longmeadow, MA, and beyond in the north. The farther north, the better the quality of the stone, and the clearer and more enduring the carvings are today.

The earliest gravestone carver in Connecticut was George Griswold, from Windsor, CT. He likely received training overseas, but arrived in Windsor in the middle of the 1600s already an expert stone carver. The sandstone he worked is known as brownstone, a slang term for a sandstone tending to be brown in color.

The Windsor-area sandstone he worked was a brown-red color with a very fine grain, and was relatively high in silicates. It tends to weather minimally, and Griswold’s concise lettering on stones dating back to as early as the middle to late 1600s can be easily read today on nearly all his stones. These stones still stand in Windsor, CT, in the oldest part of the Palisado Cemetery, in the historic district.

A great stone for study, the oldest inscribed readable gravestone in America carved by Griswold, is one I have helped to preserve: the box crypt tomb in Palisado Cemetery. Although it may have been backdated (carved at a slightly later date), it clearly reads ‘Rev. Ephraim Huit, who died 1644’. Intriguingly, there are two faces of inscription on the tomb, the opposite side having been carved much later, in the 1800s. Griswold’s expertise is evidenced by the fact that the more recent carving is more eroded than the original stone face on the southern side.

About 15 miles to the south of Hartford lies Portland, CT. In colonial times the town was part of Middletown, then called East Middletown due to its location just east of the Connecticut River, where the sandstone cliffs can still be seen today, lining the eastern edge. Sandstone was quarried and carved into gravestones in this area from the 1600s. Two stone-carving families, the Stancliffs and the Johnsons, would continue to work this stone with increasing levels of craftsmanship throughout the 1700s.

Like the stones Griswold used, the other early stones from this region tend to weather less and are much more durable than many of the more recently quarried brownstones. By the middle to late 1800s, the Portland quarries were said to have become the largest sandstone quarrying operations in the world, shipping the stone by boat and train all over America. The famous brownstone buildings in New York City were created from this stone.

In eastern Connecticut the material of choice was a type of stone called schist. Found in large amounts in Bolton, East Hartford, and Norwich, it is very rare as a gravestone material in most other regions. Schist is a foliated metamorphic rock composed largely of mica minerals. Although some schist gravestones erode and lose their carved details and inscriptions fairly quickly, others from as early as the middle of the 1700s still hold concise carved details and are easily read today.

In Wethersfield, VT, large quantities of a relatively rare gravestone material were used, thanks to a very active soapstone quarrying operation. Soft enough to scratch with a fingernail, soapstone would, common wisdom dictates, weather quickly when placed outdoors.

Though it is a very soft material, composed largely of talc (which can be made into baby powder), it is also very high in silicates, which gives it great resistance to acids such as acid rain. Many of these soapstone markers are still in nearly perfect condition today, with clear and easily read inscriptions.

Although marble was to become the stone of choice during the Victorian era, I have encountered many early marble gravestones in upstate NY, some dating to before the Revolutionary War. Clearly marble was worked in some regions during the late colonial period. My investigation into the source of this stone is ongoing, but I am becoming convinced that the stone used in colonial times might have originated in Dorset, Vermont, possibly the first marble quarry in America, which began operations in 1785. Though marble was being used in only some regions in the late 1700s, it would soon surpass all other types as the stone of choice for gravestones for much of the 1800s.

Marble is composed primarily of calcium carbonate. It is formed when limestone, a sedimentary rock composed of crushed seashells, receives great amounts of heat and pressure underground for thousands of years. Because of this transformation, marble is known as a metamorphic rock.

Very white marble is composed of almost pure calcium carbonate. This type was most often sought after to create gravestones with detailed carving, and was indeed the ideal material to carve into sculpture. It became so popular that Carrara marble, for example, was imported from Italy for this purpose for use by wealthy patrons.

The biggest problem with marble, however, is its inability to resist acids, such as acid rain, in a modern outdoor environment. Ironically, though very expensive and sought after by affluent families, inscriptions on marble tombstones are today often faded into obscurity.

Early Cemeteries

By the early 1800s, many inner-city church burying places were already becoming overcrowded. Urban sprawl had spread around the churches, and a lack of maintenance and care led to many complaints about vagrants, grave robbing, and theft of funerary objects. Due to these factors and increasing health concerns, the rural cemetery was born.

“Cemeteries” (from a Greek word meaning “sleeping place”) were planned burial places, situated intentionally away from population centers, either on the outskirts of the city or in the adjacent suburbs. This allowed for planning, surveying, and selling of family plots in advance of need. A planned, neatly arranged cemetery allowed larger family monuments to be centered on a plot with many future burial spaces.

One of the earliest planned cemeteries in America is located in New Haven, Connecticut, and is today called Grove Street Cemetery. By the late 1700s the historic burial ground on the New Haven Green was already becoming overcrowded, and many issues were raised about the need for new burial provisions. In 1797, the New Haven Burying Ground was incorporated, and would become known as The Grove Street Cemetery. It featured plots permanently owned by individual families, complete with ornamental plantings and even paved, named streets and avenues.

In the early 1800s the Center Church on the Green was surrounded by the old, historic graveyard. Later the church wanted to expand, so in order to make room for the new much larger church structure, they planned to move the entire graveyard, stones and human remains, to the newly founded New Haven Burying Ground. Not everyone was happy about moving the mortal remains of many of the founding families of New Haven, and an unusual compromise was reached. The new Center Church would be built directly above the oldest part of the graveyard.

Today this original graveyard can be found in the church’s basement. Known as the New Haven Crypt, it is open to the public during visiting hours. I have personally been involved with the ongoing preservation efforts at the Crypt, which has experienced many deterioration issues related to a high water table in the area.

The desire to move, or remove, historic burying places was not limited to New Haven, and was in fact a very widespread practice in American urban areas throughout the 1800s.

The Granary in downtown Boston, which holds five signers of the Declaration of Independence, had even been given a street number in advance of real estate developers’ attempt, in the mid-1800s, to move the entire graveyard to the newly formed Mount Auburn Cemetery just across the Charles River in Cambridge.

Luckily this reckless idea was not adopted, as the American historic preservation movement had begun; it would go on to battle at the eleventh hour, on many future occasions, to protect landmarks from the wrecking ball.

Mount Auburn is one of the earliest examples of the planned rural cemetery movement. This kind of cemetery would incorporate scenic winding roads with planned landscaping, ponds, fountains and rare trees. Within the next few decades, nearly every city in America would follow suit. These planned cemeteries allowed for pre-need lots sales, which also facilitated larger family monuments. Technological advances in machinery, quarrying, cutting and the manufacturing of stone, also set the stage for larger, more ornate and complicated monumental installations that became the standard for the wealthy classes of the period throughout the United States.


Escape from slavery, life in New Bedford, and work with the American Anti-Slavery Society

Douglass moved about Baltimore with few restrictions, but that privilege came to an end when he decided to attend a religious meeting outside of Baltimore on a Saturday evening and postpone paying Auld his weekly fee. The following Monday, when Douglass returned, Auld threatened him. After that encounter, Douglass was determined to escape his bondage. He escaped in September 1838 by dressing as a sailor and traveling from Baltimore to Wilmington, Delaware, by train, then on to Philadelphia by steamboat, and from there to New York City by train. Black sailors in the 19th century traveled with documents granting them protection under the American flag. Douglass used such documents to secure his passage north with the help of Anna, who, according to family lore, had sold her feather bed to help finance his passage.

New York City was a dangerous place for enslaved people seeking freedom. Numerous slave catchers traveled to the city to track down those who had escaped. Many locals, Black and white, were willing, for money, to tell the authorities about people trying to escape enslavement. For his own protection, Douglass (still months from assuming that name) changed his name from Frederick Bailey to Frederick Johnson. A chance meeting with Black abolitionist David Ruggles led Douglass to safety. Anna arrived in New York several days later, and the two were married by the Reverend J.W.C. Pennington.

At Ruggles’s recommendation, the couple quickly left New York City for New Bedford, Massachusetts. Ruggles had determined that New Bedford’s shipping industry would offer Douglass the best chance to find work as a ship caulker. In New Bedford the couple stayed with a local Black married couple, Nathan and Polly Johnson. Because many families in New Bedford had the surname Johnson, Douglass chose to change his name again. Nathan Johnson suggested the name Douglass, which was inspired by the name of an exiled nobleman in Sir Walter Scott’s poem The Lady of the Lake. The newly minted Frederick Douglass earned money for the first time as a free man. However, despite Douglass’s previous work experience, racial prejudice in New Bedford prevented him from working as a ship caulker (white caulkers refused to work with Black caulkers). Consequently, Douglass spent his first years in Massachusetts working as a common labourer.

Douglass remained an avid reader throughout his adult life. When he escaped to New York, he carried with him a copy of The Columbian Orator. In New Bedford he discovered William Lloyd Garrison’s abolitionist newspaper, The Liberator. Inspired by it, Douglass attended a Massachusetts Anti-Slavery Society convention in Nantucket in the summer of 1841. At the meeting, abolitionist William C. Coffin, having heard Douglass speak in New Bedford, invited him to address the general body. Douglass’s extemporaneous speech was lauded by the audience, and he was recruited as an agent for the group.

As an agent of both the Massachusetts Anti-Slavery Society and the American Anti-Slavery Society, Douglass traveled the country promoting abolition and the organizations’ agenda. He and other persons who had escaped conditions of enslavement frequently described their own experiences under those conditions. The American Anti-Slavery Society supported “moral suasion” abolition, the belief that slavery was a moral wrong that should be resisted through nonviolent means. Douglass strongly promoted this philosophy during the early years of his abolitionist career. In his speech at the 1843 National Convention of Colored Citizens in Buffalo, New York, Black abolitionist and minister Henry Highland Garnet proposed a resolution that called for enslaved people to rise up against their masters. The controversial resolution ignited a tense debate at the convention, with Douglass rising in firm opposition. His belief in moral suasion would repeatedly place him at odds with other Black abolitionists during this phase of his career. Work as an agent provided Douglass with the means to support his family. He and Anna had five children: Rosetta (born 1839), Lewis (born 1840), Frederick, Jr. (born 1842), Charles (born 1844), and Annie (born 1849).


The History of Mother Jones

Mary Harris Jones, this magazine’s namesake, crafted a persona that made her a legend among working people. So why is so little about her remembered today?

By Elliot J. Gorn

Upton Sinclair knew Mother Jones. The author of the best-selling exposé of the meatpacking industry, The Jungle, even made her a character in one of his novels, a lightly fictionalized work called The Coal War, which chronicled the bloody Colorado coal strike of 1913-14: “There broke out a storm of applause which swelled into a tumult as a little woman came forward on the platform. She was wrinkled and old, dressed in black, looking like somebody’s grandmother; she was, in truth, the grandmother of hundreds of thousands of miners.”

Stories, Sinclair wrote, were Mother Jones’ weapons, stories “about strikes she had led and speeches she had made… about interviews with presidents and governors and captains of industry… about jails and convict camps.” She berated the miners for their cowardice, telling them if they were afraid to fight, then she would continue on alone. “All over the country she had roamed,” Sinclair concluded, “and wherever she went, the flame of protest had leaped up in the hearts of men… her story was a veritable Odyssey of revolt.”

When Sinclair wrote these words, Mother Jones was one of the most famous women in America. Articles about her regularly appeared in magazines and newspapers, and for many working Americans, she had achieved legendary, even iconic, status. Yet the woman for whom Mother Jones magazine is named is scarcely known any longer. Some might recognize her name, know something about her activism on behalf of working people, or even recall her famous war cry: “Pray for the dead, and fight like hell for the living.” But few remember much about Mother Jones, who battled corporate presidents and politicians, who went to jail repeatedly for organizing workers, and who converted tens of thousands of Americans to the labor movement and the left.

As I worked on a recent biography of Mother Jones, however, I came to appreciate her significance for our own times. With dramatic speeches and street theater, she organized workers, women, and minorities, drawing public attention to their hardships and giving them a voice. Mary Jones’ greatest achievement may have been creating the persona of Mother Jones. She was born Mary Harris in Cork, Ireland, in 1837. When she was barely 10 years old, she witnessed the horrors of the potato famine, which drove her family from their homeland to Toronto, Canada. Her parents established a stable, working-class household, and young Mary learned the skills of dressmaking, and also trained to be a teacher, a high ambition for an Irish immigrant woman of her day.

Wanderlust struck her in early adulthood–she taught for a few months in Monroe, Michigan, then moved on to Chicago, and another few months later to Memphis, Tennessee. There, on the eve of the Civil War, she met and married George Jones, a skilled foundry worker and a member of the International Iron Molders Union. They had four children together. In 1867 a yellow fever epidemic struck Memphis, killing George and their four children. Now a 30-year-old widow, Jones returned to Chicago and dressmaking, where her tiny shop was burned out in the great fire of 1871. For the next quarter century, she worked in obscurity. As the new 20th century approached, Mary Jones was an aging, poor, widowed Irish immigrant, nearly as dispossessed as an American could be. She had survived plague, famine, and fire, only to confront a lonely old age.

But then she invented Mother Jones. Or, to put it more precisely, she began to play a role that she and her followers made up as they went along. By 1900, no one called her Mary, but always Mother. She wore antique black dresses in public, and she began exaggerating her age.

The new role freed Mary Jones. Most American women of that era led quiet, homebound lives devoted to their families. Women, especially elderly ones, were not supposed to have opinions; if they had them, they were not to voice them publicly–and certainly not in the fiery tones of a street orator.

Yet by casting herself as the mother of downtrodden people everywhere, Mary Jones went where she pleased, spoke out on the great issues of her day, and did so with sharp irreverence (she referred to John D. Rockefeller as “Oily John” and Governor William Glasscock of West Virginia as “Crystal Peter”). Paradoxically, by embracing the very role of family matriarch that restricted most women, Mother Jones shattered the limits that confined her.

For a quarter of a century, she roamed America, the Johnny Appleseed of activists. She literally had no permanent residence. “My address is like my shoes,” she told a congressional committee. “It travels with me wherever I go.” She was paid a stipend by the United Mine Workers and, for a few years, by the Socialist Party. But she always felt free to work in whatever cause most needed her–striking garment workers in Chicago, bottle washers in Milwaukee breweries, Pittsburgh steelworkers, El Paso streetcar operators, Calumet copper miners. She helped workers fight not just low pay, 12-hour days, and horrifying mortality rates, but also the servitude of company stores and company housing. She also spoke out in defense of IWW leaders on trial for murder in Boise (she was one of the original signers of the Industrial Workers of the World charter), labor activists imprisoned in California, and Mexican revolutionaries in Arizona.

Mother Jones lost as many battles as she won, but still she got results. She was by far the most famous and charismatic organizer for the United Mine Workers. When she began working for that fledgling union in the 1890s, it had 10,000 members; within a few years, 300,000 men had joined, and she organized many of their wives into “mop and broom” brigades, militant women who fought alongside their husbands.

The moniker “Mother” Jones was no mere rhetorical device. At the core of her beliefs was the idea that justice for working people depended on strong families, and strong families required decent working conditions. In 1903, after she was already nationally known from bitter mine wars in Pennsylvania and West Virginia, she organized her famous “march of the mill children” from Philadelphia to President Theodore Roosevelt’s summer home on Long Island. Every day, she and a few dozen children–boys and girls, some 12 and 14 years old, some crippled by the machinery of the textile mills–walked to a new town, and at night they staged rallies with music, skits, and speeches, drawing thousands of citizens. Federal laws against child labor would not come for decades, but for two months that summer, Mother Jones, with her street theater and speeches, made the issue front-page news.

The rock of Mother Jones’ faith was her conviction that working Americans acting together must free themselves from poverty and powerlessness. She believed in the need for citizens of a democracy to participate in public affairs. Working families, Mother Jones argued, possessed vast, untapped powers to fight the corporations that bound them to starvation wages and the corrupt politicians who did the businessmen’s bidding. But only strong, democratic organizations of citizen-activists, she felt, could achieve real egalitarian change. So, as we reclaim the memory of this great American, what was her legacy for the 21st century? Certainly some of her impassioned rhetoric would seem overheated in the cool medium of television. And in a world where oratory is a lost art, her speeches today might come across as over-blown and strident, even to many progressives.

Her agenda was also limited, even by the standards of her time. Mother Jones opposed giving the vote to women–or, to be more precise, she believed that suffrage was a false issue, a bourgeois diversion from the real problem of worker exploitation. She argued that only powerful organizations of workers–industrial unions–could bring justice. And while she helped organize women in various trades, she believed that working-class women were better off in the home than having their labor exploited.

In a sense, Mother Jones’ greatest strength was also her fundamental weakness: She saw the world primarily through the lens of class. Her single-mindedness sometimes blinded her to the unique issues facing women and minorities. Yet such myopia might help bring a little clarity to our own times. She offers a vivid reminder of what remains among the most underacknowledged issues of our day: that America is a class-riven society, where the wealthy have grown obscenely rich as working people have fallen further behind.

Here, Mother Jones’ voice would have risen loud and clear. Her memory evokes the great American tradition of protest. It reminds us that passion still matters, and that a well-crafted symbol can offer inspiration, emboldening us in a world where the possibility of meaningful change sometimes seems beyond our reach.

Mother Jones’ founders envisioned a magazine devoted to a new brand of socially conscious journalism — one that took on corporate as well as political power. Twenty-five years later, that mission remains as timely as ever.

By Adam Hochschild
May/June 2001 issue

When the first issue of Mother Jones arrived back from the printer 25 years ago, the 17 of us then on the magazine’s staff eagerly clustered around to rip open the boxes and touch and feel the printed pages at last. We were then working in cramped quarters above a San Francisco McDonald’s, and the smell of frying burgers drifted up from below. We would have been amazed to know that the magazine would still be here, some 200 issues and several offices later. Multinationals like McDonald’s endure forever, it seems, while dissenting magazines flare up, attract a little attention, and then die. While copies of Mother Jones may not blanket the world today quite as thoroughly as do Big Macs, more than 165,000 households will receive the issue you are reading, and the magazine’s Web site logs 1.25 million page views each month.

None of us here a quarter century ago could have dreamed of the World Wide Web; in fact, for the first few years the magazine was even set in hot type, a 19th-century technology using molten lead. Look at an early issue of Mother Jones under a magnifying glass and you’ll notice the subtly irregular pits and flecks in the letters. Printing purists feel about hot type the way rail buffs feel about steam engines. But despite changes in how the magazine is produced, the causes it covers and its passion for justice are very much the same.

Mother Jones was born in a time of upheaval. It was early 1974 when several of us first met in the San Francisco living room of the late journalist and activist Paul Jacobs to begin planning the magazine. We were still living in the afterglow of the 1960s, when the civil rights and antiwar movements had put hundreds of thousands of Americans into the streets, shaken the country to its core, brought an end to legal segregation, and helped force U.S. withdrawal from the bloody, unjust war in Vietnam.

Although these crusades were fragmented or spent by the early ’70s, it was still a heady time politically. The movements for environmental protection and for women’s rights had just been born, or, more correctly speaking, reborn. The language of progressive politics had deepened. People who dreamed of a more just society now began to understand that the personal was also political, and that politics also included the health of our fragile and much-abused planet. In a sense, it seemed as if the ’60s were still going on, with new strains of activism in the air and new political earthquakes to come. We were, perhaps, a bit too naive about the remarkable staying power of the American political and corporate system.

Something else was in the air in 1974. Two enterprising young Washington Post reporters had uncovered the Watergate scandal; when Richard Nixon resigned in August of that year, investigative journalism had changed the course of history. For anyone who believed in the power of the printed word, it was an exhilarating moment. And in the late ’60s and early ’70s, cities throughout the country were giving birth to alternative newspapers, many with a strong progressive bent. It was among reporters for this new generation of weeklies that Mother Jones found many of its best writers.

Up until that time, American investigative journalists had traditionally targeted politicians. We thought the country was ready for a magazine of investigative reporting that would focus on the great unelected power wielders of our time — multinational corporations. And we wanted that reporting to carry far. That meant it had to be a magazine that was well written: For our very first issue, Jeffrey Klein, one of the editors, found a piece by Li-li Ch’en that ended up winning a National Magazine Award. It also meant a magazine that would attract the eye: Louise Kollenbaum, our art director, designed a publication that would be a home for first-rate photographs and artwork. And finally it meant a magazine with the careful business planning necessary to take us well beyond the relatively small readership of the older left-leaning periodicals. Richard Parker, who worked as both editor and publisher, saw to it that Mother Jones took the best of what could be learned from the world of commercial publishing. Two of the talented young writers who first appeared in Mother Jones during the 1970s, Doug Foster and Deirdre English, each later went on to spend more than five years as the magazine’s top editor.

Once launched, the magazine took about a year and a half to fully hit its stride. It was clear when that happened, in the late summer of 1977. Mark Dowie was business manager of Mother Jones. In his spare time, he had written and published one piece in the magazine. One day an insurance investigator he knew asked him, “Have you heard about the Ford Pinto?” The Pinto, then the best-selling subcompact car in America, had a reputation for bursting into flames when rear-ended at low speeds. Dowie’s investigation yielded an extraordinary tale. Not only had Pinto crashes killed at least 500 people and painfully injured many more, but even before the first Pintos came off the assembly line, company engineers had warned management that the gas tank was dangerously close to the rear of the car. Ford executives then projected that it would cost them more money to shut down and retool their assembly line than to pay off the damage claims from the anticipated deaths and injuries. Dowie obtained the memo where they made these cost-benefit calculations.

Dowie’s story won many awards and was picked up by major newspapers, TV networks, and talk-radio programs. And that is how many of the magazine’s stories have had the greatest impact: by being picked up in the establishment media, which are usually too timid to launch Mother Jones-style investigations, despite their vastly greater resources.

The Pinto exposé was also the first time all of us at the magazine tasted the greatest pleasure of working at a place like this — hearing your enemies denounce you. Pressed by dozens of reporters for comment, Ford issued a statement claiming that Dowie’s story was all wrong, filled with “distortions and half-truths.” Several months later, racing to forestall a government safety hearing, Ford recalled 1.5 million Pintos for repairs.

Not long after this, we were paid a tribute of a different sort. It had never surprised us that Mother Jones annoyed repressive governments — our writers had had copies of the magazine confiscated from their baggage at Soviet airports and at Checkpoint Charlie in Berlin, and had been barked at by government officials and U.S. diplomats in places like El Salvador. But after a number of our stories irked authorities in Washington, the Internal Revenue Service launched an investigation into the magazine’s nonprofit status. And once the Reagan administration came into office, the probe took a harsh turn. The IRS claimed that even though Mother Jones lost money every year, it should pay taxes on the income it received from sources like advertising. This vendetta was so absurd that many mainstream newspapers ran editorials in our defense. The IRS finally dropped the case, but not until it had cost us huge legal bills.

Dozens more corporate exposés followed the Pinto story. In 1979, a team of writers put together a prize-winning package of stories on “dumping” — the unloading on Third World countries of pesticides, medicines, and other products banned in the United States as unsafe. The stories’ impact rippled throughout the world, and lawmakers in three countries introduced bills outlawing dumping. No one used the word globalization in those days, but you can’t cover U.S. corporate malfeasance without following the story abroad. Today that is more true than ever.

Mother Jones has also remained a strong voice for social justice: Racial discrimination, women’s rights, environmental justice, and the plight of immigrant farmworkers are all issues you will find covered in the magazine from its first year of publication to the present. Another major theme over the years — from investigations of costly, useless weapons programs in the Carter and Reagan military budgets to the U.S. Arms Trade Atlas on today’s Mother Jones Web site — has been the bloated American military budget and the way the United States uses its superpower influence overseas.

Although the magazine’s values have remained constant over the last quarter century, the world it exists in has changed enormously. The gap between rich and poor has grown wider — worldwide and in our home city of San Francisco, where the silicon boom has filled the streets with SUVs and has pushed rents far beyond what artists or the poor can afford. And while big money has always called the tune in American politics, the money has become bigger than ever and its influence ever more blatant. In 1996, the magazine launched the Mother Jones 400, an investigation of the largest donors to political campaigns. The latest MoJo 400, which appeared in the March/April issue, examined the business sectors that financed the campaign of George W. Bush — and what they expected in return.

American journalism has also changed markedly between 1976 and 2001. Twenty-five years ago an exposé that showed how a major corporation’s products injured people was certain to outrage readers; we could be sure hundreds of them would write to their members of Congress or join a boycott campaign. But in the electronic age, people often feel they are drowning in information. The investigative journalist must meet a higher standard. He or she must not only provide crucial detail that cannot be found elsewhere, but must tell the story in such a way that readers cannot put the magazine down. And sometimes even that is not enough to force citizens or governments into action. Look at the long delay before Europe and the United States intervened, ever so reluctantly, in the former Yugoslavia — and didn’t intervene at all to stop the genocide in Rwanda.

Since our birth in 1976, control of the American mass media has become ever more centralized. When our friend Ben Bagdikian, former dean of the Graduate School of Journalism at Berkeley, published his 1983 book, The Media Monopoly, it was subtitled A Startling Report on the 50 Corporations That Control What America Sees, Hears and Reads. In each subsequent edition, Bagdikian jokes, he has had to reduce the number of corporations; it is now down to six. All of this makes alternative, noncorporate news sources like Mother Jones more crucial than ever. One thing you can be sure of is that the magazine will never be part of AOL Time Warner.

Yet one of this country’s great paradoxes is that new forms of media monopoly and of free speech evolve at the very same time. If the 17 staff members who cheered the arrival of those first boxes of Mother Jones had gone to sleep like Rip van Winkle and then woken up today, one thing would leave us amazed and cautiously heartened: the Internet’s capacity to bring dissenting points of view to millions of people all over the world — and to enable those people to communicate with one another. Mother Jones was part of this process early on, in 1993, when it became the first general-interest magazine to publish on the Web.

So what can a Rip van Winkle of today expect in Mother Jones on its 50th anniversary? Perhaps by then both paper and computers will have been replaced by something we cannot even imagine. But technology is not what matters. One thing is certain: The world of 2026 will not have seen the end of injustice, of discrimination, of poverty, and of political and social violence. It will still have brave, determined men and women everywhere who will be fighting to change all of that. And Mother Jones will be on their side.


U.S. Tax System History in the 19th Century

Taxes imposed to raise money for wars were later repealed and eventually replaced by the income tax.

Americans often resisted taxation during the 19th century except during wartime. Once the wars were over, they insisted that Congress repeal the Acts that gave the Federal Government the right to levy taxes.

War of 1812 Taxation

When Thomas Jefferson became President in 1801, all direct taxes were abolished, and during the following ten years no internal revenue taxes other than excises were in force. When money was needed for the War of 1812, Congress imposed additional excise taxes, raised some customs duties, and raised money by issuing Treasury notes. These taxes were repealed by Congress in 1817, and the Federal Government collected no internal taxes for the next 44 years. Revenue came instead from high customs duties and the sale of public lands.

Civil War Taxation

The Revenue Act of 1861 was passed by Congress to fund the Civil War. Excise taxes were restored, and a tax on personal incomes was imposed: incomes higher than $800 were taxed at 3 percent. Collections did not start until the following year. As the Civil War continued, it became obvious to Congress that the Union’s debt was growing at a rate of $2 million daily and additional revenue was needed. Congress passed a law on July 1, 1862, placing new excise taxes on items such as:

  • playing cards
  • gunpowder
  • feathers
  • telegrams
  • iron
  • leather
  • pianos
  • yachts
  • billiard tables
  • drugs
  • patent medicines
  • whiskey
  • legal documents
  • license fees collected for almost all professions and trades.

Important features of the 1862 law were:

  • A two-tiered rate structure.
  • Taxable incomes up to $10,000 were taxed at 3 percent.
  • Incomes over $10,000 were taxed at 5 percent.
  • A standard deduction of $600 and a variety of deductions were permitted. These included rental houses, repairs, losses and other taxes paid.
  • Taxes were withheld by employers to ensure payment.
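The two-tiered structure above can be sketched as a short calculation. This is a hypothetical illustration only: it applies the $600 standard deduction first and treats the 5 percent rate as applying marginally to the portion of taxable income above $10,000, which is an interpretive assumption, not a claim about the statute's exact mechanics.

```python
def tax_1862(income, other_deductions=0):
    """Rough sketch of the 1862 two-tier income tax.

    Assumptions (not from the statute text): the $600 standard
    deduction and any other deductions come off first, and the
    5 percent rate applies only to taxable income above $10,000.
    """
    taxable = max(0, income - 600 - other_deductions)
    if taxable <= 10_000:
        return 0.03 * taxable
    # 3 percent on the first $10,000, 5 percent on the remainder.
    return 0.03 * 10_000 + 0.05 * (taxable - 10_000)


# A $5,600 income leaves $5,000 taxable, owing roughly $150;
# a $20,600 income leaves $20,000 taxable, owing roughly $800.
print(tax_1862(5_600), tax_1862(20_600))
```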

After the end of the Civil War the need for Federal revenue declined, and most taxes imposed during the war were repealed. The main sources of revenue were taxes on liquor and tobacco. In 1872, the income tax was abolished.

Spanish American War Taxation

The flat-rate income tax imposed in 1894 was immediately challenged. According to the Constitution, Congress could impose direct taxes only if they were levied in proportion to each State’s population. In 1895 the U.S. Supreme Court ruled the flat tax unconstitutional because it was a direct tax not apportioned to the population of each state. The Federal Government then began to rely heavily on high tariffs.

In 1898, the War Revenue Act was passed to raise money for the Spanish-American War through the sale of bonds and taxes on recreational facilities used by workers; it also doubled the taxes on beer, tobacco, and chewing gum. The Act expired in 1902, leaving the Federal Government to look elsewhere for money to operate.


Terence Vincent Powderly


Terence Vincent Powderly rose from an impoverished childhood in Pennsylvania to become one of the most prominent labor leaders in late 19th-century America. Powderly became the head of the Knights of Labor in 1879, and in the 1880s he guided the union through a series of strikes.

His eventual move toward moderation distanced him from more radical union members, and Powderly's influence in the labor movement faded over time.

A complex individual, Powderly was involved in politics as well as labor activities and was elected mayor of Scranton, Pennsylvania, in the late 1870s. After moving on from an active role in the Knights of Labor, he became a political activist for the Republican Party in the 1890s.

Powderly studied law and was admitted to the bar in 1894. He eventually took positions within the federal government as a civil servant. He served in the McKinley administration in the late 1890s and left the government during the administration of President Theodore Roosevelt.

When Powderly died in 1924, The New York Times noted that he was not well-remembered at the time, yet had been very familiar to the public in the 1880s and 1890s.


