Conservation of Cast & Wrought Iron


1 A Brief History

Iron, the fourth most abundant element in the earth's crust, occurs widely in ores with a metal content of up to 60%. Iron has a strong affinity for oxygen, but fortunately carbon's attraction for oxygen is stronger still. At high temperatures carbon will therefore combine with the oxygen in the ore, leaving the metal behind, though much heat energy is required.

Fe2O3 + 3C → 2Fe + 3CO
Iron Ore + Carbon → Iron + Carbon Monoxide
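
To put rough numbers on this reaction, the following short Python sketch works through the stoichiometry using standard atomic masses. It is an illustrative calculation only: in practice a furnace burns far more carbon as fuel to supply the heat the reaction needs.

    # A rough mass-balance sketch for Fe2O3 + 3C -> 2Fe + 3CO, using
    # standard atomic masses (g/mol). Illustrative figures only.
    FE, O, C = 55.85, 16.00, 12.01

    hematite = 2 * FE + 3 * O          # molar mass of Fe2O3, about 159.7 g/mol
    iron_fraction = (2 * FE) / hematite
    print(f"Iron content of pure Fe2O3: {iron_fraction:.1%}")      # ~69.9%

    # Minimum (stoichiometric) carbon consumed per tonne of iron produced
    carbon_per_tonne_iron = (3 * C) / (2 * FE)
    print(f"Carbon per tonne of iron: {carbon_per_tonne_iron:.2f} t")  # ~0.32 t

Pure hematite is thus about 70% iron by mass, consistent with real (impure) ores containing up to 60% metal.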

Smelting of iron first developed around 2000 BC, at the start of the Iron Age. It was discovered that by heating iron ore in the charred embers of a fire blown by bellows, the ore could be reduced to a spongy metallic bloom, which was then hammered to consolidate and purify it. Further reheating and hammering refined this into usable worked, or wrought, iron. The furnace, a bloomery, was a small bowl-shaped hole in the ground, lined with clay and blown by manually powered bellows, achieving temperatures around 700 degrees C.

Wrought iron is too soft to hold a good cutting edge, but around 1400 BC it was discovered that reheating blades in carbon (charcoal) produced a harder, tougher surface that could be sharpened. The carbon combined with the iron at the implement's surface, forming a carbon-rich iron alloy, in effect steel. This steel surface could be heated and quenched in water to produce a hard edge.
Wrought iron production in bloomeries was small-scale and expensive, so in pre-industrial times it was used where its strength, hardness and malleability were essential, e.g. in weapons, tools, security applications (locks, window bars), wearing parts (hinges, bearings, bell hangers and clappers, parts of machines such as pumps and wind/watermills), fastenings (nails, rivets, collars, cramps) and ornamentation. Because of its value it was also used as currency and jewellery; currency bars were bent round at their ends to prove that they did not crack, demonstrating their quality as usable, and therefore valuable, iron.
Iron-making remained a rural craft until the development of the blast furnace, probably in the Liege area of Belgium during the 14th century. Fed by an air blast from water-powered bellows, the blast furnace could reach temperatures up to 1150 degrees C, sufficient to melt the iron, which was cast from the furnace into sand moulds to form finished products, or into blocks (called "pigs") for conversion to wrought iron.
At these higher temperatures about 4.25% carbon combined with the iron, making it brittle. Much of it was therefore refined (in a "finery") to produce the purer, softer, forgeable wrought iron, which was considered far more useful than brittle cast iron.
Blast furnaces increased the availability of cast and wrought iron, but depended on charcoal as fuel. Shortages of timber and competition from other users made charcoal increasingly scarce in the seventeenth century. Coal could not be used, due to the deleterious effect of its impurities on the iron. Then, in 1709, Abraham Darby used coke (purified coal) in his blast furnace at Coalbrookdale, Shropshire. Coke was found to support a larger charge of iron ore and limestone than charcoal could, and allowed the blast air to pass more freely, so blast furnaces could be made bigger and more efficient.
However, charcoal was still needed in the fineries to convert pig iron to wrought iron, and shortages continued. (Coal could not be used there either, as its sulphur content made the iron brittle at high temperatures.)
In 1784 Henry Cort developed a furnace at Funtley, Hampshire, in which coal was burned separately from the pig iron, its heat being reflected, or reverberated, off the roof. The charge was stirred (puddled) until almost all the carbon had been burned out by combining with oxygen, then pulled from the furnace, hammered and rolled. After further heating, hammering and rolling, it was finally rolled into a wide range of finished sections in grooved rolls, also developed by Cort.
John Wilkinson developed a steam-powered furnace blower in 1776, and in 1794 a cupola furnace to re-melt pig iron with coke in foundries away from the blast-furnace site.
Thus, in the 18th century iron manufacture developed from a charcoal-dependent woodland craft into a coal-based industry. Freed from charcoal shortages, and fuelled by the growing demands of the Industrial Revolution, the production of both cast and wrought iron grew dramatically.

NB: More information on iron is available in the Commercial Topic

GROWTH IN IRON PRODUCTION (cast and wrought iron, approximate tonnages)

Year   Output (tons per year)   Fuel
1720   35,000                   99% charcoal, 1% coke
1796   250,000                  6% charcoal, 94% coke
The nineteenth century saw many further developments and improvements, including the heavy steam hammer invented by James Nasmyth in 1839. In 1856 Henry Bessemer developed a method of blowing air through, rather than over, molten pig iron in a tilting converter to oxidise away its carbon. This cut the conversion time from the hours required by Cort's puddling process to minutes, and produced steel, which was stronger than wrought iron. The process was further improved by the Siemens-Martin open-hearth process in the 1860s, and the cost of steel plummeted.
The decline of wrought iron was then inevitable, and by 1900 its usage was small relative to that of steel (see the table below). The last puddling furnace in the UK, Thomas Walmsley's Atlas Forge in Bolton, Lancashire, finally closed in 1973. Its furnace, shingling hammer and rolling mill are now preserved at the Ironbridge Gorge Museum in Shropshire, an area associated with some of the world's most important developments in the manufacture and use of iron.
DECLINE IN WROUGHT IRON PRODUCTION (tons per annum)

Year   Wrought iron   Steel
1870   3,000,000      250,000
1900   250,000        5,000,000
©2007 University of the West of England, Bristol
except where acknowledged