Determining Percentage Mass of Copper in a Penny Using Spectrophotometry

Many substances absorb light at characteristic wavelengths, and this property can be used to estimate how much of a substance is present. According to Beer's Law, the absorbance of a solution is directly proportional to the concentration of the absorbing species. This principle is used across a wide range of fields: in the food industry, for example, to measure the amount of glucose or protein in food products; in the pharmaceutical industry, to evaluate the concentration of drug components during production or the amount of a drug delivered to cells after treatment; and in analytical work, to determine the amounts of certain elements in soils or ores.
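As a rough sketch of how Beer's Law is applied, the relationship A = εbc can be rearranged to estimate concentration from a measured absorbance. The function below is illustrative only; the absorbance, molar absorptivity, and path length values are placeholders and are not taken from this experiment.

```python
# Beer's Law: A = epsilon * b * c, rearranged to solve for concentration.
# All numeric values below are illustrative placeholders, not data from this lab.

def concentration_from_absorbance(absorbance, molar_absorptivity, path_length_cm=1.0):
    """Return molar concentration c = A / (epsilon * b)."""
    return absorbance / (molar_absorptivity * path_length_cm)

# Hypothetical example: a copper-ammonia complex solution measured in a 1 cm cuvette.
c = concentration_from_absorbance(absorbance=0.42, molar_absorptivity=50.0)
print(f"Estimated concentration: {c:.4f} M")
```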

In this assay, the measured copper content of my penny was 0.93%, whereas the accepted value for a penny is 2.5%. This corresponds to a percent error of about 63%, which is quite high. The main sources of error are the wearing away of the penny's outer copper layer during circulation, the presence of zinc and nitric acid in the test solution, and the variation in the amount of NH3 used. Minor sources may include errors made while preparing and measuring the solutions and the quality of the glassware used. The largest contribution, I would assume, is the loss of copper during the penny's use in cash transactions. The second is that zinc and nitric acid cause variations in the color of the copper solution, which leads to errors in light absorption. The third is that more than 3 mL of NH3 was used per 20+ mL of final volume, compared to 2 mL for the standard solution, which makes the color of the solution slightly different. I would not blame the glassware for these errors; rather, I would attribute the minor errors to my own handling.
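For reference, a short sketch of the percent error calculation using the two values reported above, assuming the standard definition |measured − accepted| / accepted × 100:

```python
# Percent error for the measured copper mass fraction, using the values reported above.
measured_percent_cu = 0.93   # % copper found in this assay
accepted_percent_cu = 2.5    # % copper stated as the standard value for the penny

percent_error = abs(measured_percent_cu - accepted_percent_cu) / accepted_percent_cu * 100
print(f"Percent error: {percent_error:.1f}%")   # about 63%
```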
