Semiconductors play a huge role in our everyday lives – without many of us realizing it. They’re used in a vast range of electronic devices, including smartphones, cameras, washing machines, fridges and computers. As well as these personal devices, semiconductors can be found throughout society – helping to operate public transport, cash machines and more.
Semiconductors are everywhere and have shaped the way we live and work. So, what exactly are they, and how did they come to be? Here, we explore the history of semiconductors in technology and what a semiconductor is.
What exactly is a semiconductor?
Semiconductors are solid materials with a level of electrical conductivity between that of conductors (like most metals) and insulators (like rubber). Crucially, that conductivity can be controlled, which makes semiconductors ideal for use in electronic circuits: it varies with temperature, and it can be deliberately altered by introducing impurities into the material – a process called doping. You can find out more about semiconductors and browse product ranges on sites like RS Components.
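To make the temperature effect concrete, here is a minimal Python sketch using the textbook approximation that the intrinsic carrier concentration of a semiconductor scales as T^(3/2)·exp(−Eg/2kT). The band gap is silicon’s well-known value; the prefactor is an illustrative round number chosen so the result at room temperature lands near the commonly quoted ~10^10 carriers per cm³, not a precise material parameter.

```python
import math

# Boltzmann constant in eV/K and silicon's band gap in eV.
K_BOLTZMANN_EV = 8.617e-5
E_GAP_SI = 1.12
# Illustrative prefactor (cm^-3 K^-1.5), tuned so n_i(300 K) ~ 1e10 cm^-3.
C_SI = 5.2e15

def intrinsic_carriers(temp_k: float) -> float:
    """Approximate free-carrier concentration (per cm^3) at temp_k kelvin."""
    return C_SI * temp_k ** 1.5 * math.exp(-E_GAP_SI / (2 * K_BOLTZMANN_EV * temp_k))

# Conductivity rises steeply with temperature, unlike in metals:
for t in (250, 300, 350):
    print(f"{t} K: {intrinsic_carriers(t):.2e} carriers per cm^3")
```

The exponential term dominates: a 50 K rise multiplies the carrier count (and hence the conductivity) by orders of magnitude, which is why semiconductor behaviour is so sensitive to temperature.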
When were semiconductors first invented?
The first documented observation of semiconductor behaviour came in 1833, when British physicist Michael Faraday found that silver sulphide’s electrical conductivity increased as its temperature rose – the opposite of what happens in metals. Following Faraday’s discovery, Karl Ferdinand Braun observed the first semiconductor diode effect in 1874 at Würzburg University in Germany. He discovered that electrical current flows in only one direction when a metal wire is brought into contact with lead sulphide.
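Braun’s one-way conduction is exactly what modern diodes exploit. A minimal sketch of the effect, using the textbook ideal-diode (Shockley) equation with illustrative constants – a typical saturation current and the room-temperature thermal voltage:

```python
import math

I_SAT = 1e-12        # saturation current in amperes (illustrative)
V_THERMAL = 0.02585  # thermal voltage at ~300 K, in volts

def diode_current(volts: float) -> float:
    """Current through an ideal diode at the given bias voltage."""
    # expm1(x) computes exp(x) - 1 accurately for small x.
    return I_SAT * math.expm1(volts / V_THERMAL)

forward = diode_current(0.6)    # forward bias: milliamps flow
reverse = diode_current(-0.6)   # reverse bias: only picoamp-scale leakage
print(forward, reverse)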
These monumental discoveries led to the first patented semiconductor device in the early 1900s: the galena detector, invented after Jagadish Chandra Bose discovered that crystals of semiconductor materials such as lead sulphide (galena) have varying conductivity, making them useful for detecting radio signals.
After this, according to the Nanotec Museum, 1947 saw the construction of the first ever transistor by John Bardeen and Walter Brattain at Bell Labs. Their research continued, and “integrated circuits (ICs) were invented that packed large numbers of transistors into a small chip, followed by large-scale integrated circuits (LSIs).”
From the mid-20th century, semiconductors were used in an ever-widening range of technologies. Transistors replaced bulky vacuum tubes in radios, making them smaller and more efficient; single-chip microprocessors were born; and personal computers were introduced to the world. This was the beginning of the evolution of semiconductors into the role they play in society today.