How three decades have changed the study of sound
Human beings talk. There has never been a time since the beginning of humanity when we, as a species, have stopped talking. At any moment of any day, there are human beings talking, even if no one is listening. In linguistics, the study of the sounds we use to incessantly chatter away falls to a dual discipline: Phonetics and Phonology.
Overly simplified, Phonetics studies how we produce the sounds we use to talk to each other, while Phonology studies how we hear, and possibly understand, the sounds of speech. Together, they are considered among the fundamental sub-disciplines of our study of language, Linguistics.
If we look at the descriptions of these sub-disciplines, we might conclude that nothing much has changed in how they are taught and learnt over the last three decades. That would be a gross misconception. One thing that has not changed, however, is the notion held by many students that this is a killer subject (the twin sub-disciplines are generally offered as a single subject).
When I first encountered phonetics and phonology, we had books that described these sounds, their production, and their perception. Accompanying the books were cassette tapes, which the resource centre copied for us from the originals. Of course, we had to provide our own blank tapes.
To familiarise ourselves with the individual sounds, we would play – stop – play the cassette repeatedly. This also meant that we often went through several tapes over the term. We had terms back then, not semesters.
Students today have interactive International Phonetic Alphabet (IPA) charts online. Just click on a symbol and its sound will play. You can do this all day long, or for as long as you need to learn the sound and symbol well.
You can try your hand at it too, at https://www.ipachart.com/ and https://www.internationalphoneticalphabet.org/ipa-sounds/ipa-chart-with-sounds/#ipachartstart
We had to transcribe words ourselves and then verify whether we had phonetically spelt them right by referring to phonetic dictionaries. It was tedious, but it was the only way we could do it. Today, online dictionary entries come with IPA transcriptions.
Moreover, they provide the transcriptions in both British English and American English. Better still, click the corresponding buttons and the website plays the word for you. Have a go at https://www.dictionary.com
Transcribing words into the IPA used to be a head-scratcher. Today, students can have it done for them at websites like https://tophonetics.com
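At heart, such transcription sites work by looking each word up in a machine-readable pronouncing dictionary. Here is a minimal Python sketch of that idea; the tiny dictionary and the fallback convention for unknown words are my own illustrative assumptions, not how tophonetics.com is actually built.

```python
# Toy word-to-IPA transcriber: dictionary lookup per word, with a
# visible fallback for words the dictionary does not contain.
# The mini-dictionary below is illustrative only.
IPA_DICT = {
    "the": "ðə",
    "cat": "kæt",
    "sat": "sæt",
    "on": "ɒn",
    "mat": "mæt",
}

def transcribe(sentence: str) -> str:
    """Return a broad IPA transcription; unknown words stay in /slashes/."""
    result = []
    for word in sentence.lower().split():
        word = word.strip(".,!?;:")          # drop trailing punctuation
        result.append(IPA_DICT.get(word, f"/{word}/"))
    return " ".join(result)

print(transcribe("The cat sat on the mat."))
# prints: ðə kæt sæt ɒn ðə mæt
```

A real system would swap the toy dictionary for a full pronouncing lexicon (tens of thousands of entries) and add rules for words that fall outside it, but the lookup principle is the same.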
Artificial Intelligence systems can even take notes for you by transcribing speech into text. Some are listed at https://otter.ai/blog/5-best-automatic-transcription-tools
Personally, I am waiting for someone to introduce a system that transcribes speech directly into the IPA. The technology for this already exists; it is what lets Siri and Alexa respond to your input. What is lacking is my knowledge of how to build it. If you have that knowledge, please do create the system, and I would be more than happy to help you test it.
In theory, today’s phonetics and phonology learners can learn to use the IPA in a fraction of the time it took us several decades ago. You just need the will to do it and a good Internet connection.