Audio & MIDI Protocols Checklist Flashcards
Name and identify key differences between a variety of cables and interconnections.
Single core = unbalanced signal. The core carries the signal (+ve); the shield is the return/earth (-ve).
One pair = balanced signal. It has two cores, one +ve (hot) and one -ve (cold); the shield is earthed.
XLR = balanced signal. Pin 1 is earth, pin 2 is +ve (hot), pin 3 is -ve (cold).
RCA = unbalanced signal. Carries analog audio or video. The centre pin is +ve; the outer ring is the -ve/shield.
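The noise-rejection idea behind balanced cables (one pair/XLR) can be sketched in a few lines of Python; the sample values are hypothetical and purely illustrative:

```python
# Sketch: why balanced cables reject interference.
# The same audio is sent in-phase (+ve/hot) and inverted (-ve/cold); noise
# induced along the run hits both cores equally, so subtracting cancels it.
signal = [0.5, -0.2, 0.8, 0.1]      # hypothetical audio samples
noise  = [0.3, 0.3, -0.1, 0.2]      # interference induced on BOTH cores

hot  = [s + n for s, n in zip(signal, noise)]    # +ve core: signal + noise
cold = [-s + n for s, n in zip(signal, noise)]   # -ve core: inverted signal + noise

recovered = [(h - c) / 2 for h, c in zip(hot, cold)]  # differential receiver
print(recovered)  # noise has cancelled; the original signal remains
```

An unbalanced (single-core) cable has no inverted copy to subtract, so any induced noise stays in the signal.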
Explain digital interconnection acronyms
S/PDIF (Sony/Philips Digital Interface) = consumer digital audio connection, carried over coaxial (RCA) or optical (TOSLINK) cable.
USB (Universal Serial Bus) = general-purpose data connection between a host computer and peripheral devices.
SATA (Serial ATA) = connects host bus adapters to storage devices (hard drives, SSDs, optical drives).
LAN (Local Area Network) = transmits data, including networked audio, between devices within a small area.
Suggest devices that use specific interconnections
S/PDIF = connect digital audio devices, such as CD players, audio interfaces, and sound cards
XLR Cable = used to connect microphones and audio interfaces to mixing consoles
RCA Cable = used to connect audio devices like turntables, and soundbars, for transmitting analog audio signals
USB = used to connect audio interfaces and MIDI controllers for recording and streaming
Thunderbolt = used to connect high-performance audio interfaces for high-speed/low-latency data transfer
Summarize advantages, disadvantages or issues that relate to specific interconnections.
USB ad = simple, easy to use, highly compatible. dis = signal degradation over long cable runs
Thunderbolt ad = high-speed/low-latency data transfer. dis = limited compatibility
S/PDIF ad = compatible with a wide range of digital audio equipment. dis = limited support for high-resolution audio formats
XLR Cable ad = low susceptibility to noise and interference. dis = bulkiness and inconvenience for portable use.
RCA Cable ad = compatible with a wide range of analog equipment. dis = susceptibility to noise and interference
Purposes for interconnections.
Audio signal transmission = mic to amp
Recording and playback = recorded/played back through DAW
Mixing and processing = compression, EQ etc.
Monitoring and feedback = audio fed back to performer
Connectivity/compatibility = multiple audio device compatibility
Explain compression types
ADC (audio data compression) = the amount of data in the audio wave is reduced for transmission/storage
ALC (automatic level control) = the dynamic range is reduced, similar to a limiter
Lossless = decreases size by looking for patterns/probabilities, with no loss of quality (FLAC).
Lossy = strips elements listeners are unlikely to notice, such as sound above ~16 kHz (MP3)
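Lossless data compression can be demonstrated with Python's standard zlib module standing in for FLAC's algorithm: repetitive data shrinks dramatically, and decompression restores every byte exactly.

```python
import zlib

# Patterned "audio-like" data compresses well because the algorithm
# finds and encodes the repetition instead of storing every byte.
repetitive = b"kick-snare-" * 500          # 5,500 bytes of pattern
packed = zlib.compress(repetitive)

print(len(repetitive), "->", len(packed))  # thousands of bytes -> a few dozen
assert zlib.decompress(packed) == repetitive  # lossless: bit-for-bit identical
```

Lossy codecs like MP3 achieve much bigger savings precisely because they do not promise this round trip: discarded detail is gone for good.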
Compare audio file formats – both from audio and theoretically.
WAV ad = lossless, uncompressed, high quality audio. dis = large file size
MP3 ad = lossy, compressed audio for streaming/portable players. dis = reduced audio quality
FLAC ad = lossless, compressed audio with no loss of audio quality. dis = large(ish) file size.
OGG ad = lossy, compressed audio for streaming/portable players. dis = limited support
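The WAV file-size disadvantage above is simple arithmetic: sample rate × bytes per sample × channels × duration. A minimal sketch (the function name is my own):

```python
def wav_size_bytes(sample_rate, bit_depth, channels, seconds):
    """Uncompressed PCM data size (ignores the ~44-byte WAV header)."""
    return sample_rate * (bit_depth // 8) * channels * seconds

# CD-quality stereo: 44.1 kHz, 16-bit, 2 channels, one minute.
print(wav_size_bytes(44_100, 16, 2, 60))  # → 10584000 bytes, ~10 MB per minute
```

The same minute as a 128 kbps MP3 is roughly 1 MB, which is why lossy formats dominate streaming and portable players.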
Explain mp3 and similar file format make up; including what is contained in header, tag and data sections.
header = contains info like format, bit rate etc. It also sets up seek points throughout the song, permitting things like fast forward/rewind.
tag = contains song info (artist, album, length etc.)
data = contains the compressed audio data itself (the encoded audio frames)
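As a concrete illustration of the tag section: MP3s commonly carry ID3v2 metadata, whose 10-byte header stores the tag size as four "syncsafe" bytes (7 useful bits each, so no byte ever looks like audio sync data). A sketch of decoding it, assuming a well-formed header:

```python
def id3v2_tag_size(header: bytes) -> int:
    """Decode the tag size from a 10-byte ID3v2 header:
    'ID3' + version (2 bytes) + flags (1 byte) + size (4 syncsafe bytes)."""
    assert header[:3] == b"ID3"
    b = header[6:10]
    # Each byte contributes 7 bits (its top bit is always 0).
    return (b[0] << 21) | (b[1] << 14) | (b[2] << 7) | b[3]

# Size bytes 0x00 0x00 0x02 0x01 → (2 << 7) | 1 = 257 bytes of tag data.
print(id3v2_tag_size(b"ID3\x04\x00\x00\x00\x00\x02\x01"))  # → 257
```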
Outline and exemplify the differences between Analogue and Digital.
signal = Analog signals are continuous and can vary in infinitely many ways; digital signals are discrete and represent the analog signal in a finite number of steps
quality = Analog = subject to noise/degradation over time. Digital = resists noise and does not degrade when copied
storage = Analog = physical storage (tape, vinyl). Digital = convenient (online, CD, hard drive)
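The "finite number of steps" point can be shown by quantizing continuous values to a given bit depth. A minimal sketch (no dithering or real encoding, and the function name is my own):

```python
def quantize(x, bits):
    """Snap a continuous value in [-1, 1] to one of 2**bits discrete levels."""
    levels = 2 ** bits
    code = round((x + 1) / 2 * (levels - 1))  # integer code, 0 .. levels-1
    return code / (levels - 1) * 2 - 1        # map the code back to [-1, 1]

# An analog signal can take infinitely many values; 3-bit digital allows only 8.
values = {quantize(i / 1000, 3) for i in range(-1000, 1001)}
print(sorted(values))  # 8 evenly spaced steps between -1 and 1
```

Real converters use 16 or 24 bits, so the steps are far too fine to hear, but the principle is identical.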
Explain concepts that relate to sampling
Sample rate = number of samples per second used to represent the analog signal. [Ranges from 44.1 kHz (CD) to 192 kHz (high-res)]
Bit depth = the number of bits used to represent each sample. [Determines the dynamic range/number of amplitude levels.]
Aliasing = occurs when the sample rate is not high enough (less than twice the highest frequency present) to accurately represent the analog signal. [Aliasing introduces distortion/false lower frequencies.]
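Aliasing can be demonstrated numerically: a tone above half the sample rate produces exactly the same samples as a lower "folded" tone. A sketch with a hypothetical 1 kHz sample rate:

```python
import math

fs = 1_000            # sample rate (Hz); Nyquist limit = fs / 2 = 500 Hz
f_in = 900            # a tone well ABOVE the Nyquist limit
f_alias = fs - f_in   # it folds back to 100 Hz

high = [math.sin(2 * math.pi * f_in * n / fs) for n in range(16)]
low  = [math.sin(2 * math.pi * f_alias * n / fs) for n in range(16)]

# The sampled 900 Hz tone is indistinguishable from a phase-inverted 100 Hz tone:
print(all(abs(h + l) < 1e-9 for h, l in zip(high, low)))  # → True
```

This is why converters place a low-pass (anti-aliasing) filter before sampling: frequencies above fs/2 must be removed, not just ignored.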
Summarize the origins of Midi
Developed in the early 1980s as a way for electronic musical instruments to communicate with each other and with computers. Before MIDI, each manufacturer had its own proprietary system, making it difficult for musicians to use instruments from different manufacturers together. MIDI enabled the development of DAWs, MIDI sequencers, and other software/hardware tools for music production and performance.
Interpret Binary messages
((Read handout))
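One way to practise interpreting binary MIDI messages: in a status byte, the top four bits give the message type and the bottom four give the channel. A small sketch (the function name is my own; the nibble values come from the MIDI spec):

```python
def describe_status(byte):
    """Split a MIDI status byte: top nibble = message type, bottom = channel."""
    kinds = {0x8: "Note Off", 0x9: "Note On", 0xB: "Control Change",
             0xC: "Program Change"}
    kind = kinds.get(byte >> 4, "other")
    channel = (byte & 0x0F) + 1   # MIDI channels are numbered 1-16
    return kind, channel

print(describe_status(0x90))  # 0b1001_0000 → ('Note On', 1)
print(describe_status(0xB3))  # 0b1011_0011 → ('Control Change', 4)
```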
Connect and label up a typical midi and audio compatible workstation.
?
Explain and exemplify types of midi messages
Control Change messages (CC): used to send continuous controller values, such as volume, modulation, or sustain, to a MIDI device (pitch bend has its own dedicated message type). [They consist of three bytes: the status byte, the controller number, and the controller value. A CC message to set the volume of a synthesizer to maximum might look like this: 0xB0 0x07 0x7F.]
Program Change messages: used to change the sound or patch on a MIDI device. [consist of two bytes = status and the program number. a Program Change message to switch to the third preset on a synthesizer might look like this: 0xC0 0x02.]
System Exclusive messages (SysEx): used to send specific information or data from one midi device to another, such as firmware updates, sequences or samples. [begin with the status byte 0xF0 and end with the status byte 0xF7.]
Note On/Off messages: Note On messages are used to trigger a note on a MIDI device, while Note Off messages are used to release it. [consist of three bytes = status, note number, velocity. A Note On message to play a C4 note at maximum velocity might look like this: 0x90 0x3C 0x7F.]
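The byte layouts above can be generated programmatically, which also shows how data values relate to the 0-127 range. A sketch (helper names are my own):

```python
def note_on(channel, note, velocity):
    """Build a 3-byte Note On message (channel 1-16, note/velocity 0-127)."""
    return bytes([0x90 | (channel - 1), note & 0x7F, velocity & 0x7F])

def control_change(channel, controller, value):
    """Build a 3-byte Control Change message (e.g. controller 7 = volume)."""
    return bytes([0xB0 | (channel - 1), controller & 0x7F, value & 0x7F])

print(note_on(1, 0x3C, 127).hex())      # → '903c7f' (C4 at max velocity)
print(control_change(1, 7, 127).hex())  # → 'b0077f' (CC7 volume = max)
```

Note that data bytes are masked to 7 bits: only status bytes have the top bit set, which is how a receiver tells the two apart mid-stream.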
Relate Midi values to a variety of Midi messages
?