
MIDI and digital audio are the backbone of modern music production. They enable musicians to create, record, and manipulate sound in powerful ways. Understanding these technologies is crucial for anyone looking to produce music or work in audio engineering.

MIDI allows for flexible control of virtual instruments and hardware, while digital audio captures real-world sounds. Together, they form the foundation of digital music creation, offering endless possibilities for composition, recording, and sound design.

MIDI basics

  • MIDI (Musical Instrument Digital Interface) is a protocol that allows electronic musical instruments, computers, and other devices to communicate and synchronize with each other
  • MIDI does not transmit actual audio, but rather sends event messages that specify musical parameters such as pitch, velocity, and timing
  • MIDI is widely used in music production, live performances, and music education as it enables the control and manipulation of various musical elements

MIDI messages

  • MIDI messages are the individual instructions sent between MIDI devices to control various aspects of sound generation and performance
  • Note On/Off messages indicate when a note should start or stop playing and include information about pitch and velocity (how hard the note is played)
  • Control Change (CC) messages are used to adjust parameters such as volume, panning, modulation, and other effects in real-time
  • Program Change messages instruct a device to switch to a specific preset or patch, allowing for quick changes in sound
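The message types above share a common byte layout, which can be sketched in a few lines of Python (a minimal illustration of the MIDI 1.0 wire format, not tied to any particular library):

```python
# Each MIDI message starts with a status byte: high nibble = message type,
# low nibble = channel (0-15). Data bytes are limited to 7 bits (0-127).

def note_on(channel, pitch, velocity):
    """Note On: status 0x90, followed by pitch and velocity (0-127 each)."""
    return bytes([0x90 | (channel & 0x0F), pitch & 0x7F, velocity & 0x7F])

def note_off(channel, pitch, velocity=0):
    """Note Off: status 0x80."""
    return bytes([0x80 | (channel & 0x0F), pitch & 0x7F, velocity & 0x7F])

def control_change(channel, controller, value):
    """Control Change (CC): status 0xB0; e.g. controller 7 is channel volume."""
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

def program_change(channel, program):
    """Program Change: status 0xC0, selects a preset/patch (0-127)."""
    return bytes([0xC0 | (channel & 0x0F), program & 0x7F])

# Middle C (note 60) on channel 1 (index 0), played fairly hard:
msg = note_on(0, 60, 100)  # three bytes: 0x90, 0x3C, 0x64
```

Note how small each message is — three bytes or fewer — which is why MIDI data is so lightweight compared to audio.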

MIDI channels

  • MIDI supports 16 separate channels, allowing multiple devices to communicate independently within a single MIDI setup
  • Each MIDI channel can be assigned to a specific instrument or device, enabling individual control and layering of sounds
  • MIDI channels are essential for creating multi-timbral arrangements, where different parts or instruments can be played simultaneously using a single MIDI controller or sequencer
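A multi-timbral setup can be pictured as a simple channel-to-instrument mapping (the instrument names here are hypothetical; the channel-10 convention comes from the General MIDI standard):

```python
# Hypothetical channel assignments in a multi-timbral arrangement.
channel_map = {
    1: "piano",
    2: "bass",
    10: "drums",   # channel 10 is conventionally reserved for percussion
}

def route(channel, instrument_map):
    """Return which instrument responds to messages on this channel."""
    return instrument_map.get(channel, "unassigned")
```

Because each channel carries its own stream of messages, the piano part on channel 1 can play independently of the drums on channel 10 over a single MIDI connection.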

MIDI devices

  • MIDI controllers, such as keyboards, drum pads, and wind controllers, generate MIDI data when played and send it to other MIDI devices or software
  • MIDI sound modules and synthesizers receive MIDI data and convert it into audio, either through internal sound generation or by triggering external sound sources
  • MIDI interfaces connect MIDI devices to computers, allowing for the recording, editing, and playback of MIDI data within music software (digital audio workstations or DAWs)

MIDI sequencing

  • MIDI sequencing involves recording, editing, and arranging MIDI data to create complete musical compositions or performances
  • MIDI sequencers, which can be hardware devices or software applications (DAWs), allow users to capture and manipulate MIDI data in a timeline-based format
  • MIDI sequencing is a fundamental aspect of modern music production, enabling composers and producers to create complex, multi-layered arrangements with precise control over timing, pitch, and dynamics

MIDI tracks

  • In a MIDI sequencer, each MIDI channel or instrument is typically represented by a separate track
  • MIDI tracks contain the recorded MIDI events, such as note data, controller messages, and program changes, arranged along a timeline
  • By organizing MIDI data into separate tracks, users can easily edit, mute, solo, or apply effects to individual parts or instruments within a composition

MIDI editing

  • MIDI editing involves modifying the recorded MIDI data to refine the performance, correct errors, or create new musical ideas
  • Common MIDI editing tasks include:
    1. Quantization: Aligning MIDI notes to a specific grid or timing resolution to correct timing inconsistencies
    2. Velocity editing: Adjusting the velocity values of MIDI notes to control dynamics and expression
    3. Transposition: Changing the pitch of individual notes or entire passages to correct errors or create melodic variations
    4. Note length adjustment: Modifying the length of MIDI notes to adjust the timing or create staccato or legato articulations

MIDI quantization

  • Quantization is the process of aligning MIDI notes to a specific timing grid, such as 1/4 notes, 1/8 notes, or 1/16 notes
  • Quantizing MIDI data can help correct timing inconsistencies and create a more precise, rhythmically tight performance
  • Different quantization settings, such as grid resolution, swing, and strength, allow users to maintain a human feel while correcting timing issues
  • Advanced quantization features, like groove templates or adaptive quantization, can apply specific timing characteristics from one performance to another
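The grid, strength, and snapping behavior described above reduce to simple arithmetic. Here is a minimal sketch (assuming the common convention of 480 ticks per quarter note; real DAWs add swing and groove handling on top of this):

```python
def quantize(time_ticks, grid_ticks, strength=1.0):
    """Move a note's start time toward the nearest grid line.

    strength=1.0 snaps fully to the grid; lower values move the note
    only part of the way, preserving some of the human feel.
    """
    nearest = round(time_ticks / grid_ticks) * grid_ticks
    return time_ticks + (nearest - time_ticks) * strength

# At 480 ticks per quarter note, a 1/16-note grid is 120 ticks.
# A note played 10 ticks late:
quantize(130, 120)        # fully snapped back to tick 120
quantize(130, 120, 0.5)   # moved halfway, to tick 125
```

The strength parameter is what lets producers tighten a sloppy take without making it sound mechanical.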

MIDI controllers

  • MIDI controllers are hardware devices that generate MIDI data when played, allowing musicians to control various parameters of music software or hardware instruments
  • MIDI controllers come in various form factors and designs, each tailored to specific musical needs or preferences
  • Most MIDI controllers connect to computers or other MIDI devices via USB or traditional 5-pin DIN connectors

Keyboard controllers

  • Keyboard controllers are designed to resemble traditional piano keyboards and are the most common type of MIDI controller
  • They come in various sizes, ranging from compact 25-key models to full-size 88-key controllers with weighted keys that simulate the feel of an acoustic piano
  • Many keyboard controllers include additional features such as pitch and modulation wheels, assignable knobs and faders, and programmable pads for triggering samples or controlling effects

Drum pad controllers

  • Drum pad controllers, also known as MIDI drum pads or beat pads, are designed for programming and performing drum patterns, beats, and triggering samples
  • These controllers typically feature a grid of pressure-sensitive pads that can be assigned to different drum sounds or samples
  • Some drum pad controllers also include assignable knobs and faders for controlling parameters like volume, pitch, and effects
  • Popular examples of drum pad controllers include the Akai MPD series and the Native Instruments Maschine

Wind controllers

  • Wind controllers are designed to mimic the playing style and technique of traditional wind instruments, such as saxophones, clarinets, or flutes
  • These controllers typically feature a mouthpiece with breath sensors that detect air pressure and convert it into MIDI expression data, allowing for realistic control over dynamics and articulation
  • Wind controllers often include keys or buttons for selecting notes, as well as additional controls for pitch bending and vibrato
  • Examples of wind controllers include the Akai EWI (Electronic Wind Instrument) series and the Yamaha WX series

MIDI connectivity

  • MIDI connectivity refers to the various methods and technologies used to establish communication between MIDI devices and computers
  • Proper MIDI connectivity is essential for ensuring reliable data transfer and synchronization between devices in a MIDI setup
  • The most common MIDI connectivity options include MIDI interfaces, traditional MIDI cables, and MIDI over USB

MIDI interfaces

  • MIDI interfaces are hardware devices that connect MIDI controllers and other MIDI devices to computers
  • They convert MIDI data into a format that can be understood by the computer's USB or Firewire port, and vice versa
  • MIDI interfaces typically feature one or more MIDI inputs and outputs, allowing for the connection of multiple MIDI devices
  • Some audio interfaces also include built-in MIDI I/O, eliminating the need for a separate MIDI interface

MIDI cables

  • Traditional MIDI cables, also known as 5-pin DIN cables, are used to connect MIDI devices directly to each other
  • These cables transmit MIDI data serially, with each device in the chain passing the data along to the next device
  • MIDI cables have a maximum recommended length of 15 meters (50 feet) to ensure reliable data transmission
  • When using MIDI cables, it's important to connect the MIDI Out of one device to the MIDI In of the next device in the chain

MIDI over USB

  • Many modern MIDI controllers and devices feature USB connectivity, allowing them to connect directly to computers without the need for a separate MIDI interface
  • MIDI over USB provides a simple, plug-and-play solution for connecting MIDI devices to computers, as the computer's USB port handles the data conversion
  • USB-equipped MIDI devices can also draw power from the computer's USB port, eliminating the need for separate power adapters
  • When using multiple USB MIDI devices with a computer, a USB hub may be necessary to provide additional USB ports

Digital audio fundamentals

  • Digital audio refers to the representation of sound using binary numbers, which can be stored, processed, and reproduced using computers and digital devices
  • Understanding the fundamental concepts of digital audio is essential for working with audio in a music production or sound engineering context
  • Key concepts in digital audio include the differences between analog and digital audio, sample rate, and bit depth

Analog vs digital audio

  • Analog audio is a continuous representation of sound waves, where the audio signal varies continuously over time
  • In analog systems, sound is typically captured using microphones and stored on magnetic tape or vinyl records
  • Digital audio, on the other hand, represents sound using a series of discrete numerical values, sampled at regular intervals
  • Digital audio offers several advantages over analog, including improved noise reduction, easier editing and manipulation, and lossless duplication
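The "series of discrete numerical values, sampled at regular intervals" can be demonstrated directly — here an idealized analog-to-digital conversion of a sine wave, ignoring quantization noise:

```python
import math

def sample_sine(freq_hz, sample_rate, duration_s):
    """Convert a continuous sine wave into discrete sample values,
    as an idealized analog-to-digital converter would."""
    n_samples = int(sample_rate * duration_s)
    return [math.sin(2 * math.pi * freq_hz * n / sample_rate)
            for n in range(n_samples)]

# One millisecond of a 440 Hz tone at CD quality (44.1 kHz)
# yields 44 discrete values standing in for the continuous wave:
samples = sample_sine(440, 44100, 0.001)
```

Everything a digital system does to audio — storage, editing, effects — operates on lists of numbers like this one.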

Sample rate

  • Sample rate refers to the number of times per second that an analog audio signal is measured and converted into a digital value
  • The most common sample rates used in digital audio are 44.1 kHz (used for CDs) and 48 kHz (used for professional audio and video production)
  • Higher sample rates, such as 96 kHz or 192 kHz, can capture higher frequencies and provide more detailed audio representation, but also result in larger file sizes
  • The Nyquist-Shannon sampling theorem states that the sample rate must be at least twice the highest frequency in the audio signal to accurately represent it in the digital domain
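The Nyquist relationship is plain arithmetic, shown here as a small sketch:

```python
def min_sample_rate(highest_freq_hz):
    """Nyquist-Shannon: the sample rate must be at least twice the
    highest frequency present in the signal."""
    return 2 * highest_freq_hz

# Human hearing tops out around 20 kHz, which is why the CD standard
# of 44.1 kHz comfortably clears the 40 kHz minimum:
min_sample_rate(20000)  # -> 40000
```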

Bit depth

  • Bit depth refers to the number of bits used to represent each sample in a digital audio signal
  • Common bit depths include 16-bit (used for CDs), 24-bit (used in professional audio production), and 32-bit (used in high-end audio processing)
  • Higher bit depths allow for a greater dynamic range and more precise representation of the audio signal
  • For example, 16-bit audio provides a dynamic range of 96 dB, while 24-bit audio offers a dynamic range of 144 dB
  • Higher bit depths also result in larger file sizes, as more data is required to represent each sample
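The dynamic-range figures quoted above follow from the rule of thumb that each bit adds about 6.02 dB, which is just 20·log₁₀(2ⁿ):

```python
import math

def dynamic_range_db(bit_depth):
    """Theoretical dynamic range of linear PCM: each additional bit
    doubles the number of amplitude levels, adding about 6.02 dB."""
    return 20 * math.log10(2 ** bit_depth)

round(dynamic_range_db(16))  # -> 96  (CD audio)
round(dynamic_range_db(24))  # -> 144 (professional production)
```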

Digital audio formats

  • Digital audio formats are standardized ways of encoding and storing digital audio data in files
  • Different audio formats offer various levels of audio quality, compression, and compatibility with different playback devices and software
  • Understanding the characteristics and limitations of common digital audio formats is important for managing and exchanging audio files in music production and distribution

WAV files

  • WAV (Waveform Audio File Format) is an uncompressed, high-quality audio format commonly used in professional audio production
  • WAV files can store audio at various sample rates and bit depths, making them suitable for recording, editing, and mastering
  • WAV files are compatible with most audio software and hardware devices, making them a widely-used format for audio exchange and backup
  • However, the uncompressed nature of WAV files results in larger file sizes compared to compressed formats
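Because WAV is uncompressed PCM, even Python's standard-library `wave` module can write one — a minimal sketch producing one second of 16-bit, 44.1 kHz mono silence:

```python
import struct
import wave

# One second of silence: 44100 samples, each a 16-bit signed integer.
with wave.open("silence.wav", "wb") as wav:
    wav.setnchannels(1)        # mono
    wav.setsampwidth(2)        # 2 bytes per sample = 16-bit
    wav.setframerate(44100)    # CD-quality sample rate
    wav.writeframes(struct.pack("<44100h", *([0] * 44100)))

# Audio payload: 44100 samples * 2 bytes = 88,200 bytes for one second,
# which is why uncompressed WAV files grow large so quickly.
```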

AIFF files

  • AIFF (Audio Interchange File Format) is another uncompressed, high-quality audio format, primarily used on Apple Macintosh systems
  • Like WAV files, AIFF files can store audio at various sample rates and bit depths, providing high-quality audio representation
  • AIFF files are commonly used in professional audio production environments that rely on Apple hardware and software
  • AIFF files also have larger file sizes due to their uncompressed nature

MP3 files

  • MP3 (MPEG-1 Audio Layer 3) is a lossy compressed audio format that reduces file size by removing audio data that is considered less perceptible to the human ear
  • MP3 compression allows for much smaller file sizes compared to uncompressed formats like WAV or AIFF, making it popular for music distribution and streaming
  • MP3 files can be encoded at various bitrates, with higher bitrates resulting in better audio quality but larger file sizes
  • While MP3 files are widely compatible and convenient for distribution, the lossy compression can result in a loss of audio quality, particularly at lower bitrates
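The bitrate-to-size relationship is easy to estimate: bitrate is bits per second, so dividing by 8 gives bytes per second (this sketch ignores headers and metadata tags):

```python
def mp3_size_bytes(bitrate_kbps, duration_s):
    """Rough MP3 audio-data size: kilobits/s -> bytes/s, times duration."""
    return bitrate_kbps * 1000 // 8 * duration_s

# A 3-minute (180 s) song:
mp3_size_bytes(320, 180)   # -> 7,200,000 bytes (~7.2 MB)
mp3_size_bytes(128, 180)   # -> 2,880,000 bytes (~2.9 MB)

# Compare uncompressed 16-bit / 44.1 kHz stereo WAV for the same song:
wav_size = 180 * 44100 * 2 * 2   # -> 31,752,000 bytes (~31.8 MB)
```

The roughly 4x-to-11x size reduction is what made MP3 the dominant distribution format.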

Digital audio recording

  • Digital audio recording involves capturing sound using microphones or other audio sources and converting it into a digital format for storage, editing, and playback
  • Understanding the tools and techniques used in digital audio recording is crucial for achieving high-quality recordings in music production and sound engineering

Audio interfaces

  • Audio interfaces are hardware devices that connect microphones, instruments, and other audio sources to a computer for digital recording
  • They convert analog audio signals into digital data that can be processed by the computer's audio software (DAW)
  • Audio interfaces typically feature microphone preamps, line-level inputs, and outputs for connecting studio monitors and headphones
  • Important factors to consider when choosing an audio interface include the number and type of inputs and outputs, supported sample rates and bit depths, and compatibility with your computer and software

Microphone types

  • Microphones are essential tools for capturing sound in digital audio recording, and there are several types of microphones designed for different applications
  • Dynamic microphones are rugged, versatile, and well-suited for capturing loud sound sources like drums, guitar amplifiers, and live vocals
  • Condenser microphones are more sensitive and capture a wider frequency range, making them ideal for recording vocals, acoustic instruments, and ambient sounds
  • Ribbon microphones offer a smooth, warm sound and are often used for recording brass instruments, guitar amplifiers, and as room mics for capturing ambience

Recording techniques

  • Proper microphone placement is crucial for achieving a desired sound and minimizing unwanted noise or room reflections
  • The 3:1 rule suggests that when using multiple microphones, the distance between each microphone should be at least three times the distance from the microphone to the sound source to avoid phase cancellation
  • Close miking involves placing the microphone very near the sound source, resulting in a dry, direct sound with minimal room ambience
  • Stereo recording techniques, such as XY, ORTF, and spaced pair, involve using two microphones to capture a wider, more spacious sound image
  • Using pop filters and shock mounts can help reduce plosives and handling noise when recording vocals or other sensitive sources

Digital audio editing

  • Digital audio editing involves manipulating and refining recorded audio using software tools to achieve a desired sound or creative effect
  • Modern digital audio workstations (DAWs) offer a wide range of editing features that allow for precise, non-destructive editing of audio files
  • Understanding the fundamental concepts and techniques of digital audio editing is essential for shaping and polishing recordings in music production and post-production

Non-destructive editing

  • Non-destructive editing is a fundamental feature of modern DAWs that allows users to make changes to audio files without permanently altering the original data
  • When editing audio non-destructively, the DAW creates a set of instructions or "edits" that are applied to the original audio file in real-time during playback
  • This approach allows for unlimited undo/redo steps and the ability to revert to the original audio at any point in the editing process
  • Non-destructive editing provides flexibility and encourages experimentation, as users can freely try out different editing ideas without the risk of permanently damaging the original recordings
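The edit-list idea can be sketched concretely: the original samples are never touched, and the stored instructions are applied only when rendering for playback (sample values and the single "gain" edit type here are hypothetical simplifications):

```python
# Hypothetical audio samples and a DAW-style edit list.
original = [0.1, 0.5, -0.3, 0.8, -0.6]
edits = []

def add_gain_edit(start, end, gain):
    """Record an instruction; do NOT modify the audio itself."""
    edits.append(("gain", start, end, gain))

def render():
    """Apply the edit list to a copy; the original stays intact."""
    out = list(original)
    for kind, start, end, gain in edits:
        if kind == "gain":
            for i in range(start, end):
                out[i] *= gain
    return out

add_gain_edit(1, 3, 0.5)   # halve the volume of samples 1-2
rendered = render()         # the edit is heard on playback...
# ...but `original` is unchanged, so undo is just removing the instruction.
```

Undo/redo then becomes trivial list manipulation on `edits`, which is exactly why non-destructive workflows allow unlimited history.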

Audio regions

  • Audio regions are portions of an audio file that can be selected, moved, copied, or edited independently within a DAW
  • By dividing an audio file into regions, users can easily rearrange, duplicate, or process specific sections of a recording without affecting the entire file
  • Regions can be created manually by selecting a portion of the audio waveform, or automatically by using tools like the "detect transients" or "strip silence" functions in a DAW
  • Many DAWs also support non-destructive region-based processing, allowing users to apply effects or modifications to individual regions without altering the original audio data
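A "strip silence" function like the one mentioned above boils down to scanning for stretches where the signal exceeds a threshold — a minimal sketch (threshold value and sample data are illustrative):

```python
def detect_regions(samples, threshold=0.05):
    """Return (start, end) index pairs for stretches where the signal
    rises above the threshold -- a bare-bones 'strip silence'."""
    regions, start = [], None
    for i, s in enumerate(samples):
        loud = abs(s) > threshold
        if loud and start is None:
            start = i                      # region begins
        elif not loud and start is not None:
            regions.append((start, i))     # region ends
            start = None
    if start is not None:
        regions.append((start, len(samples)))
    return regions

# Two bursts of signal separated by near-silence:
detect_regions([0.0, 0.4, 0.6, 0.01, 0.0, 0.3, 0.0])  # -> [(1, 3), (5, 6)]
```

Real DAWs add hold times and fade handles around each detected region, but the core logic is this scan.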

Fades and crossfades

  • Fades are gradual increases or decreases in volume at the beginning (fade-in) or end (fade-out) of an audio region or file
  • Fades are used to create smooth transitions between sections of audio, eliminate clicks or pops, or to gradually introduce or remove a sound from the mix
  • Crossfades are a type of fade that involves overlapping two audio regions and gradually transitioning from one to the other
  • Crossfades are commonly used to create seamless transitions between different takes or sections of a recording, or to blend two different sounds together
  • Most DAWs offer various fade and crossfade types, such as linear, logarithmic, or S-curve, each with its own characteristics and applications
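The linear variants are the simplest to write down — each sample is scaled by a gain that ramps across the region:

```python
def fade_in(samples):
    """Linear fade-in: ramp gain from 0 to 1 across the region."""
    n = len(samples)
    return [s * i / (n - 1) for i, s in enumerate(samples)]

def crossfade(outgoing, incoming):
    """Linear crossfade: the outgoing region fades out while the
    incoming region fades in over the same overlap."""
    n = len(outgoing)
    return [outgoing[i] * (1 - i / (n - 1)) + incoming[i] * (i / (n - 1))
            for i in range(n)]

fade_in([1.0, 1.0, 1.0])                      # -> [0.0, 0.5, 1.0]
crossfade([1.0, 1.0, 1.0], [0.0, 0.0, 0.0])   # -> [1.0, 0.5, 0.0]
```

Logarithmic and S-curve fades replace the linear ramp `i / (n - 1)` with a curved gain function, which is often perceived as smoother because hearing is roughly logarithmic.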

MIDI vs digital audio

  • MIDI and digital audio are two fundamental technologies used in modern music production, each with its own strengths and limitations
  • Understanding the differences between MIDI and digital audio, as well as their respective advantages, is crucial for making informed decisions when creating and producing music

Advantages of MIDI

  • MIDI data is extremely lightweight compared to digital audio, as it only contains information about musical events and not the actual audio itself
  • This compact nature of MIDI allows for easy storage, manipulation, and transmission of complex musical arrangements
  • MIDI data can be easily edited, quantized, and rearranged without affecting the quality of the sound, as the actual audio is generated by the receiving device or software instrument
  • MIDI allows for flexible instrumentation, as the same MIDI data can be used to trigger different sounds or virtual instruments, making it easy to experiment with different timbres and arrangements
  • MIDI can be used to control and automate various parameters of software instruments and effects, such as pitch, volume, panning, and modulation
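The "edited without affecting the quality of the sound" point is easy to demonstrate: transposing MIDI is just adding to note numbers, with no audio processing involved:

```python
def transpose(notes, semitones):
    """Transpose MIDI note numbers. Unlike pitch-shifting audio, this
    loses no quality: only event data changes, and the receiving
    instrument simply plays different notes."""
    return [n + semitones for n in notes]

c_major = [60, 64, 67]     # C4, E4, G4
transpose(c_major, 2)       # -> [62, 66, 69], the same chord in D major
```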

Advantages of digital audio

  • Digital audio provides a direct, high-quality representation of the actual sound, capturing the nuances, dynamics, and timbral characteristics of the original performance
  • Recording audio allows for the capture of live performances, acoustic instruments, and real-world sounds that cannot be easily replicated using MIDI or virtual instruments
  • Digital audio can be processed and manipulated using a wide range of effects and tools, such as EQ, compression, reverb, and time-stretching, to enhance or transform the sound
  • Audio recordings maintain their quality and character regardless of the playback device or software, ensuring a consistent listening experience across different systems
  • Digital audio is the standard format for final music distribution and consumption, as it can be easily shared, streamed, or pressed onto physical media like CDs or vinyl records

Combining MIDI and audio

  • In modern music production, MIDI and digital audio are often used together to create rich, layered, and dynamic arrangements
  • MIDI can be used to create and control virtual instrument tracks, such as drums, bass, or synths, while audio tracks can be used for recorded vocals, guitars, or other live instruments
  • MIDI can also be used to trigger audio samples or loops, combining the flexibility of sequenced MIDI with the realism of recorded sound
© 2024 Fiveable Inc. All rights reserved.
AP® and SAT® are trademarks registered by the College Board, which is not affiliated with, and does not endorse this website.

