msp programming and digital audio

• msp is a set of digital audio extensions for Max. These extensions are a port of work done by Miller Puckette at IRCAM and UCSD to the Opcode Macintosh Max environment, programmed and marketed by David Zicarelli, the programmer of Opcode’s version of Max. The name msp has a couple of connotations: one is Max Signal Processing; another is that msp are the initials of Miller Smith Puckette.

• Figure 1 shows a simple msp patch. All msp objects end with a twiddle (~) after the name of the object. The twiddle, which looks vaguely like a sine wave, indicates that information coming into and/or out of the object is happening at the audio rate. Information in Max is usually sent whenever the user does something — plays in some MIDI, clicks with the mouse, etc. The fastest rate at which events can be scheduled to occur in Max is 1000 times per second. In audio, however, samples must be produced at the sampling rate for the sound to continue, and all msp objects update their outputs at the sampling rate (typically 44100 samples per second).


Figure 1: 440-Hz oscillator

• The dac~ object at the bottom of Figure 1 is the digital-to-analog conversion object of msp. The two inlets at the top correspond to the left and right outputs from the sound system attached to the computer. This can be simply the stereo output from the computer itself or the outputs of a sound card (such as Digidesign gear) installed on the machine. The patch as a whole takes the audio output of a cycle~ object (a simple table-lookup oscillator), reduces the amplitude by multiplication, and sends it to the dac. The startwindow and stop messages to the dac turn on audio (for this window’s patch only) and turn it off, respectively.

• A sine wave is an example of simple harmonic motion. The wave completes one cycle of a simple back-and-forth motion at a constant rate. Because each cycle is completed in a constant amount of time, the motion of the wave is periodic. The number of cycles completed per second is the frequency of the wave, and the inverse of the frequency is its period. A wave that completes its cycle 100 times per second, then, has a frequency of 100 cycles per second (cps), also known as hertz (Hz), and a period of 1/100 second, or 10 milliseconds.
• Sampling Theorem: To represent digitally a signal containing frequency components up to X Hz, it is necessary to use a sampling rate of at least 2X samples per second. If a signal has frequency components above one-half the sampling rate, these will be misrepresented in what is termed foldover, or aliasing. The frequency that is one-half the sampling rate is called the Nyquist frequency. Each frequency “has an alias equally far from the Nyquist frequency but on the other side of it. . . For this reason the Nyquist frequency is often called the folding frequency because we can think of frequencies above Nyquist as being folded down below Nyquist” [Steiglitz p. 47]. Simplifying this a bit, we can say that when an original frequency higher than one-half the sampling rate is sampled, it will produce a new frequency that is equal to the sampling frequency minus the original frequency.
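The foldover arithmetic is easy to check directly. A minimal Python sketch (the function name alias is ours for illustration, not an msp object):

```python
def alias(f, sr=44100):
    """Apparent frequency after sampling a sine of frequency f at rate sr."""
    f = f % sr                           # sampling cannot distinguish f from f + k*sr
    return f if f <= sr / 2 else sr - f  # fold down across the Nyquist frequency

print(alias(10000))   # below Nyquist: unchanged, 10000
print(alias(30000))   # folded: 44100 - 30000 = 14100
```

This reproduces the simplified rule above: a 30000 Hz component sampled at 44100 Hz reappears at 14100 Hz.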

• Samples in msp are interpreted as floating-point values within the range –1.0 to +1.0. Samples conforming to that range occupy the full dynamic range when sent to the dac~ object. Therefore, to attenuate the volume of a signal, the signal should be made to occupy a smaller range of values. In Figure 1 and Figure 2 below, the attenuation is performed by multiplying the output of cycle~ by a fractional value. Multiplication by a fraction is equivalent to division (another way to reduce the range of a signal) but is more economical to perform. Adding two signals together is equivalent to mixing them. Whenever more than one signal is mixed (added) together, take care that the combined output of the mixed sources does not exceed the range –1.0 to +1.0, because outside of that range the signal will clip.


Figure 2: Attenuation in msp
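The attenuation and mixing described above can be tried as plain sample arithmetic. A Python sketch (the 440 Hz test signal and all names are illustrative, not part of msp):

```python
import math

sr = 44100                      # sampling rate
n = sr // 100                   # 10 ms worth of samples

# A full-scale 440 Hz sine, samples in the range -1.0 to +1.0.
sine = [math.sin(2 * math.pi * 440 * i / sr) for i in range(n)]

# Attenuation: multiply by a fraction (cheaper than dividing).
quiet = [s * 0.5 for s in sine]

# Mixing: add signals sample by sample. Two half-amplitude copies
# sum to at most 1.0, so this particular mix cannot clip.
mix = [a + b for a, b in zip(quiet, quiet)]
assert all(-1.0 <= s <= 1.0 for s in mix)
```

With more sources, the attenuation factors must shrink accordingly so the sum stays inside –1.0 to +1.0.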

• To operate on signals digitally, we must discretize the waveform in two dimensions: in time (sampling) and in amplitude (quantizing). There are three steps in the conversion of an analog signal into a digital signal, the process called analog-to-digital conversion (ADC):

1) FILTER: A low-pass filter removes any frequency components of the signal exceeding one-half of the sampling rate.
2) MEASURE: A measurement is taken of the instantaneous amplitude of the signal at equally spaced intervals of time.
3) QUANTIZE: A quantizer assigns a precise numeric value to the measurement made in the previous step.
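Steps 2 and 3 (measure and quantize) can be sketched in Python; the function below is illustrative only, and it assumes the filtering of step 1 has already been done:

```python
import math

def adc(signal, sr, bits, duration):
    """Sample a continuous signal (a function of time in seconds) at equally
    spaced intervals, quantizing each measurement to a signed `bits`-bit value."""
    levels = 2 ** (bits - 1) - 1           # e.g. 32767 for 16 bits
    samples = []
    for i in range(int(sr * duration)):
        x = signal(i / sr)                 # MEASURE the instantaneous amplitude
        samples.append(round(x * levels))  # QUANTIZE to the nearest level
    return samples

# One period of a 100 Hz sine, sampled at 8 kHz with 16 bits:
codes = adc(lambda t: math.sin(2 * math.pi * 100 * t), 8000, 16, 0.01)
```

The rounding in the quantize step is where quantization error enters the representation.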

• The inverse process, which changes a digital representation back to an analog one, is called digital-to-analog conversion (DAC). In a DAC, voltage generators proportional to 2^k volts are switched on when the corresponding bit k of the incoming digital representation is on. The steps of the DAC process are as follows:

1) TO VOLTAGE: The digital signal is converted to a time-varying voltage proportional to the sequence of numbers at the input.
2) TRANSIENT REMOVAL: “Glitches” introduced by step one are eliminated by ignoring fast transients.
3) FILTER: A low-pass filter set to half the sampling rate smoothes out the resultant analog signal.
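Step 1, the binary-weighted voltage generators, amounts to the following arithmetic (a Python sketch; the 8-bit width and 5-volt reference are illustrative assumptions, not properties of any particular converter):

```python
def dac_voltage(code, bits=8, vref=5.0):
    """Sum a 2**k-weighted generator for each bit k of `code` that is on,
    scaled so that the full-scale code produces vref volts."""
    v = sum(2 ** k for k in range(bits) if code & (1 << k))
    return vref * v / (2 ** bits - 1)

print(dac_voltage(0b11111111))   # full scale: 5.0 volts
print(dac_voltage(0b00000000))   # silence: 0.0 volts
```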


Figure 3: DSP status window

• The DSP status window shows information about the configuration of digital audio on the Mac and the load a running msp program places on the central processing unit (CPU). Figure 3 shows the DSP status window during a typical execution of the simple oscillator patch of Figure 1. Notice that running just one oscillator uses almost 9% of the processing power of a Macintosh 8500/150 PowerPC. The power of current CPUs to deliver digital audio directly is revolutionary, but real applications quickly place a premium on processing speed.

• The cycle~ object is a table-lookup oscillator that uses a stored table of 512 samples. You can input your own sample tables or use the default sine wave. Cycle~ continuously outputs samples from the table at a frequency that corresponds to its argument (as in Figure 1) or to a value input to the left inlet.
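The table-lookup idea behind cycle~ can be sketched in Python (a simplification: details of the real object, such as interpolated lookup, are ignored, and the names are ours):

```python
import math

TABLE_SIZE = 512
table = [math.sin(2 * math.pi * i / TABLE_SIZE) for i in range(TABLE_SIZE)]

def oscillator(freq, sr=44100, n=100):
    """Read the stored table repeatedly, advancing the phase by an
    increment proportional to the requested frequency."""
    phase, out = 0.0, []
    inc = freq * TABLE_SIZE / sr                    # table positions per sample
    for _ in range(n):
        out.append(table[int(phase) % TABLE_SIZE])  # truncating lookup
        phase += inc
    return out

samples = oscillator(440)                           # 440 Hz, as in Figure 1
```

Replacing the sine table with any other 512-sample waveform changes the timbre without changing the lookup mechanism.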


Figure 4: cycle~ with variable frequency

• Figure 4 shows a patch with variable control over the oscillator frequency. The object line~ works like the object line, but at audio rates. The messages coming into line~ in Figure 4 are therefore changed into audio-rate frequency ramps for the cycle~ object. Changing the interpolation time sent to line~ (100 ms in Figure 4) changes the speed with which the patch makes a portamento from one frequency to another.
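What line~ computes is audio-rate linear interpolation. A Python sketch of the ramp (the function name and defaults are ours, not msp's):

```python
def line(start, target, ramp_ms, sr=44100):
    """Ramp linearly from start to target over ramp_ms milliseconds,
    producing one value per audio sample."""
    n = max(1, int(sr * ramp_ms / 1000))
    step = (target - start) / n
    return [start + step * (i + 1) for i in range(n)]

# Portamento from 440 Hz to 880 Hz over 100 ms, as in Figure 4:
ramp = line(440.0, 880.0, 100)      # 4410 audio-rate frequency values
```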

• Note that the input to line~ is an ordinary Max message box. Some msp objects (such as line~) can take non-signal inputs and interpret them as controls over processes at the audio rate. The msp object sig~ explicitly upsamples a Max value to an audio signal; the object snapshot~ downsamples audio outputs to the Max rate (a maximum of 1000 values per second).

• The average amplitude of a waveform is usually measured by the root-mean-square (rms) method. This works as it sounds: instantaneous measurements of amplitude are squared, summed, and averaged. The square root of the resulting number is the rms amplitude of the waveform.
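The rms computation reads directly as code. A Python sketch:

```python
import math

def rms(samples):
    """Square each sample, average the squares, take the square root."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# A full-scale sine has an rms amplitude of 1/sqrt(2):
one_cycle = [math.sin(2 * math.pi * i / 512) for i in range(512)]
print(round(rms(one_cycle), 4))     # 0.7071
```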

• Noise is any unwanted signal added to the desired representation. Noise generally stays at a roughly constant level (think of “hum”) and can be thought of as a lower limit on the range of useful signals. A commonly used measure of the presence of noise in a system is the signal-to-noise ratio (SNR), “which is usually defined as the ratio between the amplitudes of the largest useful signal and the amplitude of the inherent noise in a system.” Both amplitudes are expressed as rms values, and the SNR is expressed in decibels.
SNR (in dB) = 20 log10 (A_signal / A_noise)

• msp is very useful for experimenting with digital audio processing because so many DSP algorithms can be implemented quite directly using the objects msp provides. Rather than simply memorizing a formula such as the one above, you can use msp patches to try it out in practice. The patch shown in Figure 5 changes an amplitude value (varying between 0.0 and 1.0) to a value in decibels, using the formula shown above.


Figure 5: Amplitude to dB conversion
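The conversion the Figure 5 patch performs, as plain arithmetic (a Python sketch, not the patch itself):

```python
import math

def amp_to_db(a):
    """Convert an amplitude between 0.0 and 1.0 to decibels."""
    return 20 * math.log10(a)

print(amp_to_db(1.0))               # 0.0 dB (full scale)
print(round(amp_to_db(0.5), 2))     # -6.02 dB (half amplitude)
```

Halving the amplitude always costs the same 6 dB, which is why amplitude scaling is usually discussed in decibels.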

• MSP Exercise: Bring a Max/MSP patch to class February 26. Use MSP to implement one of the following digital signal processing techniques:
1) a low-pass filter
2) a reverb unit
3) a mixer
4) a flanger
NB: Don’t just copy the tutorial examples; do something in your patch that differs from the manual.
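For reference, the difference equation behind the simplest of these techniques, a one-pole low-pass filter, is y[n] = a*x[n] + (1 - a)*y[n-1]. A Python sketch (not an msp patch; the coefficient a is a hypothetical smoothing factor between 0 and 1):

```python
def lowpass(samples, a=0.1):
    """One-pole low-pass: y[n] = a*x[n] + (1 - a)*y[n-1].
    Smaller a means a lower cutoff (more smoothing)."""
    y, out = 0.0, []
    for x in samples:
        y = a * x + (1 - a) * y
        out.append(y)
    return out

print(lowpass([1.0, 1.0, 1.0, 1.0], a=0.5))    # a constant input passes: approaches 1.0
print(lowpass([1.0, -1.0, 1.0, -1.0], a=0.5))  # fast alternation is attenuated
```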

Roads, C. (1996) The Computer Music Tutorial. Cambridge, MA: The MIT Press.

Steiglitz, K. (1996) A Digital Signal Processing Primer. New York: Addison-Wesley Publishing Company, Inc.

With Pro Tools 6.0 for OS X on the way, a large batch of new plug-ins is expected....

In 6.0 the existing plug-ins cannot be used;
you have to use newly updated versions.

First, from Eventide: the Omnipressor, Instant Phaser, H910,
Instant Flanger, and H949.

From Sound Toys, spun off from Wave Mechanics,
plug-ins with striking names.

From Universal Audio, the LA2A, 1176, Pultec, and DreamVerb for Pro Tools. It will be interesting to hear how they sound compared with BombFactory...

From Bombfactory, the LA3A and the Essential series...

From Sony, EQ, Maximizer, and Dynamics plug-ins.

And products to come from PSP Audioware and Sonic TimeWorks,
who have mainly made VST and DirectX plug-ins.

From AudioEase, the sampling reverb Altiverb.

A few new products from Serato as well....

Beyond these, a good number of synthesizers and loop editors are also expected.

Calibrating a Pro Tools Audio Interface

1. Launch the Pro Tools application.
2. File > New Session to create a new session.
3. Choose Setups > Preference > Operations.
4. Enter the calibration reference level. The AES standard is -18 dB.
5. Click Done.
6. File > New Track to create a new track.
7. Put the Signal Generator plug-in on an insert of the track.
8. Set the signal generator's output level.
   This value must match the calibration reference level.
9. Set the signal generator's frequency to 1000 Hz.
10. Set the signal generator's waveform to Sine.
11. Set the track's output to Bus 1.
12. Create as many mono aux tracks as the audio interface has outputs.
    Route each aux's output to its own audio interface output.
13. Set the aux tracks' inputs to Bus 1.
14. Create as many mono aux tracks as the audio interface has inputs.
    Set each aux track's input to its own audio interface input,
    then send its output to an unused bus.
    This prevents feedback; buses 31 and 32 work well.
15. Connect the audio interface's outputs to the VU meter's inputs.
    With a single VU meter, you can calibrate one output at a time.
16. Shift+Option-click a fader to set all of Pro Tools' faders to 0 dB.
17. With a Phillips screwdriver, turn the output trim pot of the interface
    output connected to the VU meter until the meter reads 0 VU.

The trim pots are on the rear panel of the 192 I/O and on the front panel
of the 888; the 96 I/O has none.

18. Connect output 1 to input 1, output 2 to input 2,
    and so on, cabling each output to the matching input.
19. Back in Pro Tools, choose Operation > Calibration
    to enter calibration mode.

The names of tracks that have not yet been calibrated will blink.
In the track volume indicators you will see the signal arriving
from the calibrated outputs.

20. Turn each input trim pot with a Phillips screwdriver until the level
    matches the reference level. Once it matches, the track's name
    stops blinking.
21. Calibration is complete.
Click Operation > Calibration Mode again to exit calibration mode.