The BaseAudioContext interface of the Web Audio API acts as a base
definition for online and offline audio-processing graphs, as
represented by `audio.AudioContext` and `web.OfflineAudioContext`
respectively.
(audio-worklet this)
Property.
The audioWorklet read-only property of the `web.BaseAudioContext`
interface returns an instance of `audio.AudioWorklet` that can be
used for adding `audio.AudioWorkletProcessor`-derived classes which
implement custom audio processing.
baseAudioContextInstance.audioWorklet;
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/audioWorklet
(create-analyser this)
Method.
The createAnalyser() method of the `web.BaseAudioContext` interface
creates an `web.AnalyserNode`, which can be used to expose audio time
and frequency data and create data visualisations.
var analyserNode = baseAudioContext.createAnalyser();
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createAnalyser
(create-biquad-filter this)
Method.
A `web.BiquadFilterNode`.
baseAudioContext.createBiquadFilter();
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createBiquadFilter
(create-buffer this num-ofchannels length sample-rate)
Method.
An `audio.AudioBuffer` configured based on the specified options.
var buffer = baseAudioContext.createBuffer(numOfchannels, length, sampleRate);
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createBuffer
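Note that the `length` argument is a frame count, not a duration in seconds. A minimal sketch of the conversion (the 44100 Hz rate, 2-second duration, and channel count are arbitrary example values; real code would read `baseAudioContext.sampleRate`):

```javascript
// createBuffer takes a frame count, so convert a duration in seconds
// to frames using the context's sample rate.
const sampleRate = 44100;            // example rate, in Hz
const seconds = 2;                   // example duration
const numOfChannels = 2;             // stereo
const length = sampleRate * seconds; // 88200 frames
// In a browser:
// var buffer = baseAudioContext.createBuffer(numOfChannels, length, sampleRate);
```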
(create-buffer-source this)
Method.
An `audio.AudioBufferSourceNode`.
var source = baseAudioContext.createBufferSource();
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createBufferSource
(create-channel-merger this number-of-inputs)
Method.
A `web.ChannelMergerNode`.
baseAudioContext.createChannelMerger(numberOfInputs);
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createChannelMerger
(create-channel-splitter this number-of-outputs)
Method.
A `web.ChannelSplitterNode`.
baseAudioContext.createChannelSplitter(numberOfOutputs);
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createChannelSplitter
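Conceptually, a splitter separates the channels of its input. A sketch of the same reshaping on a plain interleaved sample array (not using the Web Audio API; the sample values are arbitrary):

```javascript
// Deinterleave [L0, R0, L1, R1, ...] into per-channel arrays,
// which is what a ChannelSplitterNode does inside the audio graph.
function deinterleave(samples, numberOfChannels) {
  const channels = Array.from({ length: numberOfChannels }, () => []);
  samples.forEach((s, i) => channels[i % numberOfChannels].push(s));
  return channels;
}

const [left, right] = deinterleave([0.1, 0.9, 0.2, 0.8], 2);
// left -> [0.1, 0.2], right -> [0.9, 0.8]
// In a browser: baseAudioContext.createChannelSplitter(2);
```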
(create-constant-source this)
Method.
The createConstantSource() method of the `web.BaseAudioContext`
interface creates a `web.ConstantSourceNode` object, which is an
audio source that continuously outputs a monaural (one-channel) sound
signal whose samples all have the same value.
var constantSourceNode = AudioContext.createConstantSource()
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createConstantSource
(create-convolver this)
Method.
A `web.ConvolverNode`.
baseAudioContext.createConvolver();
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createConvolver
(create-delay this max-delay-time)
Method.
The createDelay() method of the `web.BaseAudioContext` interface is
used to create a `web.DelayNode`, which is used to delay the incoming
audio signal by a certain amount of time.
var delayNode = audioCtx.createDelay(maxDelayTime);
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createDelay
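The delay time is given in seconds, but internally it corresponds to a whole number of sample frames. A sketch of a fixed delay line on plain arrays (the 3-frame delay and input values are arbitrary examples):

```javascript
// Delay a signal by delaySamples frames using a ring buffer,
// the classic structure behind a DelayNode.
function delayLine(input, delaySamples) {
  const buf = new Array(delaySamples).fill(0);
  let w = 0;
  return input.map((x) => {
    const y = buf[w]; // sample written delaySamples frames ago
    buf[w] = x;
    w = (w + 1) % delaySamples;
    return y;
  });
}

delayLine([1, 2, 3, 4, 5], 3); // -> [0, 0, 0, 1, 2]
```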
(create-dynamics-compressor this)
Method.
Compression lowers the volume of the loudest parts of the signal and raises the volume of the softest parts. Overall, a louder, richer, fuller sound can be achieved. It is especially important in games and musical applications where large numbers of individual sounds are played simultaneously, and where you want to control the overall level and help avoid clipping (distorting) of the audio output.
baseAudioCtx.createDynamicsCompressor();
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createDynamicsCompressor
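The "lower the loudest parts" behaviour can be illustrated by the static gain curve a simple downward compressor applies above its threshold; this is a hedged sketch of the general idea, not the node's exact algorithm (the threshold and ratio values, in decibels, are arbitrary examples):

```javascript
// Above the threshold, output level rises only 1 dB for every
// `ratio` dB of input level; below it the signal passes unchanged.
function compressDb(inputDb, thresholdDb, ratio) {
  if (inputDb <= thresholdDb) return inputDb;
  return thresholdDb + (inputDb - thresholdDb) / ratio;
}

compressDb(-10, -24, 4); // -24 + 14/4 = -20.5 dB
compressDb(-30, -24, 4); // -30 dB, unchanged (below threshold)
```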
(create-gain this)
Method.
A `web.GainNode` which takes as input one or more audio sources and
outputs audio whose volume has been adjusted in gain (volume) to a
level specified by the node's `web.GainNode.gain` a-rate parameter.
var gainNode = AudioContext.createGain();
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createGain
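The gain parameter is a linear multiplier, not a decibel value. A sketch of the standard conversion (audio math, not part of the API itself):

```javascript
// Convert a level in decibels to the linear factor a GainNode expects.
const dbToGain = (db) => Math.pow(10, db / 20);

dbToGain(0);   // 1   (unchanged)
dbToGain(-20); // 0.1 (one tenth of the amplitude)
// In a browser: gainNode.gain.value = dbToGain(-20);
```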
(create-iir-filter this feedforward feedback)
Method.
The createIIRFilter() method of the `web.BaseAudioContext` interface
creates an `web.IIRFilterNode`, which represents a general infinite
impulse response (IIR) filter which can be configured to serve as
various types of filter.
var iirFilter = AudioContext.createIIRFilter(feedforward, feedback);
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createIIRFilter
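The coefficient arrays define a difference equation. A sketch applying such coefficients directly, assuming the first feedback coefficient is normalized to 1; the one-pole lowpass (a = 0.9) is an arbitrary example, not a recommended design:

```javascript
// y[n] = ff[0]*x[n] + ff[1]*x[n-1] + ... - fb[1]*y[n-1] - fb[2]*y[n-2] - ...
// (assumes fb[0] === 1, i.e. normalized feedback coefficients)
function iir(input, feedforward, feedback) {
  const x = [], y = [];
  for (const sample of input) {
    x.unshift(sample); // x[0] is the current input sample
    let acc = 0;
    feedforward.forEach((b, i) => { acc += b * (x[i] ?? 0); });
    feedback.slice(1).forEach((a, i) => { acc -= a * (y[i] ?? 0); });
    y.unshift(acc);
  }
  return y.reverse();
}

// One-pole lowpass: y[n] = 0.1*x[n] + 0.9*y[n-1], smoothing a step input.
iir([1, 1, 1], [0.1], [1, -0.9]); // -> [0.1, 0.19, 0.271]
```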
(create-oscillator this)
Method.
The createOscillator() method of the `web.BaseAudioContext` interface
creates an `web.OscillatorNode`, a source representing a periodic
waveform. It basically generates a constant tone.
var oscillatorNode = audioCtx.createOscillator();
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createOscillator
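For illustration, the samples of the default sine waveform can be computed directly (440 Hz and 44100 Hz are arbitrary example values; the node produces this stream for you):

```javascript
// Sample n of a sine tone with frequency f (Hz) at sample rate sr.
const sineSample = (n, f, sr) => Math.sin(2 * Math.PI * f * n / sr);

sineSample(0, 440, 44100); // 0: the waveform starts at a zero crossing
// In a browser: var oscillatorNode = audioCtx.createOscillator();
```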
(create-panner this)
Method.
The panner node is spatialized in relation to the AudioContext's
`audio.AudioListener` (defined by the `audio.AudioContext.listener`
attribute), which represents the position and orientation of the
person listening to the audio.
baseAudioCtx.createPanner();
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createPanner
(create-periodic-wave this & args)
Method.
The createPeriodicWave() method of the `web.BaseAudioContext`
interface is used to create a `web.PeriodicWave`, which is used to
define a periodic waveform that can be used to shape the output of an
`web.OscillatorNode`.
var wave = AudioContext.createPeriodicWave(real, imag[, constraints]);
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createPeriodicWave
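The real/imag arrays are Fourier coefficients (cosine and sine terms). A sketch building the first harmonics of a square wave using the classic 4/(πk) series for odd k; the harmonic count of 8 is an arbitrary example:

```javascript
// Square wave: only odd harmonics, amplitude 4/(π·k), all sine (imag) terms.
const n = 8;
const real = new Float32Array(n); // cosine terms: all zero
const imag = new Float32Array(n); // sine terms
for (let k = 1; k < n; k++) {
  if (k % 2 === 1) imag[k] = 4 / (Math.PI * k);
}
// In a browser: var wave = audioCtx.createPeriodicWave(real, imag);
```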
(create-script-processor this
buffer-size
number-of-input-channels
number-of-output-channels)
Method.
A `web.ScriptProcessorNode`.
var scriptProcessor = audioCtx.createScriptProcessor(bufferSize, numberOfInputChannels, numberOfOutputChannels);
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createScriptProcessor
(create-stereo-panner this)
Method.
A `web.StereoPannerNode`.
baseAudioContext.createStereoPanner();
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createStereoPanner
(create-wave-shaper this)
Method.
A `web.WaveShaperNode`.
baseAudioCtx.createWaveShaper();
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createWaveShaper
(current-time this)
Property.
The currentTime read-only property of the `web.BaseAudioContext`
interface returns a double representing an ever-increasing hardware
timestamp in seconds that can be used for scheduling audio playback,
visualizing timelines, etc. It starts at 0.
var curTime = baseAudioContext.currentTime;
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/currentTime
(decode-audio-data this & args)
Method.
This is the preferred method of creating an audio source for the Web Audio API from an audio track. This method only works on complete data, not fragments of audio file data.
`Older callback syntax:
baseAudioContext.decodeAudioData(ArrayBuffer, successCallback, errorCallback);
Newer promise-based syntax:
Promise<decodedData> baseAudioContext.decodeAudioData(ArrayBuffer);`
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/decodeAudioData
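The two syntaxes can be bridged generically. A hedged sketch of wrapping any callback-style decoder in a promise, exercised here with a stand-in decoder rather than a real audio context (`fakeDecode` is purely illustrative):

```javascript
// Wrap a (data, successCallback, errorCallback) API in a promise,
// the same shape as the older decodeAudioData signature.
function promisify(decode) {
  return (data) =>
    new Promise((resolve, reject) => decode(data, resolve, reject));
}

// Stand-in decoder for illustration; a browser would use
// baseAudioContext.decodeAudioData.bind(baseAudioContext) instead.
const fakeDecode = (data, ok) => ok({ length: data.byteLength });

promisify(fakeDecode)(new ArrayBuffer(16)).then((decoded) => {
  console.log(decoded.length); // 16
});
```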
(destination this)
Property.
An `audio.AudioDestinationNode`.
baseAudioContext.destination;
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/destination
(listener this)
Property.
An `audio.AudioListener` object.
baseAudioContext.listener;
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/listener
(onstatechange this)
Property.
The following snippet is taken from our AudioContext states demo (see
it running live). The onstatechange handler is used to log the
current `web.state` to the console every time it changes.
baseAudioContext.onstatechange = function() { ... };
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/onstatechange
(sample-rate this)
Property.
The sampleRate property of the `web.BaseAudioContext` interface
returns a floating point number representing the sample rate, in
samples per second, used by all nodes in this audio context.
baseAudioContext.sampleRate;
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/sampleRate
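Timing in the graph is expressed in seconds while buffers are indexed in frames, so conversions go through this rate. A sketch (48000 Hz is an arbitrary example; real code would read `baseAudioContext.sampleRate`):

```javascript
const sampleRate = 48000; // example rate, in samples per second
const secondsToFrames = (s) => Math.round(s * sampleRate);
const framesToSeconds = (f) => f / sampleRate;

secondsToFrames(0.25);  // 12000 frames
framesToSeconds(24000); // 0.5 seconds
```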
(set-audio-worklet! this val)
Property.
The audioWorklet read-only property of the `web.BaseAudioContext`
interface returns an instance of `audio.AudioWorklet` that can be
used for adding `audio.AudioWorkletProcessor`-derived classes which
implement custom audio processing.
baseAudioContextInstance.audioWorklet;
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/audioWorklet
(set-current-time! this val)
Property.
The currentTime read-only property of the `web.BaseAudioContext`
interface returns a double representing an ever-increasing hardware
timestamp in seconds that can be used for scheduling audio playback,
visualizing timelines, etc. It starts at 0.
var curTime = baseAudioContext.currentTime;
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/currentTime
(set-destination! this val)
Property.
An `audio.AudioDestinationNode`.
baseAudioContext.destination;
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/destination
(set-listener! this val)
Property.
An `audio.AudioListener` object.
baseAudioContext.listener;
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/listener
(set-onstatechange! this val)
Property.
The following snippet is taken from our AudioContext states demo (see
it running live). The onstatechange handler is used to log the
current `web.state` to the console every time it changes.
baseAudioContext.onstatechange = function() { ... };
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/onstatechange
(set-sample-rate! this val)
Property.
The sampleRate property of the `web.BaseAudioContext` interface
returns a floating point number representing the sample rate, in
samples per second, used by all nodes in this audio context.
baseAudioContext.sampleRate;
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/sampleRate
(set-state! this val)
Property.
A `dom.DOMString`. Possible values are: suspended, running, and closed.
baseAudioContext.state;
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/state
(state this)
Property.
A `dom.DOMString`. Possible values are: suspended, running, and closed.
baseAudioContext.state;
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/state
cljdoc is a website building & hosting documentation for Clojure/Script libraries