The BaseAudioContext interface of the Web Audio API acts as a base
definition for online and offline audio-processing graphs, as represented
by `web.audio.AudioContext` and `web.audio.OfflineAudioContext` respectively.
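As a rough sketch, an online graph built through these wrappers might look like the following. The `web.audio` namespace and `audio` alias are assumptions based on the signatures documented below; the `js/AudioContext.` constructor is standard browser interop.

```clojure
(ns example.tone
  ;; The exact namespace/alias is an assumption; adjust to this library's layout.
  (:require [web.audio :as audio]))

(defn play-tone
  "Build a minimal graph: oscillator -> gain -> speakers."
  []
  (let [ctx  (js/AudioContext.)               ; online audio context
        osc  (audio/create-oscillator ctx)    ; `this` is the context instance
        gain (audio/create-gain ctx)]
    (set! (.. gain -gain -value) 0.5)         ; halve the volume (a-rate param)
    (.connect osc gain)                       ; wire via the underlying JS API
    (.connect gain (audio/destination ctx))   ; final output node
    (.start osc)))
```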
(audio-worklet this)
Property.
[Read Only] [Experimental]
The audioWorklet read-only property of the `web.audio.BaseAudioContext`
returns an instance of `web.audio.AudioWorklet` that can be used for
adding `web.audio.AudioWorkletProcessor`-derived classes which implement
custom audio processing.
baseAudioContextInstance.audioWorklet;
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/audioWorklet
(create-analyser this)
Method.
The createAnalyser() method of the `web.audio.BaseAudioContext`
creates an `web.audio.AnalyserNode`, which can be used to expose
audio time and frequency data and create data visualisations.
var analyserNode = baseAudioContext.createAnalyser();
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createAnalyser
(create-biquad-filter this)
Method.
Creates a `web.audio.BiquadFilterNode`, which represents a second-order
filter configurable as several common filter types.
baseAudioContext.createBiquadFilter();
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createBiquadFilter
(create-buffer this num-ofchannels length sample-rate)
Method.
Creates a new, empty `web.audio.AudioBuffer` configured with the specified
number of channels, length (in sample-frames), and sample rate.
var buffer = baseAudioContext.createBuffer(numOfchannels, length, sampleRate);
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createBuffer
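A sketch of sizing a buffer: `length` is measured in sample-frames, so a buffer of n seconds needs n × sampleRate frames. The `audio` alias for this library is an assumption.

```clojure
;; Hypothetical usage; the `audio` alias for this library is an assumption.
(let [ctx     (js/AudioContext.)
      rate    (audio/sample-rate ctx)            ; e.g. 44100 or 48000
      seconds 2
      ;; 2 channels (stereo), length in sample-frames, at the context's rate
      buffer  (audio/create-buffer ctx 2 (* seconds rate) rate)]
  buffer)
```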
(create-buffer-source this)
Method.
Creates an `web.audio.AudioBufferSourceNode`, which can be used to play
audio data contained within an `web.audio.AudioBuffer` object.
var source = baseAudioContext.createBufferSource();
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createBufferSource
(create-channel-merger this number-of-inputs)
Method.
Creates a `web.audio.ChannelMergerNode`, which combines channels from
multiple audio streams into a single audio stream.
baseAudioContext.createChannelMerger(numberOfInputs);
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createChannelMerger
(create-channel-splitter this number-of-outputs)
Method.
Creates a `web.audio.ChannelSplitterNode`, which is used to access the
individual channels of an audio stream and process them separately.
baseAudioContext.createChannelSplitter(numberOfOutputs);
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createChannelSplitter
(create-constant-source this)
Method.
The createConstantSource() method of the `web.audio.BaseAudioContext`
creates a `web.audio.ConstantSourceNode` object, which is an audio
source that continuously outputs a monaural (one-channel) sound signal
whose samples all have the same value.
var constantSourceNode = AudioContext.createConstantSource()
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createConstantSource
(create-convolver this)
Method.
Creates a `web.audio.ConvolverNode`, which can be used to apply reverb
effects to your audio.
baseAudioContext.createConvolver();
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createConvolver
(create-delay this max-delay-time)
Method.
The createDelay() method of the `web.audio.BaseAudioContext`
is used to create a `web.audio.DelayNode`, which is used to delay the
incoming audio signal by a certain amount of time.
var delayNode = audioCtx.createDelay(maxDelayTime);
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createDelay
(create-dynamics-compressor this)
Method.
Compression lowers the volume of the loudest parts of the signal and raises the volume of the softest parts. Overall, a louder, richer, fuller sound can be achieved. It is especially important in games and musical applications where large numbers of individual sounds are played simultaneously, and where you want to control the overall level and help avoid clipping (distorting) of the audio output.
baseAudioCtx.createDynamicsCompressor();
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createDynamicsCompressor
(create-gain this)
Method.
Creates a `web.audio.GainNode`, which takes as input one or more audio
sources and outputs audio whose volume has been adjusted in gain (volume)
to a level specified by the node's `GainNode.gain` a-rate parameter.
var gainNode = AudioContext.createGain();
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createGain
(create-iir-filter this feedforward feedback)
Method.
The createIIRFilter() method of the `web.audio.BaseAudioContext`
creates an `web.audio.IIRFilterNode`, which represents a general
infinite impulse response (IIR) filter which can be configured to serve
as various types of filter.
var iirFilter = AudioContext.createIIRFilter(feedforward, feedback);
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createIIRFilter
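For illustration, a one-pole lowpass y[n] = 0.1·x[n] + 0.9·y[n-1] rearranges to feedforward (numerator) coefficients `[0.1]` and feedback (denominator) coefficients `[1, -0.9]`. The `audio` alias for this library is an assumption.

```clojure
;; Sketch only; the `audio` alias is an assumption.
(let [ctx (js/AudioContext.)
      ;; feedforward and feedback coefficient arrays;
      ;; the first feedback coefficient must be non-zero.
      iir (audio/create-iir-filter ctx #js [0.1] #js [1 -0.9])]
  iir)
```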
(create-oscillator this)
Method.
The createOscillator() method of the `web.audio.BaseAudioContext`
creates an `web.audio.OscillatorNode`, a source representing a
periodic waveform. It basically generates a constant tone.
var oscillatorNode = audioCtx.createOscillator();
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createOscillator
(create-panner this)
Method.
The panner node is spatialized in relation to the AudioContext's
AudioListener (defined by the `AudioContext.listener` attribute),
which represents the position and orientation of the person listening
to the audio.
baseAudioCtx.createPanner();
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createPanner
(create-periodic-wave this & args)
Method.
The createPeriodicWave() method of the `web.audio.BaseAudioContext`
is used to create a `web.audio.PeriodicWave`, which is used to define
a periodic waveform that can be used to shape the output of an
`web.audio.OscillatorNode`.
var wave = AudioContext.createPeriodicWave(real, imag[, constraints]);
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createPeriodicWave
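A sketch of defining a wave containing only the fundamental sine partial: `real` holds the cosine terms and `imag` the sine terms, with index 0 being the DC offset (ignored). The `audio` alias for this library is an assumption.

```clojure
;; Sketch only; the `audio` alias is an assumption.
(let [ctx  (js/AudioContext.)
      real (js/Float32Array. #js [0 0])       ; cosine terms
      imag (js/Float32Array. #js [0 1])       ; sine terms: fundamental only
      wave (audio/create-periodic-wave ctx real imag)
      osc  (audio/create-oscillator ctx)]
  (.setPeriodicWave osc wave)                 ; shape the oscillator's output
  (.start osc))
```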
(create-script-processor this
buffer-size
number-of-input-channels
number-of-output-channels)
Method.
[Deprecated]
Creates a `web.deprecated.ScriptProcessorNode`, which can be used for
direct audio processing.
var scriptProcessor = audioCtx.createScriptProcessor(bufferSize, numberOfInputChannels, numberOfOutputChannels);
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createScriptProcessor
(create-stereo-panner this)
Method.
Creates a `web.audio.StereoPannerNode`, which can be used to apply
stereo panning to an audio source.
baseAudioContext.createStereoPanner();
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createStereoPanner
(create-wave-shaper this)
Method.
Creates a `web.audio.WaveShaperNode`, which is used to implement
non-linear distortion effects.
baseAudioCtx.createWaveShaper();
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/createWaveShaper
(current-time this)
Property.
[Read Only]
The currentTime read-only property of the `web.audio.BaseAudioContext`
returns a double representing an ever-increasing hardware timestamp in
seconds that can be used for scheduling audio playback, visualizing
timelines, etc. It starts at 0.
var curTime = baseAudioContext.currentTime;
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/currentTime
(decode-audio-data this & args)
Method.
This is the preferred method of creating an audio source for the Web Audio API from an audio track. This method only works on complete file data, not fragments of audio file data.
`Older callback syntax:
baseAudioContext.decodeAudioData(ArrayBuffer, successCallback, errorCallback);
Newer promise-based syntax:
Promise<decodedData> baseAudioContext.decodeAudioData(ArrayBuffer);`
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/decodeAudioData
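A sketch of the promise-based form, fetching a file and playing the decoded result. The `audio` alias is an assumption, as is the wrapper passing through the underlying promise; the file path is purely illustrative.

```clojure
;; Sketch only; the `audio` alias and promise pass-through are assumptions.
(let [ctx (js/AudioContext.)]
  (-> (js/fetch "audio/track.ogg")            ; hypothetical file path
      (.then #(.arrayBuffer %))
      (.then #(audio/decode-audio-data ctx %))
      (.then (fn [decoded]
               (let [src (audio/create-buffer-source ctx)]
                 (set! (.-buffer src) decoded) ; play the decoded AudioBuffer
                 (.connect src (audio/destination ctx))
                 (.start src))))))
```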
(destination this)
Property.
[Read Only]
Returns an `web.audio.AudioDestinationNode` representing the final
destination of all audio in the context.
baseAudioContext.destination;
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/destination
(listener this)
Property.
[Read Only]
Returns an `web.audio.AudioListener` object, used for 3D spatialization.
baseAudioContext.listener;
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/listener
(onstatechange this)
Property.
The following snippet is taken from our AudioContext states demo (see
it running live). The onstatechange handler is used to log the new state
to the console every time it changes.
baseAudioContext.onstatechange = function() { ... };
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/onstatechange
(sample-rate this)
Property.
[Read Only]
The sampleRate property of the `web.audio.BaseAudioContext` interface
returns a floating point number representing the sample rate, in samples
per second, used by all nodes in this audio context.
baseAudioContext.sampleRate;
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/sampleRate
(set-onstatechange! this val)
Property.
The following snippet is taken from our AudioContext states demo (see
it running live). The onstatechange handler is used to log the new state
to the console every time it changes.
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/onstatechange
(state this)
Property.
[Read Only]
A `web.DOMString` representing the current state of the context.
Possible values are: suspended, running, and closed.
baseAudioContext.state;
See also: https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/state
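A sketch of reacting to the state, e.g. resuming an `AudioContext` that an autoplay policy left suspended. The `audio` alias is an assumption; note that `resume` is an `AudioContext` method, not available on offline contexts.

```clojure
;; Sketch only; the `audio` alias is an assumption.
(let [ctx (js/AudioContext.)]
  (when (= "suspended" (audio/state ctx))
    (.resume ctx)))                ; resume is an AudioContext method
```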