web.audio.AnalyserNode

The AnalyserNode interface represents a node able to provide real-time frequency and time-domain analysis information. It is an `web.audio.AudioNode` that passes the audio stream unchanged from the input to the output, but allows you to take the generated data, process it, and create audio visualizations.
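As an illustration, a minimal ClojureScript sketch using plain JS interop (not this library's wrapper functions, which are not shown on this page) that reads one frame of frequency data; the shared `ctx` context is an assumption:

```clojure
;; Hypothetical JS-interop sketch; the shared `ctx` is an assumption.
(defonce ctx (js/AudioContext.))

(defn spectrum-frame
  "Connects `source-node` to a fresh analyser and returns one frame of
  byte frequency data (a Uint8Array of length frequencyBinCount)."
  [source-node]
  (let [analyser (.createAnalyser ctx)
        data     (js/Uint8Array. (.-frequencyBinCount analyser))]
    (.connect source-node analyser)
    (.getByteFrequencyData analyser data)
    data))
```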

web.audio.AudioBuffer

The AudioBuffer interface represents a short audio asset residing in memory, created from an audio file using the `AudioContext.decodeAudioData()` method, or from raw data using `AudioContext.createBuffer()`. Once put into an AudioBuffer, the audio can then be played by being passed into an `web.audio.AudioBufferSourceNode`.

web.audio.AudioBufferSourceNode

The AudioBufferSourceNode interface is an `web.audio.AudioScheduledSourceNode` which represents an audio source consisting of in-memory audio data, stored in an `web.audio.AudioBuffer`. It's especially useful for playing back audio which has particularly stringent timing accuracy requirements, such as for sounds that must match a specific rhythm and can be kept in memory rather than being played from disk or the network.
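A hedged JS-interop sketch of the usual fetch/decode/play pattern; the URL argument and the shared `ctx` are assumptions:

```clojure
;; Hypothetical JS-interop sketch; not this library's wrappers.
(defonce ctx (js/AudioContext.))

(defn play-sample!
  "Fetches an audio file, decodes it into an AudioBuffer, and plays it."
  [url]
  (-> (js/fetch url)
      (.then #(.arrayBuffer %))
      (.then #(.decodeAudioData ctx %))
      (.then (fn [audio-buffer]
               (let [src (.createBufferSource ctx)]
                 (set! (.-buffer src) audio-buffer)
                 (.connect src (.-destination ctx))
                 (.start src))))))
```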

web.audio.AudioContext

The AudioContext interface represents an audio-processing graph built from audio modules linked together, each represented by an `web.audio.AudioNode`.
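A minimal JS-interop sketch of such a graph, linking an oscillator through a gain node to the speakers (plain interop, not this library's wrappers):

```clojure
;; Hypothetical JS-interop sketch: oscillator -> gain -> speakers.
(defonce ctx (js/AudioContext.))

(defn beep! []
  (let [osc  (.createOscillator ctx)
        gain (.createGain ctx)]
    (.connect osc gain)
    (.connect gain (.-destination ctx))
    (set! (.-value (.-gain gain)) 0.2)          ; keep the volume low
    (.start osc)
    (.stop osc (+ (.-currentTime ctx) 0.5))))   ; half-second beep
```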

web.audio.AudioContextOptions

The AudioContextOptions dictionary is used to specify configuration options when constructing a new `web.audio.AudioContext` object to represent a graph of web audio nodes.
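A small JS-interop sketch; the particular option values are illustrative assumptions:

```clojure
;; Hypothetical JS-interop sketch; option values are illustrative only.
(def ctx
  (js/AudioContext. #js {:latencyHint "interactive"   ; or "balanced" / "playback"
                         :sampleRate  44100}))
```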

web.audio.AudioDestinationNode

AudioDestinationNode has no output (as it is the output, no more `web.audio.AudioNode` can be linked after it in the audio graph) and one input. The number of channels in the input must be between 0 and the maxChannelCount value or an exception is raised.

web.audio.AudioListener

The AudioListener interface represents the position and orientation of the unique person listening to the audio scene, and is used in audio spatialization. All `web.audio.PannerNode`s spatialize in relation to the AudioListener stored in the `BaseAudioContext.listener` attribute.

web.audio.AudioNode

The AudioNode interface is a generic interface for representing an audio processing module. Examples include audio sources, the audio destination, and intermediate processing modules such as filters or volume controls.

web.audio.AudioParam

The Web Audio API's AudioParam interface represents an audio-related parameter, usually a parameter of an `web.audio.AudioNode` (such as `GainNode.gain`).
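A brief JS-interop sketch of AudioParam automation (a two-second fade-in on a gain param); the shared `ctx` and the passed-in GainNode are assumptions:

```clojure
;; Hypothetical JS-interop sketch of AudioParam scheduling.
(defonce ctx (js/AudioContext.))

(defn fade-in! [gain-node]
  (let [gain (.-gain gain-node)            ; the AudioParam
        now  (.-currentTime ctx)]
    ;; exponential ramps cannot start from 0, so start just above it
    (.setValueAtTime gain 0.0001 now)
    (.exponentialRampToValueAtTime gain 1.0 (+ now 2))))
```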

web.audio.AudioParamDescriptor

The AudioParamDescriptor dictionary of the Web Audio API specifies properties for `web.audio.AudioParam` objects.

web.audio.AudioParamMap

The Web Audio API interface AudioParamMap represents a set of multiple audio parameters, each described as a mapping of a `web.dom.DOMString` identifying the parameter to the `web.audio.AudioParam` object representing its value.

web.audio.AudioScheduledSourceNode

The AudioScheduledSourceNode interface—part of the Web Audio API—is a parent interface for several types of audio source node interfaces which share the ability to be started and stopped, optionally at specified times. Specifically, this interface defines the `start()` and `stop()` methods, as well as the `onended` event handler.

web.audio.AudioScheduledSourceNode.ev

AudioScheduledSourceNode Events.

web.audio.AudioTrack

The AudioTrack interface represents a single audio track from one of the HTML media elements, `<audio>` or `<video>`.

web.audio.AudioTrackList

The AudioTrackList interface is used to represent a list of the audio tracks contained within a given HTML media element, with each track represented by a separate `web.audio.AudioTrack` object in the list.

web.audio.AudioTrackList.ev

AudioTrackList Events.

web.audio.AudioWorkletGlobalScope

The AudioWorkletGlobalScope interface of the Web Audio API represents a global execution context for user-supplied code, which defines custom `web.audio.AudioWorkletProcessor`-derived classes. Each `web.audio.BaseAudioContext` has a single `web.audio.AudioWorklet` available under the `audioWorklet` property, which runs its code in a single AudioWorkletGlobalScope.

web.audio.AudioWorkletNode

The AudioWorkletNode interface of the Web Audio API represents a base class for a user-defined `web.audio.AudioNode`, which can be connected to an audio routing graph along with other nodes. It has an associated `web.audio.AudioWorkletProcessor`, which does the actual audio processing in a Web Audio rendering thread.

web.audio.AudioWorkletNodeOptions

The AudioWorkletNodeOptions dictionary of the Web Audio API is used to specify configuration options when constructing a new `web.audio.AudioWorkletNode` object for custom audio processing.

web.audio.AudioWorkletProcessor

The AudioWorkletProcessor interface of the Web Audio API represents the audio processing code behind a custom `web.audio.AudioWorkletNode`. It lives in the `web.audio.AudioWorkletGlobalScope` and runs on the Web Audio rendering thread. In turn, an `web.audio.AudioWorkletNode` based on it runs on the main thread.

web.audio.BiquadFilterNode

The BiquadFilterNode interface represents a simple low-order filter, and is created using the `AudioContext.createBiquadFilter()` method. It is an `web.audio.AudioNode` that can represent different kinds of filters, tone control devices, and graphic equalizers.
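A hedged JS-interop sketch that inserts a low-pass biquad between a source and the destination; the cutoff and Q values are illustrative assumptions:

```clojure
;; Hypothetical JS-interop sketch; filter settings are illustrative only.
(defonce ctx (js/AudioContext.))

(defn low-pass!
  "Routes `source-node` through a 1 kHz low-pass filter to the destination."
  [source-node]
  (let [filt (.createBiquadFilter ctx)]
    (set! (.-type filt) "lowpass")
    (set! (.-value (.-frequency filt)) 1000)   ; cutoff in Hz
    (set! (.-value (.-Q filt)) 1)
    (.connect source-node filt)
    (.connect filt (.-destination ctx))
    filt))
```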

web.audio.BlobEvent

The BlobEvent interface represents events associated with a `web.files.Blob`. These blobs are typically, but not necessarily, associated with media content.

web.audio.ConstantSourceNode

The ConstantSourceNode interface—part of the Web Audio API—represents an audio source (based upon `web.audio.AudioScheduledSourceNode`) whose output is a single unchanging value. This makes it useful for cases in which you need a constant value coming in from an audio source. In addition, it can be used like a constructible `web.audio.AudioParam` by automating the value of its `offset` or by connecting another node to it; see Controlling multiple parameters with ConstantSourceNode.
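A hedged JS-interop sketch of the "one value drives several parameters" use case; note that the source's output is added to each param's own value:

```clojure
;; Hypothetical JS-interop sketch; not this library's wrappers.
(defonce ctx (js/AudioContext.))

(defn link-volumes!
  "Drives the gain of both nodes from a single constant source, so changing
  the source's offset changes both volumes together."
  [gain-a gain-b]
  (let [const-src (.createConstantSource ctx)]
    (set! (.-value (.-offset const-src)) 0.5)
    ;; zero out the params' own values so the constant source alone sets them
    (set! (.-value (.-gain gain-a)) 0)
    (set! (.-value (.-gain gain-b)) 0)
    (.connect const-src (.-gain gain-a))
    (.connect const-src (.-gain gain-b))
    (.start const-src)
    const-src))
```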

web.audio.ConvolverNode

The ConvolverNode interface is an `web.audio.AudioNode` that performs a Linear Convolution on a given `web.audio.AudioBuffer`, often used to achieve a reverb effect. A ConvolverNode always has exactly one input and one output.

web.audio.core

web.audio interfaces.

No vars found in this namespace.

web.audio.DelayNode

The DelayNode interface represents a delay-line; an `web.audio.AudioNode` audio-processing module that causes a delay between the arrival of the input data and its propagation to the output.
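A small JS-interop sketch: a half-second echo tap alongside the dry signal (the delay values are illustrative assumptions):

```clojure
;; Hypothetical JS-interop sketch; delay values are illustrative only.
(defonce ctx (js/AudioContext.))

(defn add-echo!
  "Sends `source-node` both directly to the speakers and through a 0.5 s delay."
  [source-node]
  (let [delay (.createDelay ctx 1.0)]            ; 1.0 = maximum delay in seconds
    (set! (.-value (.-delayTime delay)) 0.5)
    (.connect source-node (.-destination ctx))   ; dry signal
    (.connect source-node delay)
    (.connect delay (.-destination ctx))         ; delayed copy
    delay))
```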

web.audio.DisplayMediaStreamConstraints

The DisplayMediaStreamConstraints dictionary is used to specify whether or not to include video and/or audio tracks in the `web.streams.MediaStream` to be returned by `getDisplayMedia()`, as well as what type of processing must be applied to the tracks.

web.audio.DynamicsCompressorNode

Inherits properties from its parent, `web.audio.AudioNode`.

web.audio.GainNode

The GainNode interface represents a change in volume. It is an `web.audio.AudioNode` audio-processing module that causes a given gain to be applied to the input data before its propagation to the output. A GainNode always has exactly one input and one output, both with the same number of channels.
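A minimal JS-interop sketch of a reusable volume control built on GainNode (plain interop, not this library's wrappers):

```clojure
;; Hypothetical JS-interop sketch.
(defonce ctx (js/AudioContext.))

(defn volume-control
  "Wraps `source-node` in a GainNode routed to the destination and returns it."
  [source-node]
  (let [gain (.createGain ctx)]
    (.connect source-node gain)
    (.connect gain (.-destination ctx))
    gain))

(defn set-volume! [gain-node v]
  ;; v is a linear gain factor: 0.0 = silent, 1.0 = unchanged.
  (set! (.-value (.-gain gain-node)) v))
```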

web.audio.HTMLAudioElement

The HTMLAudioElement interface provides access to the properties of `<audio>` elements, as well as methods to manipulate them. It derives from the `web.media.HTMLMediaElement` interface.

web.audio.IIRFilterNode

The IIRFilterNode interface of the Web Audio API is a `web.audio.AudioNode` processor which implements a general infinite impulse response (IIR) filter; this type of filter can be used to implement tone control devices and graphic equalizers as well. It lets the parameters of the filter response be specified, so that it can be tuned as needed.
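A hedged JS-interop sketch; the coefficient vectors below are simple illustrative assumptions (a crude one-pole smoothing filter), not values taken from this library:

```clojure
;; Hypothetical JS-interop sketch; coefficients are illustrative only.
(defonce ctx (js/AudioContext.))

(defn smooth-filter
  "Creates an IIRFilterNode from feedforward/feedback coefficients and
  routes `source-node` through it."
  [source-node]
  (let [filt (.createIIRFilter ctx #js [0.5 0.5] #js [1.0 -0.5])]
    (.connect source-node filt)
    (.connect filt (.-destination ctx))
    filt))
```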

web.audio.MediaCapabilities

The MediaCapabilities interface of the Media Capabilities API provides information about the decoding abilities of the device, system, and browser. The API can be used to query the browser about the decoding abilities of the device based on codecs, profile, resolution, and bitrates. The information can be used to serve optimal media streams to the user and determine if playback should be smooth and power efficient.

web.audio.MediaDeviceInfo

The MediaDeviceInfo interface contains information that describes a single media input or output device.

web.audio.MediaDevices

The MediaDevices interface provides access to connected media input devices like cameras and microphones, as well as screen sharing. In essence, it lets you obtain access to any hardware source of media data.
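A minimal JS-interop sketch requesting microphone access; error handling is only sketched:

```clojure
;; Hypothetical JS-interop sketch.
(defn request-mic! []
  (-> (.getUserMedia (.-mediaDevices js/navigator) #js {:audio true})
      (.then (fn [stream]
               (doseq [track (.getAudioTracks stream)]
                 (js/console.log "got audio track:" (.-label track)))
               stream))
      (.catch #(js/console.error "getUserMedia failed:" %))))
```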

web.audio.MediaDevices.ev

MediaDevices Events.

web.audio.MediaError

The MediaError interface represents an error which occurred while handling media in an HTML media element based on `web.media.HTMLMediaElement`, such as `<audio>` or `<video>`.

web.audio.MediaKeys

The MediaKeys interface of the EncryptedMediaExtensions API represents a set of keys that an associated `web.media.HTMLMediaElement` can use for decryption of media data during playback.

web.audio.MediaKeySession

The MediaKeySession interface of the EncryptedMediaExtensions API represents a context for message exchange with a content decryption module (CDM).

web.audio.MediaKeySystemAccess

The MediaKeySystemAccess interface of the EncryptedMediaExtensions API provides access to a Key System for decryption and/or a content protection provider. You can request an instance of this object using the `Navigator.requestMediaKeySystemAccess()` method.

web.audio.MediaMetadata

The MediaMetadata interface of the Media Session API allows a web page to provide rich media metadata for display in a platform UI.

web.audio.MediaRecorder

The MediaRecorder interface of the MediaStream Recording API provides functionality to easily record media. It is created using the `MediaRecorder()` constructor.
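A hedged JS-interop sketch that records three seconds of microphone audio into a Blob; the timeout-based stop and the `on-blob` callback are assumptions for illustration:

```clojure
;; Hypothetical JS-interop sketch.
(defn record-3s! [on-blob]
  (-> (.getUserMedia (.-mediaDevices js/navigator) #js {:audio true})
      (.then (fn [stream]
               (let [rec    (js/MediaRecorder. stream)
                     chunks (atom [])]
                 (set! (.-ondataavailable rec)
                       #(swap! chunks conj (.-data %)))
                 (set! (.-onstop rec)
                       #(on-blob (js/Blob. (clj->js @chunks))))
                 (.start rec)
                 (js/setTimeout (fn [] (.stop rec)) 3000)))))) ; stop after 3 s
```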

web.audio.MediaRecorder.ev

MediaRecorder Events.

web.audio.MediaRecorderErrorEvent

The MediaRecorderErrorEvent interface represents errors returned by the MediaStream Recording API. It is an `web.event.Event` object that encapsulates a reference to a `web.dom.DOMException` describing the error that occurred.

web.audio.MediaSession

The MediaSession interface of the Media Session API allows a web page to provide custom behaviors for standard media playback interactions.

web.audio.MediaSource

The MediaSource interface of the Media Source Extensions API represents a source of media data for an `web.media.HTMLMediaElement` object. A MediaSource object can be attached to a `web.media.HTMLMediaElement` to be played in the user agent.

web.audio.MediaStreamAudioDestinationNode

Inherits properties from its parent, `web.audio.AudioNode`.

web.audio.MediaStreamAudioSourceNode

The MediaStreamAudioSourceNode interface is a type of `web.audio.AudioNode` which operates as an audio source whose media is received from a `web.streams.MediaStream` obtained using the WebRTC or Media Capture and Streams APIs.
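A small JS-interop sketch feeding the microphone into the audio graph (here routed straight to the output):

```clojure
;; Hypothetical JS-interop sketch; the shared `ctx` is an assumption.
(defonce ctx (js/AudioContext.))

(defn monitor-mic! []
  (-> (.getUserMedia (.-mediaDevices js/navigator) #js {:audio true})
      (.then (fn [stream]
               (let [src (.createMediaStreamSource ctx stream)]
                 ;; connect to an analyser or effects chain as needed;
                 ;; here the stream goes straight to the speakers
                 (.connect src (.-destination ctx))
                 src)))))
```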

web.audio.MediaStreamAudioSourceOptions

The MediaStreamAudioSourceOptions dictionary provides configuration options used when creating a `web.audio.MediaStreamAudioSourceNode` using its constructor.

web.audio.MediaStreamConstraints

The MediaStreamConstraints dictionary is used when calling `getUserMedia()` to specify what kinds of tracks should be included in the returned `web.streams.MediaStream` and, optionally, to establish constraints for those tracks' settings.

web.audio.MediaStreamTrack

The MediaStreamTrack interface represents a single media track within a stream; typically, these are audio or video tracks, but other track types may exist as well.

web.audio.MediaStreamTrack.ev

MediaStreamTrack Events.

web.audio.MediaStreamTrackAudioSourceOptions

The MediaStreamTrackAudioSourceOptions dictionary is used when specifying options to the `MediaStreamTrackAudioSourceNode()` constructor.

web.audio.MediaTrackSettings

The MediaTrackSettings dictionary is used to return the current values configured for each of a `web.audio.MediaStreamTrack`'s settings. These values will adhere as closely as possible to any constraints previously described using a `web.streams.MediaTrackConstraints` object and set using `applyConstraints()`, and will adhere to the default constraints for any properties whose constraints haven't been changed, or whose customized constraints couldn't be matched.

web.audio.OfflineAudioCompletionEvent

The Web Audio API OfflineAudioCompletionEvent interface represents events that occur when the processing of an `web.audio.OfflineAudioContext` is terminated. The complete event implements this interface.

web.audio.OfflineAudioContext

The OfflineAudioContext interface is an `web.audio.AudioContext` interface representing an audio-processing graph built from `web.audio.AudioNode`s linked together. In contrast with a standard `web.audio.AudioContext`, an OfflineAudioContext doesn't render the audio to the device hardware; instead, it generates it, as fast as it can, and outputs the result to an `web.audio.AudioBuffer`.
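A brief JS-interop sketch rendering one second of a 440 Hz tone offline; the channel count and sample rate are illustrative assumptions:

```clojure
;; Hypothetical JS-interop sketch; parameters are illustrative only.
(defn render-tone []
  (let [sample-rate 44100
        octx (js/OfflineAudioContext. 1 sample-rate sample-rate)  ; 1 channel, 1 second
        osc  (.createOscillator octx)]
    (set! (.-value (.-frequency osc)) 440)
    (.connect osc (.-destination octx))
    (.start osc)
    (-> (.startRendering octx)
        (.then (fn [buffer]
                 (js/console.log "rendered" (.-length buffer) "frames")
                 buffer)))))
```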

web.audio.OfflineAudioContext.ev

OfflineAudioContext Events.

web.audio.OscillatorNode

The OscillatorNode interface represents a periodic waveform, such as a sine wave. It is an `web.audio.AudioScheduledSourceNode` audio-processing module that causes a specified frequency of a given wave to be created—in effect, a constant tone.
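A minimal JS-interop sketch of a two-second square-wave tone; the frequency and duration are illustrative assumptions:

```clojure
;; Hypothetical JS-interop sketch.
(defonce ctx (js/AudioContext.))

(defn square-tone! []
  (let [osc (.createOscillator ctx)]
    (set! (.-type osc) "square")                 ; "sine", "square", "sawtooth", "triangle"
    (set! (.-value (.-frequency osc)) 220)
    (.connect osc (.-destination ctx))
    (.start osc)
    (.stop osc (+ (.-currentTime ctx) 2))))
```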

web.audio.OverconstrainedError

The OverconstrainedError interface of the Media Capture and Streams API indicates that the set of desired capabilities for the current MediaStreamTrack cannot currently be met. When this event is thrown on a MediaStreamTrack, it is muted until either the current constraints can be established or until satisfiable constraints are applied.

web.audio.PannerNode

A PannerNode always has exactly one input and one output: the input can be mono or stereo but the output is always stereo (2 channels); you can't have panning effects without at least two audio channels!

web.audio.RTCDTMFSender

Listen to these events using `addEventListener()` or by assigning an event listener to the oneventname property of this interface.

web.audio.RTCDTMFSender.ev

RTCDTMFSender Events.

web.audio.RTCIceTransport

The RTCIceTransport interface provides access to information about the ICE transport layer over which the data is being sent and received.

web.audio.RTCIceTransport.ev

RTCIceTransport Events.

web.audio.RTCPeerConnection

The RTCPeerConnection interface represents a WebRTC connection between the local computer and a remote peer. It provides methods to connect to a remote peer, maintain and monitor the connection, and close the connection once it's no longer needed.

web.audio.RTCRtpSender

The RTCRtpSender interface provides the ability to control and obtain details about how a particular `web.audio.MediaStreamTrack` is encoded and sent to a remote peer.

web.audio.RTCSessionDescription

The RTCSessionDescription interface describes one end of a connection—or potential connection—and how it's configured. Each RTCSessionDescription consists of a description `type` indicating which part of the offer/answer negotiation process it describes and of the SDP descriptor of the session.

web.audio.SourceBuffer

The SourceBuffer interface represents a chunk of media to be passed into an `web.media.HTMLMediaElement` and played, via a `web.audio.MediaSource` object. This can be made up of one or several media segments.

web.audio.SourceBufferList

The SourceBufferList interface represents a simple container list for multiple `web.audio.SourceBuffer` objects.

web.audio.StereoPannerNode

The `pan` property takes a unitless value between -1 (full left pan) and 1 (full right pan). This interface was introduced as a much simpler way to apply a simple panning effect than having to use a full `web.audio.PannerNode`.
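A small JS-interop sketch panning a source most of the way to the right; the pan value is an illustrative assumption:

```clojure
;; Hypothetical JS-interop sketch.
(defonce ctx (js/AudioContext.))

(defn pan-right! [source-node]
  (let [panner (.createStereoPanner ctx)]
    (set! (.-value (.-pan panner)) 0.8)     ; -1 = full left, 0 = centre, 1 = full right
    (.connect source-node panner)
    (.connect panner (.-destination ctx))
    panner))
```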

web.audio.TrackDefault

The TrackDefault interface provides a `web.audio.SourceBuffer` with kind, label, and language information for tracks that do not contain this information in the initialization segments of a media chunk.

web.audio.TrackDefaultList

The TrackDefaultList interface represents a simple container list for multiple `web.audio.TrackDefault` objects.

web.audio.TrackEvent

The TrackEvent interface, which is part of the HTML DOM specification, is used for events which represent changes to a set of available tracks on an HTML media element; these events are addtrack and removetrack.

web.audio.WaveShaperNode

A WaveShaperNode always has exactly one input and one output.
