diff --git a/docs/classes/AudioSession.html b/docs/classes/AudioSession.html
index 5bff4e34..26c4ddc0 100644
--- a/docs/classes/AudioSession.html
+++ b/docs/classes/AudioSession.html
@@ -1,4 +1,4 @@
-
Static configure
Static get - Gets the available audio outputs for use with selectAudioOutput.
startAudioSession must be called prior to using this method.
For Android, will return if available:
react-native-avroutepicker for a native platform control.
Returns: the available audio output types.
Static select - Select the provided audio output if available.
startAudioSession must be called prior to using this method.
A deviceId retrieved from getAudioOutputs.
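For illustration, a minimal sketch of that flow, assuming the truncated entries above resolve to the startAudioSession, getAudioOutputs and selectAudioOutput statics that the descriptions mention by name:
import { AudioSession } from '@livekit/react-native';

// Sketch: route audio to the first available output.
async function routeToFirstOutput() {
  // startAudioSession must be called prior to using getAudioOutputs/selectAudioOutput.
  await AudioSession.startAudioSession();

  // Returns the available audio output types; availability differs per platform.
  const outputs = await AudioSession.getAudioOutputs();

  if (outputs.length > 0) {
    // Pass a deviceId retrieved from getAudioOutputs.
    await AudioSession.selectAudioOutput(outputs[0]);
  }
}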
Static set - Directly change the AVAudioSession category/mode.
The configuration to use. Null values will be omitted and the existing values will be unchanged.
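As a hedged sketch of changing the category/mode directly (the method name setAppleAudioConfiguration and the field names below are assumptions, since the entry above only shows a truncated name):
import { AudioSession } from '@livekit/react-native';

// Sketch only: the method and field names are assumed, not confirmed by this page.
// Omitted (null) values leave the existing AVAudioSession values unchanged.
AudioSession.setAppleAudioConfiguration({
  audioCategory: 'playAndRecord',
  audioCategoryOptions: ['allowBluetooth', 'mixWithOthers'],
  audioMode: 'videoChat',
});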
Static set
Static show
Static start
Static stop
Experimental
Beta
Visualizes audio signals from a TrackReference as bars.
If the state prop is set, it automatically transitions between VoiceAssistant states.
For VoiceAssistant state transitions, this component requires a voice assistant agent running with livekit-agents >= 0.9.0.
function SimpleVoiceAssistant() {
const { state, audioTrack } = useVoiceAssistant();
return (
<BarVisualizer
state={state}
trackRef={audioTrack}
/>
);
}
The LiveKitRoom component provides the room context to all its child components.
It is generally the starting point of your LiveKit app and the root of the LiveKit component tree.
It provides the room state as a React context to all child components, so you don't have to pass it yourself.
<LiveKitRoom
token='<livekit-token>'
serverUrl='<url-to-livekit-server>'
connect={true}
>
...
</LiveKitRoom>
VideoTrack component for displaying video tracks in a React Native application. It supports both local and remote video tracks from LiveKit, and handles adaptive streaming for remote tracks.
See VideoTrackProps for details.
Returns: a React component that renders the given video track.
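A minimal usage sketch, assuming the trackRef prop from VideoTrackProps and the useTracks hook for collecting camera tracks:
import * as React from 'react';
import { StyleSheet } from 'react-native';
import { VideoTrack, useTracks } from '@livekit/react-native';
import { Track } from 'livekit-client';

// Sketch: render every camera track in the room, local and remote alike.
function CameraTiles() {
  const trackRefs = useTracks([Track.Source.Camera]);
  return (
    <>
      {trackRefs.map((trackRef) => (
        <VideoTrack
          key={trackRef.publication?.trackSid ?? trackRef.participant.identity}
          trackRef={trackRef}
          style={styles.tile}
        />
      ))}
    </>
  );
}

const styles = StyleSheet.create({
  tile: { flex: 1 },
});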
Deprecated: use VideoTrack and VideoTrackProps instead.
Set the log level for both the @livekit/react-native package and the @livekit-client package.
To set the @livekit-client log independently, use the liveKitClientLogLevel prop on the options object.
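A small sketch of that call; liveKitClientLogLevel is named in the description above, while the string level values and the options argument position are assumptions:
import { setLogLevel } from '@livekit/react-native';

// Sketch: verbose SDK logs, quieter livekit-client logs.
setLogLevel('debug', {
  liveKitClientLogLevel: 'warn',
});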
Default sort for participants; it'll order participants by:
Optional localParticipant: default
Handles setting the appropriate AVAudioSession options automatically depending on the audio track states of the Room.
Optional onConfigureNativeAudio: ((trackState, preferSpeakerOutput) => AppleAudioConfiguration) - A custom method for determining the options used.
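A minimal sketch of wiring the hook up inside a component; the exact argument list (room instance plus a prefer-speaker flag) is an assumption, and the onConfigureNativeAudio callback above can be supplied to override the computed configuration:
import { Room } from 'livekit-client';
import { useIOSAudioManagement } from '@livekit/react-native';

// Sketch: let the hook derive AVAudioSession options from the Room's track state.
function RoomAudioManager({ room }: { room: Room }) {
  useIOSAudioManagement(room, true);
  return null;
}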
A hook for tracking the volume of an audio track across multiple frequency bands.
Optional trackOrTrackReference: TrackReferenceOrPlaceholder
Optional options: MultiBandTrackVolumeOptions
Returns: a number array containing the volume for each frequency band.
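A usage sketch, assuming the hook and the TrackReferenceOrPlaceholder type are importable as shown and that the returned values are normalized volumes:
import * as React from 'react';
import { View } from 'react-native';
import { useMultibandTrackVolume } from '@livekit/react-native';
import type { TrackReferenceOrPlaceholder } from '@livekit/components-react';

// Sketch: draw one bar per frequency band for the given track.
function VolumeBars({ trackRef }: { trackRef: TrackReferenceOrPlaceholder }) {
  const volumes = useMultibandTrackVolume(trackRef, { bands: 5 });
  return (
    <View style={{ flexDirection: 'row', alignItems: 'flex-end', height: 40 }}>
      {volumes.map((volume, i) => (
        <View
          key={i}
          style={{ width: 6, marginHorizontal: 2, height: Math.max(2, volume * 40), backgroundColor: 'gray' }}
        />
      ))}
    </View>
  );
}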
Deprecated: use useRemoteParticipant or useLocalParticipant instead.
Experimental
Optional options: RoomOptions - wrap your components in a
Use this SDK to add realtime video, audio and data features to your React Native app. By connecting to LiveKit Cloud or a self-hosted server, you can quickly build applications such as multi-modal AI, live streaming, or video calls with just a few lines of code.
@@ -28,16 +28,42 @@
Once the @livekit/react-native-webrtc dependency is installed, one last step is needed to finish the installation:
In your MainApplication.java file:
-import com.livekit.reactnative.LiveKitReactNative;
import com.livekit.reactnative.audio.AudioType;
public class MainApplication extends Application implements ReactApplication {
@Override
public void onCreate() {
// Place this above any other RN related initialization
// When AudioType is omitted, it'll default to CommunicationAudioType.
// Use MediaAudioType if user is only consuming audio, and not publishing.
LiveKitReactNative.setup(this, new AudioType.CommunicationAudioType());
//...
}
}
+Android
+
+Java
+
+In your MainApplication.java file:
+import com.livekit.reactnative.LiveKitReactNative;
import com.livekit.reactnative.audio.AudioType;
public class MainApplication extends Application implements ReactApplication {
@Override
public void onCreate() {
// Place this above any other RN related initialization
// When AudioType is omitted, it'll default to CommunicationAudioType.
// Use MediaAudioType if user is only consuming audio, and not publishing.
LiveKitReactNative.setup(this, new AudioType.CommunicationAudioType());
//...
}
}
-Or in your MainApplication.kt if you are using RN 0.73+
-Kotlin
import com.livekit.reactnative.LiveKitReactNative
import com.livekit.reactnative.audio.AudioType
class MainApplication : Application, ReactApplication() {
override fun onCreate() {
// Place this above any other RN related initialization
// When AudioType is omitted, it'll default to CommunicationAudioType.
// Use MediaAudioType if user is only consuming audio, and not publishing.
LiveKitReactNative.setup(this, AudioType.CommunicationAudioType())
//...
}
}
+
+
+
+
+Kotlin
+
+In your MainApplication.kt file:
+import com.livekit.reactnative.LiveKitReactNative
import com.livekit.reactnative.audio.AudioType
class MainApplication : Application, ReactApplication() {
override fun onCreate() {
// Place this above any other RN related initialization
// When AudioType is omitted, it'll default to CommunicationAudioType.
// Use MediaAudioType if user is only consuming audio, and not publishing.
LiveKitReactNative.setup(this, AudioType.CommunicationAudioType())
//...
}
}
-
-iOS
In your AppDelegate.m file:
+
+
+iOS
+
+Objective-C
+
+In your AppDelegate.m file:
#import "LivekitReactNative.h"
#import "WebRTCModuleOptions.h"
@implementation AppDelegate
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
// Place this above any other RN related initialization
[LivekitReactNative setup];
// Uncomment the following lines if you want to use the camera in the background
// Requires voip background mode and iOS 18+.
// WebRTCModuleOptions *options = [WebRTCModuleOptions sharedInstance];
// options.enableMultitaskingCameraAccess = YES;
//...
}
+
+
+
+
+Swift
+
+In your AppDelegate.swift file:
+import livekit_react_native
import livekit_react_native_webrtc
@main
class AppDelegate: UIResponder, UIApplicationDelegate {
func application(
_ application: UIApplication,
didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil
) -> Bool {
// Place this above any other RN related initialization
LivekitReactNative.setup()
// Uncomment the following lines if you want to use the camera in the background
// Requires voip background mode and iOS 18+.
// let options = WebRTCModuleOptions.sharedInstance()
// options.enableMultitaskingCameraAccess = true
// ...
}
}
+
+
+
Expo
LiveKit is available on Expo through development builds. You can find our Expo plugin and setup instructions here.
Example app
You can try our standalone example app here.
Usage
In your index.js file, set up the LiveKit SDK by calling registerGlobals().
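As a sketch of what that entry file can look like (App and app.json follow the standard React Native template):
// index.js
import { registerGlobals } from '@livekit/react-native';
import { AppRegistry } from 'react-native';
import App from './App';
import { name as appName } from './app.json';

// registerGlobals must run before any LiveKit/WebRTC code executes,
// so call it at the top of the entry file.
registerGlobals();

AppRegistry.registerComponent(appName, () => App);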
@@ -129,4 +155,4 @@
Resources Docs · Example apps · Cloud · Self-hosting · CLI
-
Beta
Optional bar - Number of bars that show up in the visualizer
Optional options
Optional state - If set, the visualizer will transition between different voice assistant states
Optional style - Custom React Native styles for the container.
Optional track
Optional audio - Publish audio immediately after connecting to your LiveKit room.
Optional connect - If set to true, a connection to the LiveKit room is initiated.
-Default: false
+Default: true
Optional connect - Define options for how to connect to the LiveKit server.
Optional Experimental feature
Optional on
Optional on
Optional on
Optional on
Optional on
Optional failure: MediaDeviceFailure
Optional options - Options for when creating a new room. When you pass your own room instance to this component, these options have no effect. Instead, set the options directly in the room instance.
Optional room - Optional room instance.
By passing your own room instance you overwrite the options parameter; make sure to set the options directly on the room instance itself.
Optional screen - Publish screen share immediately after connecting to your LiveKit room.
Default: false
https://docs.livekit.io/client-sdk-js/interfaces/ScreenShareCaptureOptions.html
URL to the LiveKit server.
For example: wss://<domain>.livekit.cloud
To simplify the implementation, undefined is also accepted as an intermediate value, but only with a valid string URL can the connection be established.
Optional simulate
A user specific access token for a client to authenticate to the room.
This token is necessary to establish a connection to the room.
To simplify the implementation, undefined is also accepted as an intermediate value, but only with a valid string token can the connection be established.
Optional video - Publish video immediately after connecting to your LiveKit room.
Alpha
Interface for configuring options for the useMultibandTrackVolume hook.
Optional bands - the number of bands to split the audio into
Optional max - cut-off frequency on the higher end
Optional min - cut-off frequency on the lower end
Optional update - update should run every x ms
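A hedged sketch of passing these options; bands is named in the text above, while minFrequency, maxFrequency and updateInterval are assumed expansions of the truncated names and may be spelled differently in the actual interface:
import { useMultibandTrackVolume } from '@livekit/react-native';
import type { TrackReferenceOrPlaceholder } from '@livekit/components-react';

// Sketch only: option names beyond `bands` are assumptions.
function useSevenBandVolume(trackRef: TrackReferenceOrPlaceholder) {
  return useMultibandTrackVolume(trackRef, {
    bands: 7,           // the number of bands to split the audio into
    minFrequency: 400,  // cut-off frequency on the lower end (Hz)
    maxFrequency: 8000, // cut-off frequency on the higher end (Hz)
    updateInterval: 50, // update should run every 50 ms
  });
}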
Deprecated: use useRemoteParticipant or useLocalParticipant instead.
Optional camera
Optional metadata
Optional microphone
Optional screen
Optional error
Optional room
Optional audio - Corresponds to Android's AudioAttributes content type.
Defaults to 'speech'.
See also https://developer.android.com/reference/android/media/AudioAttributes
Optional audio - Corresponds to Android's AudioAttributes usage type.
@@ -19,4 +19,4 @@
Defaults to false.
Optional manage - Whether LiveKit should handle managing the audio focus or not.
Defaults to true.
Optional audio
Optional audio
Optional audio
Configuration for the underlying AudioSession.
Android specific options:
By default, this is set to "speaker"
Optional android?: { Optional preferred }
Optional ios?: { Optional default }
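A hedged sketch of supplying this configuration; the configureAudio method name and the preferredOutputList / defaultOutput property names are assumed expansions of the truncated entries above:
import { AudioSession } from '@livekit/react-native';

// Sketch only: names are assumptions, verify against the exported types.
async function applyOutputPreferences() {
  await AudioSession.configureAudio({
    android: {
      // Android specific options: preferred output routing order.
      preferredOutputList: ['speaker'],
    },
    ios: {
      // By default, this is set to "speaker".
      defaultOutput: 'speaker',
    },
  });
}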
Optional bar
Optional bar
Optional bar
Optional max - decimal values from 0 to 1
Optional min - decimal values from 0 to 1
Optional mirror?: boolean
Optional object
Optional style?: ViewStyle
Optional video
Optional z
Deprecated: use VideoTrack and VideoTrackProps instead.
Options for constructing an RNKeyProvider
Optional uncrypted
Optional key
Optional key
Props for the VideoTrack component.
Optional iosPIP?: RTCIOSPIPOptions & {
Picture in picture options for this view. Disabled if not supplied.
iOS only. Requires iOS 15.0 or above, and the PIP background mode capability.
If iosPIP.enabled is true, the methods startIOSPIP and stopIOSPIP
@@ -31,4 +31,4 @@
application usually needs a maximum of two zOrder values: 0 for the remote video(s), which appear in the background, and 1 for the local video(s), which appear above the remote video(s).
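As a sketch of that layering, assuming a zOrder prop on VideoTrack as described (the track references are placeholders obtained elsewhere in your app):
import * as React from 'react';
import { StyleSheet, View } from 'react-native';
import { VideoTrack } from '@livekit/react-native';
import type { TrackReferenceOrPlaceholder } from '@livekit/components-react';

// Sketch: remote video in the background (zOrder 0), local preview on top (zOrder 1).
function CallView(props: {
  remoteTrackRef: TrackReferenceOrPlaceholder;
  localTrackRef: TrackReferenceOrPlaceholder;
}) {
  return (
    <View style={StyleSheet.absoluteFill}>
      <VideoTrack trackRef={props.remoteTrackRef} style={StyleSheet.absoluteFill} zOrder={0} />
      <VideoTrack trackRef={props.localTrackRef} style={styles.preview} zOrder={1} />
    </View>
  );
}

const styles = StyleSheet.create({
  preview: { position: 'absolute', right: 16, bottom: 16, width: 120, height: 180 },
});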
Const - A pre-configured AndroidAudioConfiguration for voice communication.
Const - A pre-configured AndroidAudioConfiguration for media playback.
Applies the provided audio configuration to the underlying AudioSession.
Must be called prior to connecting to a Room for the configuration to apply correctly.
See also useIOSAudioManagement for automatic configuration of iOS audio options.
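A hedged sketch of applying one of these presets before connecting; the configureAudio, AndroidAudioTypePresets and audioTypeOptions names are assumptions based on the descriptions above:
import { AndroidAudioTypePresets, AudioSession } from '@livekit/react-native';
import { Room } from 'livekit-client';

// Sketch: apply the media-playback preset, then start the session and connect.
async function connectWithMediaAudio(url: string, token: string) {
  await AudioSession.configureAudio({
    android: {
      // A pre-configured AndroidAudioConfiguration for media playback.
      audioTypeOptions: AndroidAudioTypePresets.media,
    },
  });
  await AudioSession.startAudioSession();

  const room = new Room();
  await room.connect(url, token);
  return room;
}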