-
-This diagram shows the flow of data from the Vision Camera through the `FrameProcessorPlugin` to the `CustomSource`, and finally to the Fishjam SDK.
-
-### Examples
-
-Here are examples illustrating how to implement the above flow for iOS and Android.
-
-<Tabs>
-
-<TabItem value="ios" label="iOS">
-
-Follow these steps to implement Vision Camera as a custom source on iOS:
-
-1. Create a CustomSource class that implements the required protocol:
-
-```swift
-import FishjamCloudClient
-
-class WebrtcVisionCameraCustomSource: CustomSource {
-  var delegate: CustomSourceDelegate?
-
-  let isScreenShare = false
-  let metadata = ["type": "camera"].toMetadata()
-  let videoParameters = VideoParameters.presetFHD43
-}
-```
-
-2. Create a FrameProcessorPlugin that will extract frames from Vision Camera and pass them to the Fishjam SDK:
-
-```swift
-import VisionCamera
-
-public class WebrtcFrameProcessorPlugin: FrameProcessorPlugin {
-  static var currentSource: WebrtcVisionCameraCustomSource?
-
-  public override func callback(_ frame: Frame, withArguments arguments: [AnyHashable: Any]?) -> Any {
-    if let customSource = WebrtcFrameProcessorPlugin.currentSource {
-      customSource.delegate?.customSource(customSource, didOutputSampleBuffer: frame.buffer, rotation: .ninety)
-    }
-    return frame
-  }
-}
-```
-
-3. Register the FrameProcessorPlugin with Vision Camera:
- - Follow the [official documentation on registering plugins](https://react-native-vision-camera.com/docs/guides/frame-processors-plugins-ios).
-
-4. Register the CustomSource with the Fishjam SDK to create a new track:
-
-```swift
-let source = WebrtcVisionCameraCustomSource()
-
-WebrtcFrameProcessorPlugin.currentSource = source
-
-try await RNFishjamProxy.add(customSource: source)
-```
-
-</TabItem>
-
-<TabItem value="android" label="Android">
-
-Follow these steps to implement Vision Camera as a custom source on Android:
-
-1. Create a CustomSource class that implements the required interface:
-
-```kotlin
-import com.fishjamcloud.client.models.CustomSource
-import com.fishjamcloud.client.models.CustomSourceConsumer
-import com.fishjamcloud.client.models.VideoParameters
-import com.fishjamcloud.client.models.Metadata
-
-class WebrtcVisionCameraCustomSource : CustomSource {
-  override val isScreenShare = false
-  override val metadata: Metadata = mapOf("type" to "camera")
-  override val videoParameters = VideoParameters.presetFHD43
-
-  var consumer: CustomSourceConsumer? = null
-    private set
-
-  override fun initialize(consumer: CustomSourceConsumer) {
-    this.consumer = consumer
-  }
-}
-```
-
-2. Create a FrameProcessorPlugin that will extract frames from Vision Camera and pass them to the Fishjam SDK:
-
-```kotlin
-import com.mrousavy.camera.frameprocessors.Frame
-import com.mrousavy.camera.frameprocessors.FrameProcessorPlugin
-import com.mrousavy.camera.frameprocessors.VisionCameraProxy
-
-class WebrtcFrameProcessorPlugin(proxy: VisionCameraProxy, options: Map<String, Any>?) : FrameProcessorPlugin() {
-  companion object {
-    var currentSource: WebrtcVisionCameraCustomSource? = null
-  }
-
-  override fun callback(frame: Frame, arguments: Map<String, Any>?): Frame {
-    currentSource?.consumer?.onImageProxyCaptured(frame.imageProxy)
-    return frame
-  }
-}
-```
-
-3. Register the FrameProcessorPlugin with Vision Camera:
- - Follow the [official documentation on registering plugins](https://react-native-vision-camera.com/docs/guides/frame-processors-plugins-android).
-
-4. Register the CustomSource with the Fishjam SDK so that tracks can be created from the new source:
-
-```kotlin
-val source = WebrtcVisionCameraCustomSource()
-
-WebrtcFrameProcessorPlugin.currentSource = source
-
-RNFishjamClient.createCustomSource(source)
-```
-
-
-</TabItem>
-
-</Tabs>
-
-#### Usage
-
-Depending on your React Native setup, create an interface for JavaScript to interact with this code. If you're using Expo, we recommend using [Expo Modules](https://docs.expo.dev/modules/overview/). If you're using a bare React Native setup, we recommend using [Turbo Modules](https://reactnative.dev/docs/turbo-native-modules-introduction).
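-
-As a sketch of what that JavaScript interface could look like with an Expo Module: the module name `FishjamCameraSource`, its `createVisionCameraTrack` function, and the `"webrtcFrameProcessor"` plugin name below are hypothetical and depend on how you register the native code above.
-
-```tsx
-import { requireNativeModule } from "expo-modules-core";
-import {
-  useFrameProcessor,
-  VisionCameraProxy,
-} from "react-native-vision-camera";
-
-// Hypothetical Expo Module wrapping the native CustomSource registration shown above
-const FishjamCameraSource = requireNativeModule("FishjamCameraSource");
-
-// The plugin name must match the one used when registering the FrameProcessorPlugin natively
-const plugin = VisionCameraProxy.initFrameProcessorPlugin("webrtcFrameProcessor", {});
-
-// Frame processor that forwards every camera frame to the native plugin,
-// which hands it over to the CustomSource registered with the Fishjam SDK
-export function useWebrtcFrameProcessor() {
-  return useFrameProcessor((frame) => {
-    "worklet";
-    plugin?.call(frame);
-  }, []);
-}
-
-// Asks the native side to add the custom source (e.g. RNFishjamProxy.add(customSource:) on iOS)
-export async function startVisionCameraTrack(): Promise<void> {
-  await FishjamCameraSource.createVisionCameraTrack();
-}
-```
-
-Pass the frame processor returned by `useWebrtcFrameProcessor` to Vision Camera's `<Camera frameProcessor={...} />` component after calling `startVisionCameraTrack`.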
diff --git a/versioned_docs/version-0.22.0/how-to/react-native/installation.mdx b/versioned_docs/version-0.22.0/how-to/react-native/installation.mdx
deleted file mode 100644
index 3ff03257..00000000
--- a/versioned_docs/version-0.22.0/how-to/react-native/installation.mdx
+++ /dev/null
@@ -1,105 +0,0 @@
----
-sidebar_position: 1
----
-
-import Tabs from "@theme/Tabs";
-import TabItem from "@theme/TabItem";
-import InstallPackage from "./_components/install-package.mdx";
-import ConfigurePermissions from "./_components/configure-permissions.mdx";
-
-# Installation
-
-## Optional: Create a New App
-
-
-<details>
-  <summary>Follow these steps to create a new mobile app</summary>
-
-If you don't have an existing project, you can create a new Expo app using a template:
-
-```bash
-npx create-expo-app@latest my-video-app
-```
-
-Next, generate the native project files with the `expo prebuild` command:
-
-```bash
-npx expo prebuild
-```
-
-You can also follow the more detailed [Expo instructions](https://docs.expo.dev/get-started/introduction/).
-
-</details>
-
-## Step 1: Install the Package
-
-Install `@fishjam-cloud/react-native-client` with your preferred package manager.
-
-<InstallPackage />
-
-## Step 2: Configure App Permissions
-
-<ConfigurePermissions />
-
-## Optional: Request Camera and Microphone Permissions
-
-:::info
-You don’t need to explicitly request permissions as they’re automatically asked for when your app needs them.
-:::
-
-If you want more control, you can use the `useCameraPermissions` and `useMicrophonePermissions` hooks to manage permissions manually. Both hooks return an array with three elements:
-
-1. `permission`: The current permission status
-2. `requestPermission`: A function to request permission
-3. `getPermission`: A function to get the current permission status
-
-Here's an example of how to use both hooks:
-
-```tsx
-import {
- useCameraPermissions,
- useMicrophonePermissions,
-} from "@fishjam-cloud/react-native-client";
-import { useEffect } from "react";
-
-// Example hook that requests both permissions when the component mounts
-export function usePermissionSetup() {
-  const [cameraPermission, requestCameraPermission, getCameraPermission] =
-    useCameraPermissions();
-
-  const [
-    microphonePermission,
-    requestMicrophonePermission,
-    getMicrophonePermission,
-  ] = useMicrophonePermissions();
-
-  useEffect(() => {
-    requestCameraPermission();
-    requestMicrophonePermission();
-  }, [requestCameraPermission, requestMicrophonePermission]);
-
-  return { cameraPermission, microphonePermission };
-}
-```
-
-**Permission Response**
-
-The `permission` object has the following properties:
-
-- `canAskAgain`: Indicates if the user can be asked again for this permission. If `false`, it is recommended to direct the user to the Settings app to enable or disable the permission.
-- `expires`: When the permission expires.
-- `granted`: Indicates if the permission is granted.
-- `status`: The current status of the permission.
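-
-For example, here is a minimal sketch (camera only, with hypothetical button labels) that reacts to these fields and sends the user to the Settings app once the prompt can no longer be shown:
-
-```tsx
-import { useCameraPermissions } from "@fishjam-cloud/react-native-client";
-import React from "react";
-import { Button, Linking } from "react-native";
-
-export function CameraPermissionGate() {
-  const [permission, requestPermission] = useCameraPermissions();
-
-  // Nothing to render once the permission is granted
-  if (permission?.granted) return null;
-
-  // The OS will not show the prompt again, so point the user to the Settings app
-  if (permission && !permission.canAskAgain) {
-    return <Button title="Open Settings" onPress={() => Linking.openSettings()} />;
-  }
-
-  return <Button title="Allow camera access" onPress={() => requestPermission()} />;
-}
-```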
-
-:::info
-You can control when and how permissions are requested by passing an `options` object to the hook.
-:::
-
-### Customizing Permission Request Behavior
-
-By default, the permission status is fetched automatically (`get` defaults to `true`), while the permission request is not triggered automatically (`request` defaults to `false`). You can change this behavior:
-
-```tsx
-import { useCameraPermissions } from "@fishjam-cloud/react-native-client";
-// Do not auto-fetch permission status, enable auto-request
-const [permission, requestPermission] = useCameraPermissions({
- get: false, // disables auto-fetch
- request: true, // enables auto-request
-});
-```
-
-Adjust these options to fit your app's needs.
diff --git a/versioned_docs/version-0.22.0/how-to/react-native/list-other-peers.mdx b/versioned_docs/version-0.22.0/how-to/react-native/list-other-peers.mdx
deleted file mode 100644
index d1335555..00000000
--- a/versioned_docs/version-0.22.0/how-to/react-native/list-other-peers.mdx
+++ /dev/null
@@ -1,48 +0,0 @@
----
-sidebar_position: 4
----
-
-# List other peers
-
-In order to see other streaming peers, you can use [`usePeers`](../../api/mobile/functions/usePeers). It will return all
-other peers, together with the tracks that they are streaming.
-
-### Example code that shows all videos
-
-```tsx
-import React from "react";
-import { View } from "react-native";
-import {
- usePeers,
- VideoRendererView,
-} from "@fishjam-cloud/react-native-client";
-
-export function ShowAllPeers() {
-  const { remotePeers, localPeer } = usePeers(); // [!code highlight]
-
-  const videoTracks = remotePeers.flatMap(
-    (peer) =>
-      peer.tracks.filter((track) => track.type === "Video" && track.isActive), // [!code highlight]
-  );
-  const localTrack = localPeer?.tracks.find((t) => t.type === "Video"); // [!code highlight]
-
-  return (
-    <View style={{ flex: 1 }}>
-      {localTrack && (
-        <VideoRendererView trackId={localTrack.id} style={{ flex: 1 }} />
-      )}
-      {videoTracks.map((track) => (
-        <VideoRendererView key={track.id} trackId={track.id} style={{ flex: 1 }} />
-      ))}
-    </View>
-  );
-}
-```
diff --git a/versioned_docs/version-0.22.0/how-to/react-native/metadata.mdx b/versioned_docs/version-0.22.0/how-to/react-native/metadata.mdx
deleted file mode 100644
index 3c4bb899..00000000
--- a/versioned_docs/version-0.22.0/how-to/react-native/metadata.mdx
+++ /dev/null
@@ -1,101 +0,0 @@
----
-sidebar_position: 8
-title: "Metadata"
-description: "How to use metadata"
----
-
-import MetadataHeader from "../_common/metadata/header.mdx";
-import JoiningRoom from "../_common/metadata/joining_room.mdx";
-import UpdatingMetadata from "../_common/metadata/updating.mdx";
-import ReadingMetadata from "../_common/metadata/reading.mdx";
-
-<MetadataHeader />
-
-<JoiningRoom>
-
-```tsx
-const FISHJAM_ID = "fishjam-id";
-const PEER_TOKEN = "some-peer-token";
-// ---cut---
-import React, { useCallback } from "react";
-import { Button } from "react-native";
-import { useConnection } from "@fishjam-cloud/react-native-client";
-
-type PeerMetadata = {
- displayName: string;
-};
-
-export function JoinRoomButton() {
-  const { joinRoom } = useConnection();
-
-  const onPressJoin = useCallback(async () => {
-    await joinRoom({
-      fishjamId: FISHJAM_ID,
-      peerToken: PEER_TOKEN,
-      peerMetadata: { displayName: "John Wick" }, // [!code highlight]
-    });
-  }, [joinRoom]);
-
-  return <Button title="Join Room" onPress={onPressJoin} />;
-}
-```
-
-</JoiningRoom>
-
-<UpdatingMetadata>
-
-```tsx
-import React, { useCallback } from "react";
-import { Button } from "react-native";
-import { useUpdatePeerMetadata } from "@fishjam-cloud/react-native-client";
-
-type PeerMetadata = {
- displayName: string;
-};
-
-export function UpdateNameButton() {
-  const { updatePeerMetadata } = useUpdatePeerMetadata(); // [!code highlight]
-
-  const onPressUpdateName = useCallback(async () => {
-    await updatePeerMetadata({ displayName: "Thomas A. Anderson" }); // [!code highlight]
-  }, [updatePeerMetadata]);
-
-  return <Button title="Update name" onPress={onPressUpdateName} />;
-}
-```
-
-</UpdatingMetadata>
-
-<ReadingMetadata>
-
-```tsx
-import React from "react";
-import { Text, View } from "react-native";
-import { usePeers } from "@fishjam-cloud/react-native-client";
-
-type PeerMetadata = {
- displayName: string;
-};
-
-type ServerMetadata = {
- realName: string;
-};
-
-export function ListAllNames() {
-  const { remotePeers } = usePeers(); // [!code highlight]
-
-  return (
-    <View>
-      {remotePeers.map((peer) => (
-        // [!code highlight:4]
-        <View key={peer.id}>
-          <Text>Display name: {peer.metadata.peer?.displayName || "Unknown"}</Text>
-          <Text>Real name: {peer.metadata.server?.realName || "Unknown"}</Text>
-        </View>
-      ))}
-    </View>
-  );
-}
-```
-
-</ReadingMetadata>
diff --git a/versioned_docs/version-0.22.0/how-to/react-native/start-streaming.mdx b/versioned_docs/version-0.22.0/how-to/react-native/start-streaming.mdx
deleted file mode 100644
index f3212a07..00000000
--- a/versioned_docs/version-0.22.0/how-to/react-native/start-streaming.mdx
+++ /dev/null
@@ -1,109 +0,0 @@
----
-sidebar_position: 3
----
-
-# Start streaming
-
-This guide shows how to stream your camera and microphone.
-
-:::tip[Enable devices before connecting]
-
-You can enable your camera and microphone before calling the connect method.
-This way, you can show the user a camera preview. Once the connect method is called,
-the enabled camera and microphone will start streaming to the Room.
-
-:::
-
-## Enable your camera
-
-First, enable your camera by calling the [`prepareCamera`](../../api/mobile/functions/useCamera#preparecamera) method.
-You can then show a camera preview with the [`VideoPreviewView`](../../api/mobile/variables/VideoPreviewView) component.
-
-```tsx
-import React, { useEffect } from "react";
-import {
- useCamera,
- VideoPreviewView,
-} from "@fishjam-cloud/react-native-client";
-
-export function ViewPreview() {
-  const { prepareCamera } = useCamera(); // [!code highlight]
-
-  useEffect(() => {
-    prepareCamera({ cameraEnabled: true }); // [!code highlight]
-  }, [prepareCamera]);
-
-  return <VideoPreviewView style={{ flex: 1 }} />;
-}
-```
-
-### Listing user cameras
-
-To list all cameras available on the device, use the [`cameras`](../../api/mobile/functions/useCamera#cameras) property from the [`useCamera`](../../api/mobile/functions/useCamera) hook.
-This way, you can either choose a camera (front/back) automatically or let the user select the camera type.
-
-To change the camera, simply call the [`switchCamera`](../../api/mobile/functions/useCamera#switchcamera) method.
-
-```tsx
-import React, { useCallback } from "react";
-import { Button } from "react-native";
-import { useCamera } from "@fishjam-cloud/react-native-client";
-
-export function FlipButton() {
-  const { cameras, switchCamera, currentCamera } = useCamera(); // [!code highlight]
-
-  const onPressFlipCamera = useCallback(() => {
-    // find the first camera facing the opposite direction from the current one
-    const otherCamera = cameras.find(
-      (camera) => camera.facingDirection !== currentCamera?.facingDirection,
-    );
-    if (otherCamera) {
-      switchCamera(otherCamera.id); // [!code highlight]
-    }
-  }, [cameras, currentCamera?.facingDirection, switchCamera]);
-
-  return <Button title="Flip Camera" onPress={onPressFlipCamera} />;
-}
-```
-
-### Disabling/enabling camera
-
-To change the camera state, use the [`toggleCamera`](../../api/mobile/functions/useCamera#togglecamera) method.
-
-```tsx
-import { Button } from "react-native";
-import React from "react";
-import { useCamera } from "@fishjam-cloud/react-native-client";
-
-export function ToggleCameraButton() {
-  const { isCameraOn, toggleCamera } = useCamera(); // [!code highlight]
-
-  return (
-    <Button
-      title={isCameraOn ? "Turn camera off" : "Turn camera on"}
-      onPress={toggleCamera}
-    />
-  );
-}
-```
-
-## Enable microphone
-
-The microphone works similarly to the camera. To enable it, call the [`toggleMicrophone`](../../api/mobile/functions/useMicrophone#togglemicrophone) method.
-
-```tsx
-import { Button } from "react-native";
-import React from "react";
-import { useMicrophone } from "@fishjam-cloud/react-native-client";
-
-export function ToggleMicrophoneButton() {
-  const { isMicrophoneOn, toggleMicrophone } = useMicrophone(); // [!code highlight]
-
-  return (
-    <Button
-      title={isMicrophoneOn ? "Turn microphone off" : "Turn microphone on"}
-      onPress={toggleMicrophone}
-    />
-  );
-}
-```
diff --git a/versioned_docs/version-0.22.0/how-to/react/_category_.json b/versioned_docs/version-0.22.0/how-to/react/_category_.json
deleted file mode 100644
index e4ca8099..00000000
--- a/versioned_docs/version-0.22.0/how-to/react/_category_.json
+++ /dev/null
@@ -1,4 +0,0 @@
-{
- "label": "React/Web",
- "position": 2
-}
diff --git a/versioned_docs/version-0.22.0/how-to/react/_common/metadata/header.mdx b/versioned_docs/version-0.22.0/how-to/react/_common/metadata/header.mdx
deleted file mode 100644
index 1602f006..00000000
--- a/versioned_docs/version-0.22.0/how-to/react/_common/metadata/header.mdx
+++ /dev/null
@@ -1,10 +0,0 @@
-Alongside audio and video, it is possible to send additional metadata with each peer. Metadata is just
-JSON that can contain arbitrary information. Its most common use is sending a user name associated with a peer.
-However, it can also be used to send the peer's camera type, application information, etc.
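-
-For example (the field names here are purely illustrative), a peer metadata payload could be as simple as:
-
-```tsx
-// Illustrative only: metadata is arbitrary JSON, so use whatever fields your app needs
-const peerMetadata = {
-  displayName: "Alice",
-  cameraType: "front",
-  appVersion: "1.2.3",
-};
-```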
-
-:::info
-
-You can also set metadata on [the server side, when adding a user to the room](../backend/server-setup#metadata). This metadata persists for the peer's lifetime and is useful for attaching information that
-can't be overwritten by the peer, such as real user names or basic permission info.
-
-:::
diff --git a/versioned_docs/version-0.22.0/how-to/react/_common/metadata/joining_room.mdx b/versioned_docs/version-0.22.0/how-to/react/_common/metadata/joining_room.mdx
deleted file mode 100644
index 517a5b30..00000000
--- a/versioned_docs/version-0.22.0/how-to/react/_common/metadata/joining_room.mdx
+++ /dev/null
@@ -1,5 +0,0 @@
-## Setting metadata when joining the room
-
-The `joinRoom` method from the `useConnection` hook has a `peerMetadata` parameter that can be used to set the peer metadata object.
-
-{props.children}
diff --git a/versioned_docs/version-0.22.0/how-to/react/_common/metadata/reading.mdx b/versioned_docs/version-0.22.0/how-to/react/_common/metadata/reading.mdx
deleted file mode 100644
index b0516650..00000000
--- a/versioned_docs/version-0.22.0/how-to/react/_common/metadata/reading.mdx
+++ /dev/null
@@ -1,9 +0,0 @@
-## Reading metadata
-
-Peer metadata is available as the `metadata` property for each peer. Therefore, when you list your peers with the `usePeers` hook, you can read
-the metadata associated with them.
-Note that the `metadata.peer` property contains only the metadata set by the client SDK (as in the examples above).
-The metadata set on the server side is available as `metadata.server`.
-Learn more about [server metadata](../backend/server-setup#metadata).
-
-{props.children}
diff --git a/versioned_docs/version-0.22.0/how-to/react/_common/metadata/updating.mdx b/versioned_docs/version-0.22.0/how-to/react/_common/metadata/updating.mdx
deleted file mode 100644
index 1a56a0f6..00000000
--- a/versioned_docs/version-0.22.0/how-to/react/_common/metadata/updating.mdx
+++ /dev/null
@@ -1,5 +0,0 @@
-## Updating metadata during connection
-
-Once you've joined the room, you can update your peer metadata with `updatePeerMetadata` from `useUpdatePeerMetadata`:
-
-{props.children}
diff --git a/versioned_docs/version-0.22.0/how-to/react/connecting.mdx b/versioned_docs/version-0.22.0/how-to/react/connecting.mdx
deleted file mode 100644
index ca3f0d28..00000000
--- a/versioned_docs/version-0.22.0/how-to/react/connecting.mdx
+++ /dev/null
@@ -1,52 +0,0 @@
----
-sidebar_position: 2
----
-
-# Connecting
-
-## Prerequisites
-
-In order to connect, you need to obtain a **Peer Token** to authorize the peer in your room.
-You can get the token using the [Sandbox API](../../how-to/features/sandbox-api-testing) if you're using the Sandbox environment, or implement your own backend service that will provide the user with a **Peer Token**.
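-
-If you go with your own backend, the client-side part can be a plain HTTP call. The endpoint, request body, and response shape in this sketch are hypothetical and depend entirely on your service:
-
-```tsx
-// Minimal sketch of fetching a Peer Token from your own backend.
-// The URL and payload are placeholders for whatever your service expects.
-export async function fetchPeerToken(roomName: string, username: string): Promise<string> {
-  const response = await fetch("https://your-backend.example.com/api/peer-token", {
-    method: "POST",
-    headers: { "Content-Type": "application/json" },
-    body: JSON.stringify({ roomName, username }),
-  });
-
-  if (!response.ok) {
-    throw new Error(`Failed to fetch peer token: ${response.status}`);
-  }
-
-  const { peerToken } = await response.json();
-  return peerToken;
-}
-```
-
-A token obtained this way is passed to `joinRoom` exactly like the sandbox token shown below.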
-
-## Connecting
-
-Use the [`useConnection`](../../api/web/functions/useConnection) hook to get
-the [`joinRoom`](../../api/web/functions/useConnection#joinroom) function.
-
-```tsx
-const PEER_TOKEN = "some-peer-token";
-// ---cut-before---
-import { useConnection, useSandbox } from "@fishjam-cloud/react-client";
-import React, { useCallback } from "react";
-
-export function JoinRoomButton() {
-  const { joinRoom } = useConnection(); // [!code highlight]
-  // get the peer token from sandbox or your backend
-  const { getSandboxPeerToken } = useSandbox();
-
-  const onJoinRoomPress = useCallback(async () => {
-    // [!code highlight:5]
-    const peerToken = await getSandboxPeerToken("Room", "User");
-    await joinRoom({ peerToken });
-  }, [joinRoom]);
-
-  return <button onClick={onJoinRoomPress}>Join room</button>;
-}
-```
-
-## Disconnecting
-
-To close the connection, use the [`leaveRoom`](../../api/web/functions/useConnection#leaveroom) method
-from the [`useConnection`](../../api/web/functions/useConnection) hook.
-
-```tsx
-import { useConnection } from "@fishjam-cloud/react-client";
-import React, { useCallback } from "react";
-
-export function LeaveRoomButton() {
-  const { leaveRoom } = useConnection(); // [!code highlight]
-
-  return <button onClick={leaveRoom}>Leave room</button>;
-}
-```
diff --git a/versioned_docs/version-0.22.0/how-to/react/installation.mdx b/versioned_docs/version-0.22.0/how-to/react/installation.mdx
deleted file mode 100644
index ce903bb7..00000000
--- a/versioned_docs/version-0.22.0/how-to/react/installation.mdx
+++ /dev/null
@@ -1,49 +0,0 @@
----
-sidebar_position: 1
----
-
-import Tabs from "@theme/Tabs";
-import TabItem from "@theme/TabItem";
-
-# Installation
-
-## 1. Install the package
-
-```bash npm2yarn
-npm install @fishjam-cloud/react-client
-```
-
-## 2. Setup Fishjam context
-
-Wrap your app in our [`FishjamProvider`](../../api/web/functions/FishjamProvider) component. Get your Fishjam ID from [Fishjam Dashboard](https://fishjam.io/app) and pass it to the provider.
-
-```tsx
-const App = () => {
-  return <div>Hello world</div>;
-};
-
-// ---cut---
-import React from "react";
-import ReactDOM from "react-dom/client";
-// import App from "./App";
-import { FishjamProvider } from "@fishjam-cloud/react-client";
-
-// Check https://fishjam.io/app/ for your Fishjam ID
-const FISHJAM_ID = "your-fishjam-id";
-
-ReactDOM.createRoot(document.getElementById("root")!).render(
- // [!code highlight:5]
-  <React.StrictMode>
-    <FishjamProvider fishjamId={FISHJAM_ID}>
-      <App />
-    </FishjamProvider>
-  </React.StrictMode>,
-);
-```
-
-:::tip
-
-It's possible to have multiple independent Fishjam contexts in one app.
-Just render multiple [`FishjamProvider`](../../api/web/functions/FishjamProvider) components and make sure they don't overlap.
-
-:::
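-
-For example, a sketch with two side-by-side providers might look like this (both Fishjam IDs are placeholders, and the `fishjamId` prop follows the setup example above):
-
-```tsx
-import React from "react";
-import { FishjamProvider } from "@fishjam-cloud/react-client";
-
-// Placeholder IDs; each provider owns a fully independent Fishjam context
-const MAIN_FISHJAM_ID = "your-fishjam-id";
-const OTHER_FISHJAM_ID = "your-other-fishjam-id";
-
-const MainRoom = () => <div>Main room</div>;
-const OtherRoom = () => <div>Other room</div>;
-
-export function TwoRoomsApp() {
-  return (
-    <>
-      <FishjamProvider fishjamId={MAIN_FISHJAM_ID}>
-        {/* Hooks rendered here use the first context */}
-        <MainRoom />
-      </FishjamProvider>
-      <FishjamProvider fishjamId={OTHER_FISHJAM_ID}>
-        {/* Hooks rendered here use the second context */}
-        <OtherRoom />
-      </FishjamProvider>
-    </>
-  );
-}
-```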
diff --git a/versioned_docs/version-0.22.0/how-to/react/list-other-peers.mdx b/versioned_docs/version-0.22.0/how-to/react/list-other-peers.mdx
deleted file mode 100644
index ad5536df..00000000
--- a/versioned_docs/version-0.22.0/how-to/react/list-other-peers.mdx
+++ /dev/null
@@ -1,38 +0,0 @@
----
-sidebar_position: 5
----
-
-# Display media of other peers
-
-To access data and media of other peers, use the [`usePeers`](../../api/web/functions/usePeers) hook.
-It returns two properties, [`remotePeers`](../../api/web/functions/usePeers) and [`localPeer`](../../api/web/functions/usePeers).
-They contain all the tracks of other peers and all the tracks of the local user, respectively.
-
-### Example of playing other peers' available media
-
-```tsx
-import React, { FC } from "react";
-
-const VideoRenderer: FC<{ stream?: MediaStream | null }> = (_) => <video />;
-
-const AudioPlayer: FC<{ stream?: MediaStream | null }> = (_) => <audio />;
-
-// ---cut---
-import { usePeers } from "@fishjam-cloud/react-client";
-
-export function Component() {
- const { remotePeers } = usePeers();
-
-  return (
-    <>
-      {remotePeers.map(({ id, cameraTrack, microphoneTrack }) => (
-        // remember to import your VideoRenderer component
-        <div key={id}>
-          <VideoRenderer stream={cameraTrack?.stream} />
-          <AudioPlayer stream={microphoneTrack?.stream} />
-        </div>
-      ))}
-    </>
-  );
-}
-```
diff --git a/versioned_docs/version-0.22.0/how-to/react/managing-devices.mdx b/versioned_docs/version-0.22.0/how-to/react/managing-devices.mdx
deleted file mode 100644
index 38e0a6aa..00000000
--- a/versioned_docs/version-0.22.0/how-to/react/managing-devices.mdx
+++ /dev/null
@@ -1,90 +0,0 @@
----
-sidebar_position: 4
----
-
-# Managing devices
-
-The Fishjam SDK provides functions for dynamically controlling media device streams. This includes selecting desired cameras and microphones, turning them on and off, as well as muting and unmuting microphones.
-
-### Selecting Camera and Microphone - [`selectCamera()`](../../api/web/functions/useCamera#selectcamera) and [`selectMicrophone()`](../../api/web/functions/useMicrophone#selectmicrophone)
-
-To select the desired camera or microphone, use the [`selectCamera()`](../../api/web/functions/useCamera#selectcamera) and [`selectMicrophone()`](../../api/web/functions/useMicrophone#selectmicrophone) functions.
-Lists of the available devices are exposed via the [`cameraDevices`](../../api/web/functions/useCamera#cameradevices) and [`microphoneDevices`](../../api/web/functions/useMicrophone#microphonedevices) properties.
-
-#### Usage Example
-
-```tsx
-import React from "react";
-import { useCamera } from "@fishjam-cloud/react-client";
-
-export function CameraControl() {
- const { cameraDevices, selectCamera } = useCamera();
-
-  return (
-    <div>
-      {cameraDevices.map(({ deviceId, label }) => (
-        <button key={deviceId} onClick={() => selectCamera(deviceId)}>
-          {label}
-        </button>
-      ))}
-    </div>
-  );
-}
-```
-
-### Turning Camera On and Off - [`toggleCamera()`](../../api/web/functions/useCamera#togglecamera)
-
-This function controls the physical operational state of the camera.
-
-- **Turning the camera off**: This action stops the camera device, disables the media stream, and pauses streaming. The webcam indicator light will shut down.
-- **Turning the camera on**: This action starts the camera and resumes streaming, allowing other participants to see video after a brief initialization period.
-
-#### Usage Example
-
-```tsx
-import React from "react";
-import { useCamera } from "@fishjam-cloud/react-client";
-
-export function CameraControl() {
- const { toggleCamera } = useCamera();
-
-  return <button onClick={toggleCamera}>Toggle camera</button>;
-}
-```
-
-### Turning Microphone On and Off - [`toggleMicrophone()`](../../api/web/functions/useMicrophone#togglemicrophone)
-
-This function toggles the microphone's physical operational state. The function interacts with a physical device, so it might take a noticeable amount of time.
-
-- **Turning the microphone off**: Turns the microphone off, disables the media stream, and pauses any audio transmission.
-- **Turning the microphone on**: Turns the microphone on and resumes audio streaming.
-
-### Muting and Unmuting Microphone - [`toggleMicrophoneMute()`](../../api/web/functions/useMicrophone#togglemicrophonemute)
-
-This function manages the audio stream's operational status without affecting the microphone's hardware state.
-Muting and unmuting are faster than toggling the device, but a muted device still uses resources. This trade-off is useful because muting and unmuting are common during a meeting, and unmuting needs to be quick to capture the first word of a sentence.
-
-- **Muting the microphone**: This action disables the media stream and stops audio transmission while keeping the microphone active.
-- **Unmuting the microphone**: This action enables the media stream, allowing immediate transmission of sounds.
-
-#### Usage Example
-
-```tsx
-import React from "react";
-import { useMicrophone } from "@fishjam-cloud/react-client";
-
-export function MicrophoneControl() {
- const { toggleMicrophone, toggleMicrophoneMute } = useMicrophone();
-
- return (
-