# Local File: .\eula.md

---
title: Software Development Kit License Agreement
description: Detailed legal terms for SDK usage, distribution rights, and licensing conditions. Covers developer workstation limits, commercial application deployment, intellectual property protection, and technical support terms.
sidebar_label: End User License Agreement
route: /docs/eula/
---

# END USER LICENSE AGREEMENT

**VISIOFORGE SOFTWARE DEVELOPMENT KITS AND RELATED PRODUCTS**

**IMPORTANT – READ CAREFULLY BEFORE INSTALLING OR USING THIS SOFTWARE**

## 1. PREAMBLE

This End User License Agreement ("Agreement" or "EULA") constitutes a legally binding contract between you (either an individual or a single entity) and VisioForge ("Licensor") regarding your use of VisioForge's proprietary software development kits and related documentation (collectively, the "Software"). By installing, copying, downloading, accessing, or otherwise using the Software, you agree to be bound by the terms of this Agreement. If you do not agree to the terms of this Agreement, do not install, access, or use the Software.

## 2. LICENSED PRODUCTS

This Agreement applies to all software development kits and related products developed and distributed by VisioForge, including but not limited to:

- Video Capture SDK .Net
- Media Player SDK .Net
- Video Edit SDK .Net
- Video Edit SDK FFMPEG .Net
- Media Blocks SDK .Net
- All-in-One Media Framework (Delphi/ActiveX)
- Virtual Camera SDK
- FFMPEG Source DirectShow filter
- VLC Source DirectShow filter
- Encoding Filters Pack
- Processing Filters Pack
- Video Encryption SDK
- Video Fingerprinting SDK

## 3. LICENSE GRANT

Subject to the terms and conditions of this Agreement and upon payment of the applicable license fees, Licensor grants you a limited, non-exclusive, non-transferable license to use the Software as follows:

### 3.1. Developer License Rights

**3.1.1. One-Year Developer License**

- Permits installation and use of the Software on up to three (3) developer workstations by a single developer
- Valid for one calendar year from the date of purchase
- Includes access to all updates and technical support during the license period
- After expiration, you may continue using the latest version available during your license period, but without updates or support
- License may be renewed at any time
- License is not transferable to another company but may be reassigned to another developer within the same company

**3.1.2. Lifetime/Team License**

- Permits installation and use of the Software on unlimited developer workstations at a single physical location
- Valid in perpetuity without renewal requirements
- Includes updates and technical support for the first year after purchase
- Optional support and update subscription available after the first year
- License is not transferable to another company

### 3.2. Distribution Rights

- You may incorporate the Software into your own commercial applications and distribute such applications without royalty payments
- End users of your applications are not required to purchase separate licenses
- Distribution rights apply to both the One-Year Developer License and the Lifetime/Team License

### 3.3. Evaluation License

- You may evaluate the Software for a period of thirty (30) calendar days
- During the evaluation period, you may use the Software solely for evaluation and testing purposes
- You may not use the evaluation version of the Software to develop commercial applications or products
- After the evaluation period, you must either purchase a license or discontinue use of the Software

## 4. LICENSE RESTRICTIONS

Except as expressly permitted in this Agreement, you may not:

**4.1.** Modify, translate, reverse engineer, decompile, disassemble, or create derivative works based on the Software

**4.2.** Copy the Software except as expressly permitted in this Agreement

**4.3.** Rent, lease, loan, sell, sublicense, distribute, transfer, publish, or make available the Software

**4.4.** Remove, alter, or obscure any proprietary notices on the Software

**4.5.** Use the Software to develop applications that compete directly with the Software

**4.6.** Transfer your license rights to any third party

**4.7.** Distribute the source code or components of the Software independently

**4.8.** Use the Software in any manner that violates applicable laws or regulations

## 5. OWNERSHIP AND INTELLECTUAL PROPERTY

**5.1.** The Software is licensed, not sold. This Agreement only gives you limited rights to use the Software. Licensor reserves all rights not expressly granted to you in this Agreement.

**5.2.** The Software is protected by copyright laws and international copyright treaties, as well as other intellectual property laws and treaties. All title, ownership rights, and intellectual property rights in and to the Software shall remain with Licensor.

**5.3.** You acknowledge that no title to the intellectual property in the Software is transferred to you. You further acknowledge that title and full ownership rights to the Software will remain the exclusive property of Licensor and you will not acquire any rights to the Software except as expressly set forth in this Agreement.

## 6. TECHNICAL SUPPORT AND UPDATES

**6.1.** Technical support is provided to licensed users as specified in the license type purchased.

**6.2.** Minor version updates (e.g., from version 1.1 to 1.2) are provided free of charge to all licensed users during their license period.
**6.3.** Major version upgrades (e.g., from version 1.x to 2.0) may require additional payment, though at a reduced price for existing customers.

**6.4.** Licensor has no obligation to provide support for evaluation versions of the Software.

## 7. TERMINATION

**7.1.** Without prejudice to any other rights, Licensor may terminate this Agreement if you fail to comply with the terms and conditions of this Agreement.

**7.2.** Upon termination:

- Your license rights under this Agreement will terminate
- You must cease all use of the Software
- You must destroy all copies, full or partial, of the Software
- You must, upon request, provide Licensor with written certification of such destruction

## 8. WARRANTIES AND DISCLAIMER

**8.1.** THE SOFTWARE IS PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, OR NONINFRINGEMENT.

**8.2.** LICENSOR DOES NOT WARRANT THAT THE SOFTWARE WILL MEET YOUR REQUIREMENTS OR THAT THE OPERATION OF THE SOFTWARE WILL BE UNINTERRUPTED OR ERROR-FREE.

**8.3.** THE ENTIRE RISK ARISING OUT OF THE USE OR PERFORMANCE OF THE SOFTWARE REMAINS WITH YOU.

## 9. LIMITATION OF LIABILITY

**9.1.** IN NO EVENT SHALL LICENSOR OR ITS SUPPLIERS BE LIABLE FOR ANY SPECIAL, INCIDENTAL, INDIRECT, OR CONSEQUENTIAL DAMAGES WHATSOEVER (INCLUDING, WITHOUT LIMITATION, DAMAGES FOR LOSS OF BUSINESS PROFITS, BUSINESS INTERRUPTION, LOSS OF BUSINESS INFORMATION, OR ANY OTHER PECUNIARY LOSS) ARISING OUT OF THE USE OF OR INABILITY TO USE THE SOFTWARE, EVEN IF LICENSOR HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.

**9.2.** IN ANY CASE, LICENSOR'S ENTIRE LIABILITY UNDER ANY PROVISION OF THIS AGREEMENT SHALL BE LIMITED TO THE AMOUNT ACTUALLY PAID BY YOU FOR THE SOFTWARE.

## 10. EXPORT REGULATIONS

The Software may be subject to export or import regulations. You agree to comply with all international and national laws that apply to the Software.

## 11. GOVERNING LAW AND JURISDICTION

This Agreement shall be governed by and construed in accordance with the laws of the jurisdiction in which Licensor is located, without giving effect to any principles of conflicts of law. You agree that any legal action arising out of or relating to this Agreement shall be filed exclusively in the competent courts of the jurisdiction in which Licensor is located.

## 12. SEVERABILITY

If any provision of this Agreement is held to be unenforceable or invalid, such provision shall be reformed only to the extent necessary to make it enforceable or valid, and the remaining provisions of this Agreement shall not be affected.

## 13. ENTIRE AGREEMENT

This Agreement constitutes the entire agreement between you and Licensor regarding the subject matter hereof and supersedes all prior or contemporaneous understandings regarding such subject matter. No amendment to or modification of this Agreement will be binding unless in writing and signed by Licensor.

## 14. CONTACT INFORMATION

If you have any questions about this Agreement, please contact Licensor at the contact information provided on the official VisioForge website.

---

© VisioForge. All rights reserved. Last Updated: 2025-04-27

---END OF PAGE---

# Local File: .\index.md

---
title: VisioForge SDK Documentation Hub
description: Help and tutorials for all VisioForge products
sidebar_label: Main
---

# VisioForge SDK Documentation Hub

:::sample
Documentation, samples and tutorials
:::

---

## Platform Support

| ![Cross-platform|100x100](/static/crossplatform.svg "Cross-platform") | ![.NET|100x100](/static/dotnet.svg ".NET") | ![GPU|100x100](/static/gpu.svg "GPU acceleration") |
|:---------------------------------------------------------------------:|:------------------------------------------:|:--------------------------------------------------:|
| **Cross-platform** | **.NET Versions** | **GPU acceleration** |
| Windows, macOS, iOS, Android, and Linux (including Nvidia Jetson) are supported. | VisioForge SDKs are available for .Net Framework 4.6.1, .Net Core 3.1, .Net 5, and later. | Hardware GPU-accelerated video encoding, decoding, and processing are available. |

Welcome to the VisioForge SDK Documentation Hub, your comprehensive resource for mastering our professional .NET video processing solutions. Our SDK suite empowers developers to build robust applications with advanced video capture, playback, editing, and processing capabilities.

## Our SDK Product Line

### [Video Capture SDK .NET](https://www.visioforge.com/video-capture-sdk-net)

The Video Capture SDK enables developers to implement high-performance video capture functionality from various sources:

- Local webcams and video devices
- IP cameras (RTSP, ONVIF, MJPEG)
- Blackmagic Design DeckLink devices
- Screen capture and desktop recording

The SDK supports multiple video formats, resolutions, and compression options, allowing for flexible implementation in surveillance systems, video conferencing applications, broadcasting software, and more.

### [Media Player SDK .NET](https://www.visioforge.com/media-player-sdk-net)

Our Media Player SDK provides a comprehensive solution for video playback in .NET applications:

- Support for a wide range of video formats (MP4, AVI, MOV, MKV, etc.)
- Hardware-accelerated decoding for optimal performance
- Advanced playback controls (speed, looping, seeking)
- Frame-by-frame navigation
- Audio visualization and control
- Custom overlay capabilities

This SDK is ideal for creating media players, video analysis tools, and content management applications.
### [Video Edit SDK .NET](https://www.visioforge.com/video-edit-sdk-net)

The Video Edit SDK offers powerful video editing and processing capabilities:

- Video trimming, splitting, and concatenation
- Overlay text, images, and watermarks
- Apply visual effects and filters
- Audio mixing and editing
- Format conversion and transcoding
- Frame extraction and manipulation

Developers can use this SDK to build video editing applications, format converters, content creation tools, and automated video processing systems.

### [Media Blocks SDK .NET](https://www.visioforge.com/media-blocks-sdk-net)

Media Blocks SDK is our advanced solution for building complex media processing workflows:

- Component-based architecture for flexible pipeline creation
- Real-time video and audio processing
- Streaming capability (RTMP, HLS, MPEG-DASH)
- Advanced encoding and decoding options
- Integration with other VisioForge SDKs

This SDK is designed for complex applications requiring sophisticated media processing chains, such as broadcast systems, streaming platforms, and media analytics solutions.

## Getting Started

Each SDK includes comprehensive documentation to help you get started quickly:

1. **Installation Guide**: Step-by-step instructions for adding our SDK to your project
2. **Quick Start Tutorials**: Basic implementation examples to get your first application running
3. **API Reference**: Detailed documentation of all classes, methods, and properties
4. **Code Samples**: Practical examples demonstrating key features and common use cases
5. **Troubleshooting**: Solutions to common issues and optimization tips

## System Requirements

Our SDKs are designed for .NET environments:

- .NET Framework 4.5.2 or higher
- .NET Core 3.1 or higher
- .NET 5.0+
- Windows 10/11, Windows Server
- macOS 10.14 or higher
- iOS 15 or higher
- Android 8.0 or higher
- Linux (Ubuntu, Debian, CentOS, Fedora, Raspbian)
- Nvidia Jetson (Linux)

## Technical Support

We're committed to your success with our SDKs. If you encounter any issues or have questions:

- Contact our [Technical Support Team](https://support.visioforge.com/) for direct assistance
- Review our [Sample Projects](https://github.com/visioforge/.Net-SDK-s-samples/) for implementation examples
- Visit our [Discord chat](https://discord.gg/yvXUG56WCH)

## Licensing Options

We offer flexible licensing options to meet your development needs:

- Developer licenses
- Royalty-free distribution
- Site licenses for enterprise environments
- Custom licensing solutions for specific requirements
- Free trial versions for evaluation
- Lifetime updates and support

Visit our [Licensing Page](https://www.visioforge.com/buy) for detailed information and pricing.

---

Thank you for choosing VisioForge for your media processing needs. We're dedicated to providing powerful, reliable tools that help you create exceptional video applications.

---END OF PAGE---

# Local File: .\licensing.md

---
title: VisioForge SDKs Licensing Options
description: Complete guide to VisioForge SDKs licensing options for developers - understand developer licenses, team licensing, deployment rights, and renewal options for all VisioForge video and media products.
sidebar_label: Licensing
order: 1
---

# VisioForge SDKs Licensing Guide

## Introduction to Our Licensing Model

Our licensing model is designed with developers in mind, offering a straightforward and cost-effective approach for integrating powerful video and media capabilities into your applications.
Each license provides commercial usage rights with no hidden costs or complicated terms, ensuring you can focus on building great software rather than navigating complex licensing structures. For complete legal details, please refer to our [End User License Agreement (EULA)](eula.md).

## Developer-Based Licensing Framework

We operate on a per-developer licensing model rather than charging per end-user or per deployment. This approach offers several key advantages:

- **Royalty-free deployment**: Once licensed, you can distribute your applications to unlimited end-users without additional fees
- **No runtime licensing**: End-users never need to purchase or manage separate licenses
- **Predictable costs**: Your development team knows the exact licensing cost upfront with no surprise fees

This model is particularly beneficial for commercial applications where you need to distribute your software widely without tracking deployment counts or managing complex licensing servers.

## Understanding License Scope and Requirements

### Team Licensing Requirements

Each developer actively working with the SDK requires their own license when using the One-Year license option. This policy ensures fair usage while supporting our ongoing development efforts. Consider the following scenarios:

- A team of three developers working with the SDK would require three One-Year licenses
- If developers work in shifts (never concurrently), you may be able to share licenses
- Temporary or contract developers must also be covered by appropriate licensing
- For unlimited team usage, consider the Lifetime/Team license option

### Contractor and Agency Arrangements

For contractors and agencies developing on behalf of clients:

- Licenses should be purchased in the name of the contracting company
- The end client typically retains ownership of the license after project completion
- Clear documentation of license ownership should be established in project contracts

## License Types and Duration Options

We offer two primary license types to accommodate different development timelines and budget constraints:

### One-Year License

The one-year license provides a lower entry cost with comprehensive benefits:

- Valid for 12 months from purchase date
- Full access to all product updates released during the license period
- Complete technical support through our developer portal and support channels
- Documentation and sample code access
- License renewal options at the end of the term

When your one-year license expires, you may continue using the last version released during your active license period indefinitely. However, to access new features, bug fixes, and technical support, you'll need to renew your license.
### Lifetime/Team License

The Lifetime/Team license offers long-term value with perpetual usage rights for your entire development team:

- One-time payment with no renewal requirements
- Perpetual access to the product version purchased
- All updates released within the first 12 months included at no additional cost
- Technical support for the first 12 months
- Optional update and support subscription available after the first year
- Coverage for your entire development team without per-developer restrictions

The Lifetime/Team license is ideal for projects with longer development cycles, larger development teams, or when you want to eliminate recurring license costs from your budget.

## Support and Updates Policy

### Support Channels and Availability

During your active license period, you'll have access to:

- Email technical support with guaranteed response times
- Developer forum access
- Priority bug fixes for critical issues
- Implementation guidance and code review assistance

Our support team consists of the same engineers who build the SDK, ensuring you receive expert assistance from professionals who understand the product at its deepest level.

### Update Release Schedule and Access

We maintain a regular update schedule to improve the SDK:

- Major updates approximately every 6 months
- Minor updates and bug fixes released as needed, typically monthly
- Security patches prioritized and released as soon as available
- All updates documented with detailed release notes

Customers with active licenses or update subscriptions can download updates directly from our developer portal immediately upon release.

## License Administration

### License Activation and Management

Managing your licenses is simple through our online portal:

1. Purchase the appropriate number of licenses
2. Receive activation keys via email
3. Download the SDK from our developer portal
4. Apply the license key during installation or in your code
5. Track and manage licenses through your account dashboard

### License Transfers and Reassignment

Licenses can be reassigned within your organization as development needs change:

- Developer transitions can be handled by reassigning licenses
- Company acquisitions or mergers may qualify for license transfers (contact support)
- License keys are tied to your company account, not individual developers

## Deployment Rights and Redistribution

### Application Distribution

Your licensing rights include:

- Distribution of compiled applications to unlimited end-users
- Deployment across multiple environments (development, testing, production)
- Cloud, on-premises, and hybrid deployment options
- No additional runtime licensing requirements

### Source Code and Intellectual Property Protection

While you can distribute applications built with our SDK freely, certain restrictions apply:

- The SDK source code and components cannot be redistributed independently
- Your license does not grant rights to repackage or resell the SDK itself
- Your application's source code and intellectual property remain entirely yours

## Getting Started with Your License

Once you've acquired your license, you can immediately begin development:

1. Install the SDK using your license key
2. Access comprehensive documentation through our developer portal
3. Review sample applications for implementation guidance
4. Join our developer community for additional support

## Frequently Asked Questions

### Can I use one license for multiple projects?

Yes, a single developer license covers all projects by that developer using our SDK.

### What happens if my one-year license expires?

You can continue using the last version released during your active license period, but will lose access to updates and support until renewal.

### Do I need licenses for build servers or CI/CD pipelines?

No, build environment installations don't require separate licenses.

### How do license renewals work?
Renewal notices are sent 30 days before expiration, and renewing maintains continuity of updates and support.

---

For more implementation examples and code samples, visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples). For specific licensing questions not addressed here, please contact our sales team.

---END OF PAGE---

# Local File: .\README.md

# Website Content Crawler

This Python script crawls the VisioForge website (www.visioforge.com) and extracts text content from all pages, excluding those in the `/help/` directory. The extracted content is converted to markdown format and saved to a single text file for analysis or training purposes.

## Features

- **Web Crawling**: Automatically discovers and visits all pages on the website
- **Content Filtering**: Excludes pages in the `/help/` directory as requested
- **HTML to Markdown Conversion**: Converts HTML content to clean markdown format for better readability
- **Duplicate URL Handling**: Normalizes URLs by removing fragments (#anchors) to prevent duplicate crawling
- **Cross-Platform**: Works on Windows, macOS, and Linux
- **Respectful Crawling**: Includes delays between requests and proper User-Agent headers
- **Error Handling**: Gracefully handles connection errors and continues crawling

## Requirements

Install the required dependencies:

```bash
pip install -r requirements.txt
```

## Dependencies

- `requests`: For making HTTP requests
- `beautifulsoup4`: For HTML parsing and text extraction
- `lxml`: For efficient XML/HTML parsing
- `markdownify`: For converting HTML content to markdown format

## Usage

Simply run the script:

```bash
python llms-parser.py
```

The script will:

1. Start crawling from http://www.visioforge.com
2. Extract HTML content from each page
3. Convert HTML to markdown format
4. Follow links to discover new pages
5. Save all content to `llms-full.txt`

## Output

The script creates a file called `llms-full.txt` containing:

- Page URL as markdown heading
- Converted markdown content
- Page separators (`---END OF PAGE---`)

## Configuration

You can modify the script to:

- Change the target website by updating the `base_url` variable
- Adjust the crawling limit by changing the `page_count < 1000` condition
- Modify the delay between requests by changing the `time.sleep(1)` value
- Add additional URL filtering in the `is_valid_url()` function
- Customize markdown conversion options in the `extract_and_convert_to_markdown()` function
- Adjust URL normalization behavior in the `normalize_url()` function

## Example Output Format

```markdown
# Page: https://www.visioforge.com/video-capture-sdk-net

# Video Capture SDK dot Net/c# - record videos from camera

Our .Net SDK seamlessly integrates video capture...

## Key Features

- USB web cameras and other capture devices
- ONVIF IP cameras (PTZ and other APIs supported)
- JPEG/MJPEG, MPEG-4 and H.264 HTTP/RTSP/RTMP IP cameras

---END OF PAGE---

# Page: https://www.visioforge.com/media-player-sdk-net

# Media Player SDK .Net - video playback solution

The Media Player SDK .Net enables developers to seamlessly integrate...

---END OF PAGE---
```

## Notes

- The script respects robots.txt conventions by including proper headers
- It's currently set to crawl up to 1000 pages (increased from the initial 10-page test limit)
- Network errors are handled gracefully with continue statements
- Only pages from the same domain are crawled for security
- HTML content is converted to clean markdown format for better readability
- URLs with fragments (#anchors) are normalized to prevent duplicate crawling of the same page

---END OF PAGE---

# Local File: .\codebase\CustomSimpleCaptureDemo\readme.es.md

# Media Blocks SDK .Net - Simple Capture Demo (WPF)

This SDK sample shows how to create a simple video capture application using the VisioForge Media Blocks SDK .Net in a WPF environment. The application initializes a media pipeline to capture video and audio from system devices, render them in real time, and optionally encode and save the output to an MP4 file. It demonstrates device enumeration, video and audio source selection, real-time video and audio rendering, and file output capabilities. The sample includes error handling and UI elements for device and format selection, demonstrating an integrated approach to media capture and processing with VisioForge's technology.

## Features

- Video capture from webcams to MP4 files
- Video preview

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\CustomSimpleCaptureDemo\readme.md

# Media Blocks SDK .Net - Simple Capture Demo (WPF)

This SDK sample demonstrates how to build a simple video capture application using the VisioForge Media Blocks SDK .Net in a WPF environment.
The application initializes a media pipeline for capturing video and audio from system devices, rendering them in real time, and optionally encoding and saving the output to an MP4 file. It showcases device enumeration, video and audio source selection, real-time video and audio rendering, and file output capabilities. The sample includes error handling and UI elements for device and format selection, demonstrating an integrated approach to media capture and processing with VisioForge's technology.

## Features

- Capture video from webcams to MP4 file
- Video preview

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [SystemVideoSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemVideoSourceBlock/) - captures video from the webcam
- [SystemAudioSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemAudioSourceBlock/) - captures audio from the system
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- [TeeBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Special/TeeBlock/) - splits the media stream into two paths
- [H264EncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoEncoders/H264EncoderBlock/) - encodes the video stream using H264
- [AACEncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioEncoders/AACEncoderBlock/) - encodes the audio stream using AAC
- [MP4SinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/MP4SinkBlock/) - saves video to an MP4 file

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\readme.md

# VisioForge.DotNet.Core information

The Core package contains the core classes and methods for all VisioForge SDKs. Using this package you can record audio and video, play audio and video, edit video, and process media using various video and audio effects.

## Samples

Sample applications are available on [GitHub](https://github.com/visioforge/.Net-SDK-s-samples).

## Deployment

Several SDKs are available for deployment inside this package. Please check the [Help page](https://www.visioforge.com/help/docs/dotnet/) for more information.

## More information

- [Video Capture SDK .Net](https://www.visioforge.com/video-capture-sdk-net)
- [Media Player SDK .Net](https://www.visioforge.com/media-player-sdk-net)
- [Video Edit SDK .Net](https://www.visioforge.com/video-edit-sdk-net)
- [Media Blocks SDK .Net](https://www.visioforge.com/media-blocks-sdk-net)
- Read the [Changelog](https://github.com/visioforge/.Net-SDK-s-samples/blob/master/changelog.md) for the latest changes.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\LiveVideoCompositor\index.md

---
title: Live Video Compositor
description: VisioForge Media Blocks SDK .Net - Live Video Compositor
sidebar_label: Live Video Compositor
---

# Live Video Compositor

Live Video Compositor is a part of the [VisioForge Media Blocks SDK .Net](https://www.visioforge.com/media-blocks-sdk-net) that allows you to add sources and outputs to a pipeline, and remove them, in real time. This lets you create applications that handle multiple video and audio sources simultaneously. For example, the LVC allows you to start streaming to YouTube at just the right moment while simultaneously recording video to disk; using the LVC you can create an application similar to OBS Studio.

Each source and output has a unique identifier that can be used to add and remove it at runtime. Each source and output also has its own independent pipeline that can be started and stopped.
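The runtime workflow described above can be sketched in C#. Note that this is a hypothetical outline, not the actual API: the class names (`LiveVideoCompositor`, `LiveVideoCompositorSettings`, `LVCVideoInput`, `LVCVideoOutput`) come from this page, but the constructor parameters and the `Input_AddAsync`/`Output_AddAsync`/`Output_RemoveAsync` method names are illustrative assumptions. Consult the sample application linked below for real usage.

```csharp
// Hypothetical sketch only: method and constructor shapes are assumed
// for illustration, not taken from the SDK reference.

// 1. Create the compositor with a fixed output resolution and frame rate.
//    Sources with a different frame rate are converted automatically.
var settings = new LiveVideoCompositorSettings(1920, 1080, frameRate: 30);
var compositor = new LiveVideoCompositor(settings);

// 2. Add a video source (a MediaBlock, e.g. a webcam source).
//    Each input gets a unique identifier and its own pipeline.
var cameraInput = new LVCVideoInput(cameraSourceBlock /* MediaBlock */);
await compositor.Input_AddAsync(cameraInput);    // assumed method name

// 3. Add an output (a MediaBlock, e.g. an MP4 sink) and start it
//    independently of the main pipeline.
var fileOutput = new LVCVideoOutput(mp4SinkBlock /* MediaBlock */);
await compositor.Output_AddAsync(fileOutput);    // assumed method name

// 4. Later, stop and remove the output without disturbing other sources
//    or outputs - recording stops while streaming continues.
await compositor.Output_RemoveAsync(fileOutput); // assumed method name
```

The key design point, per the description above, is that each input and output owns an independent pipeline, which is what enables OBS-style start/stop of individual recordings and streams at runtime.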
## Features

- Supports multiple video and audio sources
- Supports multiple video and audio outputs
- Position and size control for video sources
- Transparency control for video sources
- Volume control for audio sources

## Classes

### LiveVideoCompositor

The LiveVideoCompositor is the main class that allows adding live sources and outputs to the pipeline and removing them. When creating it, you must specify the resolution and frame rate to use. All sources with a different frame rate will be automatically converted to the frame rate specified when the LVC was created.

### LiveVideoCompositorSettings

LiveVideoCompositorSettings allows you to set the video and audio parameters. It is also used to set the maximum number of sources and outputs.

The [LVCAudioInput](LVCAudioInput.md) and [LVCVideoInput](LVCVideoInput.md) classes are used to add sources. The [LVCAudioOutput](LVCAudioOutput.md), [LVCVideoOutput](LVCVideoOutput.md), [LVCVideoAudioOutput](LVCVideoAudioOutput.md), and [LVCVideoViewOutput](LVCVideoViewOutput.md) classes are used to add outputs.

## Sample code

[Sample application on GitHub](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Live%20Video%20Compositor%20Demo)

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\LiveVideoCompositor\LVCAudioInput.md

---
title: Live Video Compositor | LVCAudioInput
description: VisioForge Media Blocks SDK .Net - Live Video Compositor | LVCAudioInput
sidebar_label: LVCAudioInput
---

# LVC audio input

The LVCAudioInput class is used to add audio sources to the LVC pipeline. The class allows you to set the audio parameters and the volume of the audio source.

## Usage

When creating an LVCAudioInput object, you must specify the MediaBlock to be used as the audio data source.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\LiveVideoCompositor\LVCAudioOutput.md

---
title: Live Video Compositor | LVCAudioOutput
description: VisioForge Media Blocks SDK .Net - Live Video Compositor | LVCAudioOutput
sidebar_label: LVCAudioOutput
---

# LVC audio output

The LVCAudioOutput class is used to add audio outputs to the LVC pipeline. You can start and stop the output pipeline independently from the main pipeline.

## Usage

When creating an LVCAudioOutput object, you must specify the MediaBlock to be used as the audio data output.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\LiveVideoCompositor\LVCVideoAudioOutput.md

---
title: Live Video Compositor | LVCVideoAudioOutput
description: VisioForge Media Blocks SDK .Net - Live Video Compositor | LVCVideoAudioOutput
sidebar_label: LVCVideoAudioOutput
---

# LVC video/audio output

The LVCVideoAudioOutput class is used to add video+audio outputs to the LVC pipeline. You can start and stop the output pipeline independently from the main pipeline.

## Usage

When creating an LVCVideoAudioOutput object, you must specify the MediaBlock to be used as the video+audio data output.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\LiveVideoCompositor\LVCVideoInput.md

---
title: Live Video Compositor | LVCVideoInput
description: VisioForge Media Blocks SDK .Net - Live Video Compositor | LVCVideoInput
sidebar_label: LVCVideoInput
---

# LVC video input

The LVCVideoInput class is used to add video sources to the LVC pipeline. The class allows you to set the video parameters and the rectangle of the video source.

## Usage

When creating an LVCVideoInput object, you must specify the MediaBlock to be used as the video data source.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\LiveVideoCompositor\LVCVideoOutput.md

---
title: Live Video Compositor | LVCVideoOutput
description: VisioForge Media Blocks SDK .Net - Live Video Compositor | LVCVideoOutput
sidebar_label: LVCVideoOutput
---

# LVC video output

The LVCVideoOutput class is used to add video outputs to the LVC pipeline. You can start and stop the output pipeline independently from the main pipeline.

## Usage

When creating an LVCVideoOutput object, you must specify the MediaBlock to be used as the video data output.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\LiveVideoCompositor\LVCVideoViewOutput.md

---
title: Live Video Compositor | LVCVideoViewOutput
description: VisioForge Media Blocks SDK .Net - Live Video Compositor | LVCVideoViewOutput
sidebar_label: LVCVideoViewOutput
---

# LVC video view output

The LVCVideoViewOutput class is used to add a video view to the LVC pipeline.

## Usage

When creating an LVCVideoViewOutput object, you must specify the VideoView to be used.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\LiveVideoCompositorV2\LVCAudioInput.md

---
title: Live Video Compositor | LVCAudioInput
description: VisioForge Media Blocks SDK .Net - Live Video Compositor | LVCAudioInput
sidebar_label: LVCAudioInput
---

# LVC audio input

The LVCAudioInput class is used to add audio sources to the LVC pipeline. The class allows you to set the audio parameters and the volume of the audio source.

## Usage

When creating an LVCAudioInput object, you must specify the MediaBlock to be used as the audio data source.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\LiveVideoCompositorV2\LVCAudioOutput.md

---
title: Live Video Compositor | LVCAudioOutput
description: VisioForge Media Blocks SDK .Net - Live Video Compositor | LVCAudioOutput
sidebar_label: LVCAudioOutput
---

# LVC audio output

The LVCAudioOutput class is used to add audio outputs to the LVC pipeline. You can start and stop the output pipeline independently from the main pipeline.

## Usage

When creating an LVCAudioOutput object, you must specify the MediaBlock to be used as the audio data output.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\LiveVideoCompositorV2\LVCVideoAudioOutput.md

---
title: Live Video Compositor | LVCVideoAudioOutput
description: VisioForge Media Blocks SDK .Net - Live Video Compositor | LVCVideoAudioOutput
sidebar_label: LVCVideoAudioOutput
---

# LVC video/audio output

The LVCVideoAudioOutput class is used to add video+audio outputs to the LVC pipeline. You can start and stop the output pipeline independently from the main pipeline.

## Usage

When creating an LVCVideoAudioOutput object, you must specify the MediaBlock to be used as the video+audio data output.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\LiveVideoCompositorV2\LVCVideoInput.md

---
title: Live Video Compositor | LVCVideoInput
description: VisioForge Media Blocks SDK .Net - Live Video Compositor | LVCVideoInput
sidebar_label: LVCVideoInput
---

# LVC video input

The LVCVideoInput class is used to add video sources to the LVC pipeline. The class allows you to set the video parameters and the rectangle of the video source.

## Usage

When creating an LVCVideoInput object, you must specify the MediaBlock to be used as the video data source.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\LiveVideoCompositorV2\LVCVideoOutput.md

---
title: Live Video Compositor | LVCVideoOutput
description: VisioForge Media Blocks SDK .Net - Live Video Compositor | LVCVideoOutput
sidebar_label: LVCVideoOutput
---

# LVC video output

The LVCVideoOutput class is used to add video outputs to the LVC pipeline. You can start and stop the output pipeline independently from the main pipeline.

## Usage

When creating an LVCVideoOutput object, you must specify the MediaBlock to be used as the video data output.
---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\LiveVideoCompositorV2\LVCVideoViewOutput.md --- title: Live Video Compositor | LVCVideoViewOutput description: VisioForge Media Blocks SDK .Net - Live Video Compositor | LVCVideoViewOutput sidebar_label: LVCVideoViewOutput --- # LVC video view output The LVCVideoViewOutput class is used to add video view to the LVC pipeline. ## Usage When creating an LVCVideoViewOutput object, you must specify the VideoView to be used. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioEncoders\AACEncoderBlock.md --- title: AAC encoder block description: VisioForge Media Blocks SDK .Net - AAC encoder sidebar_label: AAC encoder --- # AAC encoder `AAC (Advanced Audio Coding)`: A lossy compression format known for its efficiency and superior sound quality compared to MP3, widely used in digital music and broadcasting. AAC encoder is used for encoding files in MP4, MKV, M4A and some other formats, as well as for network streaming using RTSP and HLS. Use the `AACEncoderSettings` class to set the parameters. ## Block info Name: AACEncoderBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | PCM/IEEE | 1 Output | AAC | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->AACEncoderBlock; AACEncoderBlock-->MP4SinkBlock; ``` ## Sample code ```cs var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var aacEncoderBlock = new AACEncoderBlock(new MFAACEncoderSettings() { Bitrate = 192 }); pipeline.Connect(fileSource.AudioOutput, aacEncoderBlock.Input); var m4aSinkBlock = new MP4SinkBlock(new MP4SinkSettings(@"output.m4a")); pipeline.Connect(aacEncoderBlock.Output, m4aSinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio)); await pipeline.StartAsync(); ``` ### Sample applications - [Audio Capture Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Audio%20Capture%20Demo) - [Simple Capture Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Simple%20Capture%20Demo) - [Screen Capture Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Screen%20Capture) ### Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioEncoders\ADPCMEncoderBlock.md --- title: ADPCM encoder block description: VisioForge Media Blocks SDK .Net - ADPCM encoder sidebar_label: ADPCM encoder --- # ADPCM encoder `ADPCM (Adaptive Differential Pulse Code Modulation)`: A method of encoding audio that reduces data size by predicting subsequent samples, widely used in video game audio and telephony. ADPCM encoder is used to encode an audio stream in ADPCM format. ## Block info Name: ADPCMEncoderBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | PCM | 1 Output | audio/x-adpcm | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->ADPCMEncoderBlock; ADPCMEncoderBlock-->WAVSinkBlock; ``` ## Sample code ```cs var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var adpcmEncoderBlock = new ADPCMEncoderBlock(); pipeline.Connect(fileSource.AudioOutput, adpcmEncoderBlock.Input); var wavSinkBlock = new WAVSinkBlock(@"output.wav"); pipeline.Connect(adpcmEncoderBlock.Output, wavSinkBlock.Input); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioEncoders\ALAWEncoderBlock.md --- title: ALAW/G.711 encoder block description: VisioForge Media Blocks SDK .Net - ALAW/G.711 encoder sidebar_label: ALAW/G.711 encoder --- # ALAW/G.711 encoder `ALAW`: A companding algorithm used primarily in voice communications, notably in telephone systems, that compresses audio data to reduce bandwidth usage while maintaining sound integrity. ALAW/G.711 encoder is used to encode audio streams in ALAW format. ## Block info Name: ALAWEncoderBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | PCM | 1 Output | audio/x-alaw | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->ALAWEncoderBlock; ALAWEncoderBlock-->WAVSinkBlock; ``` ## Sample code ```cs var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var alawEncoderBlock = new ALAWEncoderBlock(); pipeline.Connect(fileSource.AudioOutput, alawEncoderBlock.Input); var wavSinkBlock = new WAVSinkBlock(@"output.wav"); pipeline.Connect(alawEncoderBlock.Output, wavSinkBlock.Input); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioEncoders\FLACEncoderBlock.md --- title: FLAC encoder block description: VisioForge Media Blocks SDK .Net - FLAC encoder sidebar_label: FLAC encoder --- # FLAC encoder `FLAC (Free Lossless Audio Codec)`: An audio format that compresses without any loss of quality, ensuring perfect sound reproduction, commonly used by audiophiles. FLAC Encoder is used to encode files into FLAC, WebM, MKV and some other formats. Use the `FLACEncoderSettings` class to set the parameters. ## Block info Name: FLACEncoderBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | PCM/IEEE | 1 Output | FLAC | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->FLACEncoderBlock; FLACEncoderBlock-->FileSinkBlock; ``` ## Sample code ```cs var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var flacEncoderBlock = new FLACEncoderBlock(new FLACEncoderSettings()); pipeline.Connect(fileSource.AudioOutput, flacEncoderBlock.Input); var fileSinkBlock = new FileSinkBlock(@"output.flac"); pipeline.Connect(flacEncoderBlock.Output, fileSinkBlock.Input); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioEncoders\index.md --- title: Audio encoder blocks description: VisioForge Media Blocks SDK .Net - Audio encoder blocks sidebar_label: Audio encoders --- # Audio encoders blocks Audio encoding is the process of converting raw audio data into a compressed format. This process is essential for reducing the size of audio files, making them easier to store and stream over the internet. VisioForge Media Blocks SDK provides a wide range of audio encoders that support various formats and codecs. 
- [AAC encoder](AACEncoderBlock.md) - [ADPCM encoder](ADPCMEncoderBlock.md) - [ALAW encoder](ALAWEncoderBlock.md) - [FLAC encoder](FLACEncoderBlock.md) - [MP3 encoder](MP3EncoderBlock.md) - [OPUS encoder](OPUSEncoderBlock.md) - [Speex encoder](SpeexEncoderBlock.md) - [Vorbis encoder](VorbisEncoderBlock.md) - [WAV (PCM) encoder](WAVEncoderBlock.md) ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioEncoders\MP2EncoderBlock.md --- title: MP2 encoder block description: VisioForge Media Blocks SDK .Net - MP2 encoder sidebar_label: MP2 encoder --- # MP2 encoder `MP2 (MPEG-1 Audio Layer II)`: An older, lossy audio compression format that is less complex than MP3, commonly used in broadcasting and digital radio due to its robustness. An MP2 encoder can be used to encode audio streams in formats like MPEG-TS or VOB. Use the `MP2EncoderSettings` class to set the parameters. ## Block info Name: MP2EncoderBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | S16LE | 1 Output | audio/mpeg | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->MP2EncoderBlock; UniversalSourceBlock-->MPEG2EncoderBlock; MP2EncoderBlock-->MPEGTSSinkBlock; MPEG2EncoderBlock-->MPEGTSSinkBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var audioEncoderBlock = new MP2EncoderBlock(new MP2EncoderSettings()); pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input); var h264EncoderBlock = new H264EncoderBlock(new OpenH264EncoderSettings()); pipeline.Connect(fileSource.VideoOutput, h264EncoderBlock.Input); var tsSinkBlock = new MPEGTSSinkBlock(new MPEGTSSinkSettings(@"output.ts")); pipeline.Connect(h264EncoderBlock.Output, tsSinkBlock.CreateNewInput(MediaBlockPadMediaType.Video)); pipeline.Connect(audioEncoderBlock.Output, 
tsSinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio)); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioEncoders\MP3EncoderBlock.md --- title: MP3 encoder block description: VisioForge Media Blocks SDK .Net - MP3 encoder sidebar_label: MP3 encoder --- # MP3 encoder `MP3 (MPEG Audio Layer III)`: A popular lossy audio format that revolutionized digital music distribution by compressing files while retaining a reasonable sound quality. An MP3 encoder can convert audio stream into MP3 files or embed MP3 audio streams in formats like AVI, MKV, and others. Use the `MP3EncoderSettings` class to set the parameters. ## Block info Name: MP3EncoderBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | PCM/IEEE | 1 Output | audio/mpeg | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->MP3EncoderBlock; MP3EncoderBlock-->FileSinkBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var mp3EncoderBlock = new MP3EncoderBlock(new MP3EncoderSettings() { Bitrate = 192 }); pipeline.Connect(fileSource.AudioOutput, mp3EncoderBlock.Input); var fileSinkBlock = new FileSinkBlock(@"output.mp3"); pipeline.Connect(mp3EncoderBlock.Output, fileSinkBlock.Input); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioEncoders\OPUSEncoderBlock.md --- title: OPUS encoder block description: VisioForge Media Blocks SDK .Net - OPUS encoder sidebar_label: OPUS encoder --- # OPUS encoder `OPUS`: A versatile audio codec optimized for both music and speech, known for its low latency and adaptability to different network environments, ideal for real-time applications. 
The OPUS encoder is used to encode files in OGG and some other formats. Use the `OPUSEncoderSettings` class to set the parameters. ## Block info Name: OPUSEncoderBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | PCM/IEEE | 1 Output | audio/x-opus | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->OPUSEncoderBlock; OPUSEncoderBlock-->OGGSinkBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var opusEncoderBlock = new OPUSEncoderBlock(new OPUSEncoderSettings()); pipeline.Connect(fileSource.AudioOutput, opusEncoderBlock.Input); var oggSinkBlock = new OGGSinkBlock(new OGGSinkSettings(@"output.ogg")); pipeline.Connect(opusEncoderBlock.Output, oggSinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio)); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioEncoders\SpeexEncoderBlock.md --- title: Speex encoder block description: VisioForge Media Blocks SDK .Net - Speex encoder sidebar_label: Speex encoder --- # Speex encoder `Speex`: An open-source, patent-free audio codec designed specifically for compressing voice at low bit rates, commonly used in VoIP and audio streaming applications. Speex encoder is used to encode files in OGG, WebM, MKV, and some other formats. Use the `SpeexEncoderSettings` class to set the parameters. ## Block info Name: SpeexEncoderBlock.
Pin direction | Media type | Pins count --- | :---: | :---: Input | PCM/IEEE | 1 Output | audio/x-speex | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->SpeexEncoderBlock; SpeexEncoderBlock-->OGGSinkBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var speexEncoderBlock = new SpeexEncoderBlock(new SpeexEncoderSettings()); pipeline.Connect(fileSource.AudioOutput, speexEncoderBlock.Input); var oggSinkBlock = new OGGSinkBlock(new OGGSinkSettings(@"output.ogg")); pipeline.Connect(speexEncoderBlock.Output, oggSinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio)); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioEncoders\VorbisEncoderBlock.md --- title: Vorbis encoder block description: VisioForge Media Blocks SDK .Net - Vorbis encoder sidebar_label: Vorbis encoder --- # Vorbis encoder `Vorbis`: An open-source and royalty-free audio compression technology, part of the Ogg multimedia project, known for providing better sound quality than MP3 at lower bit rates. Vorbis encoder is used to encode files in OGG, WebM, MKV, and some other formats. Use the `VorbisEncoderSettings` class to set the parameters. ## Block info Name: VorbisEncoderBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | PCM/IEEE | 1 Output | audio/x-vorbis | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->VorbisEncoderBlock; VorbisEncoderBlock-->OGGSinkBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var vorbisEncoderBlock = new VorbisEncoderBlock(new VorbisEncoderSettings()); pipeline.Connect(fileSource.AudioOutput, vorbisEncoderBlock.Input); var oggSinkBlock = new OGGSinkBlock(new OGGSinkSettings(@"output.ogg")); pipeline.Connect(vorbisEncoderBlock.Output, oggSinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio)); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioEncoders\WAVEncoderBlock.md --- title: WAV encoder block description: VisioForge Media Blocks SDK .Net - WAV encoder sidebar_label: WAV encoder --- # WAV encoder `PCM (Pulse Code Modulation)`: The standard form of digital audio in computers and compact discs, where analog audio signals are converted into digital without compression, preserving high fidelity. WAV encoder block can produce a PCM stream with a specified format. Use the `WAVEncoderSettings` class to set the parameters. ## Block info Name: WAVEncoderBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | PCM | 1 Output | PCM | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->WAVEncoderBlock; WAVEncoderBlock-->WAVSinkBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var wavEncoderBlock = new WAVEncoderBlock(new WAVEncoderSettings()); pipeline.Connect(fileSource.AudioOutput, wavEncoderBlock.Input); var wavSinkBlock = new WAVSinkBlock(@"output.wav"); pipeline.Connect(wavEncoderBlock.Output, wavSinkBlock.Input); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioEncoders\WMAEncoderBlock.md --- title: WMA encoder block description: VisioForge Media Blocks SDK .Net - WMA encoder sidebar_label: WMA encoder --- # WMA encoder `WMA (Windows Media Audio)`: A series of audio codecs and their corresponding audio coding formats developed by Microsoft, known for its ability to compress at high ratios with good sound quality. The WMA encoder is used to encode files in ASF, WMA, and WMV formats. Use the `WMAEncoderSettings` class to set the parameters. ## Block info Name: WMAEncoderBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | F32LE | 1 Output | WMA (v1) | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->WMAEncoderBlock; WMAEncoderBlock-->ASFSinkBlock; ``` ## Sample code ```cs var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var audioEncoderBlock = new WMAEncoderBlock(new WMAEncoderSettings()); pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input); var sinkBlock = new ASFSinkBlock(new ASFSinkSettings(@"output.wma")); pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio)); await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioProcessing\AmplifyBlock.md --- title: Amplify effect block description: VisioForge Media Blocks SDK .Net - Amplify effect block sidebar_label: Amplify --- # Amplify Block amplifies an audio stream by an amplification factor. Several clipping modes are available. Use method and level values to configure. ## Block info Name: AmplifyBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->AmplifyBlock; AmplifyBlock-->AudioRendererBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var amplify = new AmplifyBlock(AmplifyClippingMethod.Normal, 2.0); pipeline.Connect(fileSource.AudioOutput, amplify.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(amplify.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. 
---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioProcessing\AudioBalanceBlock.md --- title: Balance effect block description: VisioForge Media Blocks SDK .Net - Balance effect block sidebar_label: Balance --- # Balance The Balance block sets the position in the stereo panorama. Use the balance value to configure. ## Block info Name: AudioBalanceBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->AudioBalanceBlock; AudioBalanceBlock-->AudioRendererBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var balance = new AudioBalanceBlock(1.0f); pipeline.Connect(fileSource.AudioOutput, balance.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(balance.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioProcessing\AudioMixerBlock.md --- title: Audio mixer effect block description: VisioForge Media Blocks SDK .Net - Audio mixer effect block sidebar_label: Audio mixer --- # Audio mixer The audio mixer block mixes multiple audio streams into one. Block mixes the streams regardless of their format, converting if necessary. All input streams will be synchronized. Use the `AudioMixerSettings` class to set the custom output format. ## Block info Name: AudioMixerBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 or more Output | Uncompressed audio | 1 ## The sample pipeline ```mermaid graph LR; VirtualAudioSourceBlock#1-->AudioMixerBlock; VirtualAudioSourceBlock#2-->AudioMixerBlock; AudioMixerBlock-->VorbisEncoderBlock; VorbisEncoderBlock-->OGGSinkBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var audioSource1Block = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); var audioSource2Block = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); var audioMixerBlock = new AudioMixerBlock(new AudioMixerSettings()); pipeline.Connect(audioSource1Block.Output, audioMixerBlock.CreateNewInput()); pipeline.Connect(audioSource2Block.Output, audioMixerBlock.CreateNewInput()); var vorbisEncoderBlock = new VorbisEncoderBlock(new VorbisEncoderSettings()); pipeline.Connect(audioMixerBlock.Output, vorbisEncoderBlock.Input); var oggSinkBlock = new OGGSinkBlock(new OGGSinkSettings(@"output.ogg")); pipeline.Connect(vorbisEncoderBlock.Output, oggSinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio)); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioProcessing\AudioSampleGrabberBlock.md --- title: Audio sample grabber block description: VisioForge Media Blocks SDK .Net - Audio sample grabber block sidebar_label: Audio sample grabber --- # Audio sample grabber The audio sample grabber calls an event for each audio frame. You can save or process the received audio frame. ## Block info Name: AudioSampleGrabberBlock.
Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->AudioSampleGrabberBlock; AudioSampleGrabberBlock-->AudioRendererBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var audioSG = new AudioSampleGrabberBlock(); audioSG.OnAudioFrameBuffer += AudioSG_OnAudioFrameBuffer; pipeline.Connect(fileSource.AudioOutput, audioSG.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(audioSG.Output, audioRenderer.Input); await pipeline.StartAsync(); private void AudioSG_OnAudioFrameBuffer(object sender, AudioFrameBufferEventArgs e) { // save or process the audio frame } ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioProcessing\CompressorExpanderBlock.md --- title: Compressor/Expander effect block description: VisioForge Media Blocks SDK .Net - Compressor/Expander effect block sidebar_label: Compressor/Expander --- # Compressor/Expander This block can work as a compressor or expander. ## Block info Name: CompressorExpanderBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->CompressorExpanderBlock; CompressorExpanderBlock-->AudioRendererBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var compressor = new CompressorExpanderBlock(); pipeline.Connect(fileSource.AudioOutput, compressor.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(compressor.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioProcessing\EchoBlock.md --- title: Echo effect block description: VisioForge Media Blocks SDK .Net - Echo effect block sidebar_label: Echo --- # Echo The echo block adds an echo to an audio stream. The echo delay, intensity, and percentage of feedback can be configured. Use Delay, Intensity and Feedback parameters to set the settings. ## Block info Name: EchoBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->EchoBlock; EchoBlock-->AudioRendererBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var echo = new EchoBlock(); pipeline.Connect(fileSource.AudioOutput, echo.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(echo.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. 
---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioProcessing\Equalizer10Block.md --- title: Equalizer (10 bands) effect block description: VisioForge Media Blocks SDK .Net - Equalizer (10 bands) effect block sidebar_label: Equalizer (10 bands) --- # Equalizer (10 bands) The 10-band equalizer block allows changing the gain of 10 frequency bands. The bands are equally distributed between 30 Hz and 15 KHz. Use the SetBand method to set a value for each band. ## Block info Name: Equalizer10Block. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->Equalizer10Block; Equalizer10Block-->AudioRendererBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var eq10 = new Equalizer10Block(); eq10.SetBand(0, -6.0f); eq10.SetBand(1, -6.0f); eq10.SetBand(2, -6.0f); eq10.SetBand(3, -6.0f); eq10.SetBand(4, -6.0f); eq10.SetBand(5, -8.0f); eq10.SetBand(6, -8.0f); eq10.SetBand(7, -8.0f); eq10.SetBand(8, -8.0f); eq10.SetBand(9, -8.0f); pipeline.Connect(fileSource.AudioOutput, eq10.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(eq10.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioProcessing\EqualizerParametricBlock.md --- title: Equalizer (Parametric) effect block description: VisioForge Media Blocks SDK .Net - Equalizer (Parametric) effect block sidebar_label: Equalizer (Parametric) --- # Equalizer (Parametric) The parametric equalizer block allows selection between 1 and 64 bands. You can change the center frequency, bandwidth and gain for each band. Use the SetNumBands method to set the number of bands. 
Use the SetState method to set each band state. ## Block info Name: EqualizerParametricBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->EqualizerParametricBlock; EqualizerParametricBlock-->AudioRendererBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var eq = new EqualizerParametricBlock(); eq.SetNumBands(5); eq.SetState(0, new ParametricEqualizerBand(120.0f, 50.0f, -3.0f)); eq.SetState(1, new ParametricEqualizerBand(500.0f, 20.0f, 12.0f)); eq.SetState(2, new ParametricEqualizerBand(1503.0f, 2.0f, -20.0f)); eq.SetState(3, new ParametricEqualizerBand(6000.0f, 1000.0f, 6.0f)); eq.SetState(4, new ParametricEqualizerBand(3000.0f, 120.0f, 2.0f)); pipeline.Connect(fileSource.AudioOutput, eq.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(eq.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. 
---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioProcessing\index.md --- title: Audio processing and effect blocks description: VisioForge Media Blocks SDK .Net - Audio processing and effect blocks sidebar_label: Audio processing and effects --- # Audio processing blocks - [Amplify](AmplifyBlock.md) - [Audio mixer](AudioMixerBlock.md) - [Audio sample grabber](AudioSampleGrabberBlock.md) - [Balance](AudioBalanceBlock.md) - [Compressor/Expander](CompressorExpanderBlock.md) - [Echo](EchoBlock.md) - [Equalizer (10 bands)](Equalizer10Block.md) - [Equalizer (Parametric)](EqualizerParametricBlock.md) - [Scale/Tempo](ScaleTempoBlock.md) - [Volume](VolumeBlock.md) - [VU Meter](VUMeterBlock.md) ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioProcessing\ScaleTempoBlock.md --- title: Scale/Tempo effect block description: VisioForge Media Blocks SDK .Net - Scale/Tempo effect block sidebar_label: Scale/Tempo --- # Scale/Tempo The block scales tempo while maintaining pitch. ## Block info Name: ScaleTempoBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->ScaleTempoBlock; ScaleTempoBlock-->AudioRendererBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var tempo = new ScaleTempoBlock(); tempo.Overlap = 0.2; tempo.Rate = 0.1; tempo.Search = TimeSpan.FromMilliseconds(14); tempo.Stride = TimeSpan.FromMilliseconds(30); pipeline.Connect(fileSource.AudioOutput, tempo.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(tempo.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. 
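For the common case of faster playback at the original pitch, only the Rate property usually needs to be set. A minimal sketch, assuming Rate is a tempo multiplier where 1.0 means the original speed:

```csharp
var pipeline = new MediaBlocksPipeline(false);

var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri("test.mp3")));

// Assumption: Rate is a tempo multiplier (1.0 = original speed).
var tempo = new ScaleTempoBlock();
tempo.Rate = 1.5; // play 50% faster while keeping the original pitch

pipeline.Connect(fileSource.AudioOutput, tempo.Input);

var audioRenderer = new AudioRendererBlock();
pipeline.Connect(tempo.Output, audioRenderer.Input);

await pipeline.StartAsync();
```

The Overlap, Search, and Stride properties shown in the sample above can usually be left at their defaults.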
---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioProcessing\VolumeBlock.md --- title: Volume effect block description: VisioForge Media Blocks SDK .Net - Volume effect block sidebar_label: Volume --- # Volume The volume block changes the volume of the audio data. Use the Level property to set the volume level. ## Block info Name: VolumeBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->VolumeBlock; VolumeBlock-->AudioRendererBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var volume = new VolumeBlock(); volume.Level = 2.0; pipeline.Connect(fileSource.AudioOutput, volume.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(volume.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioProcessing\VUMeterBlock.md --- title: VU meter block description: VisioForge Media Blocks SDK .Net - VU meter block sidebar_label: VU meter --- # VU meter The VU meter block processes the audio stream and provides audio level data that can be displayed in a UI element. ## Block info Name: VUMeterBlock.
Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->VUMeterBlock; VUMeterBlock-->AudioRendererBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var vumeter = new VUMeterBlock(); vumeter.OnAudioVUMeter += VUMeter_OnAudioVUMeter; pipeline.Connect(fileSource.AudioOutput, vumeter.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(vumeter.Output, audioRenderer.Input); await pipeline.StartAsync(); private void VUMeter_OnAudioVUMeter(object sender, VisioForge.Core.Types.X.VUMeterXEventArgs e) { } ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\AudioRendering\index.md --- title: Audio rendering block description: VisioForge Media Blocks SDK .Net - Audio rendering block sidebar_label: Audio rendering --- # Audio rendering ## AudioRendererBlock The Audio Renderer block is used to play the audio stream on the selected or default device. Volume and mute options are available. ### Block info Name: AudioRendererBlock. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input audio | uncompressed audio | 1 | ### Enumerate available devices Use the `AudioRendererBlock.GetDevices` method to get a list of available devices. ### The sample pipeline ```mermaid graph LR; SystemAudioSourceBlock-->AudioRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var audioSourceBlock = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(audioSourceBlock.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux, iOS, Android. 
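To render to a specific output device instead of the default one, the device list can be queried first. The sketch below assumes that `GetDevices` returns a list of device descriptors and that `AudioRendererBlock` has a constructor overload accepting one; check the API reference for the exact signatures.

```csharp
var pipeline = new MediaBlocksPipeline(false);

var audioSourceBlock = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings());

// Enumerate available output devices (see GetDevices above).
var devices = AudioRendererBlock.GetDevices();

// Assumption: a constructor overload accepting a device descriptor exists.
var audioRenderer = new AudioRendererBlock(devices[0]);

pipeline.Connect(audioSourceBlock.Output, audioRenderer.Input);

await pipeline.StartAsync();
```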
---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Bridge\BridgeAudioSinkBlock.md --- title: Bridge audio sink block description: VisioForge Media Blocks SDK .Net - Bridge audio sink block sidebar_label: Bridge audio sink --- # Bridge audio sink Bridges can be used to connect different media pipelines and use them independently. BridgeAudioSinkBlock is used to connect to BridgeAudioSourceBlock. Each bridge pair has a unique channel name. ## Block info Name: BridgeAudioSinkBlock. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input audio | uncompressed audio | 1 | ## Sample pipelines ### First pipeline with an audio source and a bridge audio sink ```mermaid graph LR; VirtualAudioSourceBlock-->BridgeAudioSinkBlock; ``` ### Second pipeline with a bridge audio source and an audio renderer ```mermaid graph LR; BridgeAudioSourceBlock-->AudioRendererBlock; ``` ## Sample code ```csharp // source pipeline with virtual audio source and bridge audio sink var sourcePipeline = new MediaBlocksPipeline(true); var audioSourceBlock = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); var bridgeAudioSink = new BridgeAudioSinkBlock(new BridgeAudioSinkSettings()); sourcePipeline.Connect(audioSourceBlock.Output, bridgeAudioSink.Input); await sourcePipeline.StartAsync(); // sink pipeline with bridge audio source and audio renderer var sinkPipeline = new MediaBlocksPipeline(true); var bridgeAudioSource = new BridgeAudioSourceBlock(new BridgeAudioSourceSettings()); var audioRenderer = new AudioRendererBlock(); sinkPipeline.Connect(bridgeAudioSource.Output, audioRenderer.Input); await sinkPipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. 
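The sample above uses default settings, which leaves the channel name implicit. To pair a specific sink with a specific source, both sides must use the same channel name. The sketch below assumes the settings classes expose the channel name as a property (shown here as a hypothetical `Name`); check the API reference for the actual member.

```csharp
// Assumption: the channel name is exposed as a settings property
// (a hypothetical "Name" is used here for illustration).
var sinkSettings = new BridgeAudioSinkSettings { Name = "audio-channel-1" };
var bridgeAudioSink = new BridgeAudioSinkBlock(sinkSettings);

// The matching source must use the same channel name.
var sourceSettings = new BridgeAudioSourceSettings { Name = "audio-channel-1" };
var bridgeAudioSource = new BridgeAudioSourceBlock(sourceSettings);
```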
---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Bridge\BridgeAudioSourceBlock.md --- title: Bridge audio source block description: VisioForge Media Blocks SDK .Net - Bridge audio source block sidebar_label: Bridge audio source --- # Bridge audio source Bridges can be used to connect different media pipelines and use them independently. BridgeAudioSourceBlock is used to connect to BridgeAudioSinkBlock. Each bridge pair has a unique channel name. ## Block info Name: BridgeAudioSourceBlock. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Output audio | uncompressed audio | 1 | ## Sample pipelines ### First pipeline with an audio source and a bridge audio sink ```mermaid graph LR; VirtualAudioSourceBlock-->BridgeAudioSinkBlock; ``` ### Second pipeline with a bridge audio source and an audio renderer ```mermaid graph LR; BridgeAudioSourceBlock-->AudioRendererBlock; ``` ## Sample code ```csharp // source pipeline with virtual audio source and bridge audio sink var sourcePipeline = new MediaBlocksPipeline(true); var audioSourceBlock = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); var bridgeAudioSink = new BridgeAudioSinkBlock(new BridgeAudioSinkSettings()); sourcePipeline.Connect(audioSourceBlock.Output, bridgeAudioSink.Input); await sourcePipeline.StartAsync(); // sink pipeline with bridge audio source and audio renderer var sinkPipeline = new MediaBlocksPipeline(true); var bridgeAudioSource = new BridgeAudioSourceBlock(new BridgeAudioSourceSettings()); var audioRenderer = new AudioRendererBlock(); sinkPipeline.Connect(bridgeAudioSource.Output, audioRenderer.Input); await sinkPipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. 
---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Bridge\BridgeSubtitleSinkBlock.md --- title: Bridge subtitle sink block description: VisioForge Media Blocks SDK .Net - Bridge subtitle sink block sidebar_label: Bridge subtitle sink --- # Bridge subtitle sink block Bridges can be used to connect different media pipelines and use them independently. BridgeSubtitleSinkBlock is used to connect to the BridgeSubtitleSourceBlock. Each bridge pair has a unique channel name. ## Block info Name: BridgeSubtitleSinkBlock. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input subtitle | text | 1 | ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Bridge\BridgeSubtitleSourceBlock.md --- title: Bridge subtitle source block description: VisioForge Media Blocks SDK .Net - Bridge subtitle source block sidebar_label: Bridge subtitle source --- # Bridge subtitle source Bridges can be used to connect different media pipelines and use them independently. BridgeSubtitleSourceBlock is used to connect to the BridgeSubtitleSinkBlock. Each bridge pair has a unique channel name. ## Block info Name: BridgeSubtitleSourceBlock. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Output subtitle | text | 1 | ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Bridge\BridgeVideoSinkBlock.md --- title: Bridge video sink block description: VisioForge Media Blocks SDK .Net - Bridge video sink block sidebar_label: Bridge video sink --- # Bridge video sink Bridges can be used to connect different media pipelines and use them independently. BridgeVideoSinkBlock is used to connect to the BridgeVideoSourceBlock. Each bridge pair has a unique channel name. ## Block info Name: BridgeVideoSinkBlock.
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | uncompressed video | 1 | ## Sample pipelines ### First pipeline with a video source and a bridge video sink ```mermaid graph LR; VirtualVideoSourceBlock-->BridgeVideoSinkBlock; ``` ### Second pipeline with a bridge video source and a video renderer ```mermaid graph LR; BridgeVideoSourceBlock-->VideoRendererBlock; ``` ## Sample code ```csharp // source pipeline with virtual video source and bridge video sink var sourcePipeline = new MediaBlocksPipeline(true); var videoSourceBlock = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings()); var bridgeVideoSink = new BridgeVideoSinkBlock(new BridgeVideoSinkSettings()); sourcePipeline.Connect(videoSourceBlock.Output, bridgeVideoSink.Input); await sourcePipeline.StartAsync(); // sink pipeline with bridge video source and video renderer var sinkPipeline = new MediaBlocksPipeline(true); var bridgeVideoSource = new BridgeVideoSourceBlock(new BridgeVideoSourceSettings()); var videoRenderer = new VideoRendererBlock(sinkPipeline, VideoView1); sinkPipeline.Connect(bridgeVideoSource.Output, videoRenderer.Input); await sinkPipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Bridge\BridgeVideoSourceBlock.md --- title: Bridge video source block description: VisioForge Media Blocks SDK .Net - Bridge video source block sidebar_label: Bridge video source --- # Bridge video source Bridges can be used to connect different media pipelines and use them independently. BridgeVideoSourceBlock is used to connect to the BridgeVideoSinkBlock. Each bridge pair has a unique channel name. ## Block info Name: BridgeVideoSourceBlock.
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Output video | uncompressed video | 1 | ## Sample pipelines ### First pipeline with a video source and a bridge video sink ```mermaid graph LR; VirtualVideoSourceBlock-->BridgeVideoSinkBlock; ``` ### Second pipeline with a bridge video source and a video renderer ```mermaid graph LR; BridgeVideoSourceBlock-->VideoRendererBlock; ``` ## Sample code ```csharp // source pipeline with virtual video source and bridge video sink var sourcePipeline = new MediaBlocksPipeline(true); var videoSourceBlock = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings()); var bridgeVideoSink = new BridgeVideoSinkBlock(new BridgeVideoSinkSettings()); sourcePipeline.Connect(videoSourceBlock.Output, bridgeVideoSink.Input); await sourcePipeline.StartAsync(); // sink pipeline with bridge video source and video renderer var sinkPipeline = new MediaBlocksPipeline(true); var bridgeVideoSource = new BridgeVideoSourceBlock(new BridgeVideoSourceSettings()); var videoRenderer = new VideoRendererBlock(sinkPipeline, VideoView1); sinkPipeline.Connect(bridgeVideoSource.Output, videoRenderer.Input); await sinkPipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Bridge\index.md --- title: Bridge blocks description: VisioForge Media Blocks SDK .Net - Bridge blocks sidebar_label: Bridges --- # Bridge blocks Bridges can be used to link two pipelines and dynamically switch between them. For example, you can switch between different files or cameras in the first pipeline without interrupting streaming in the second pipeline. To link source and sink, give them the same name.
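The switching scenario described above can be sketched as follows: the sink pipeline keeps rendering while a source pipeline is stopped and replaced by another one feeding the same bridge channel. `StopAsync` is assumed as the counterpart of `StartAsync`, and channel names are assumed to be matched through the settings classes.

```csharp
// Sink pipeline: keeps playing for the whole session.
var sinkPipeline = new MediaBlocksPipeline(true);
var bridgeSource = new BridgeVideoSourceBlock(new BridgeVideoSourceSettings());
var videoRenderer = new VideoRendererBlock(sinkPipeline, VideoView1);
sinkPipeline.Connect(bridgeSource.Output, videoRenderer.Input);
await sinkPipeline.StartAsync();

// First source pipeline: virtual video source -> bridge video sink.
var sourcePipelineA = new MediaBlocksPipeline(true);
var sourceA = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings());
var bridgeSinkA = new BridgeVideoSinkBlock(new BridgeVideoSinkSettings());
sourcePipelineA.Connect(sourceA.Output, bridgeSinkA.Input);
await sourcePipelineA.StartAsync();

// Switch sources: stop the first source pipeline, start a second one.
// The sink pipeline keeps running and is not interrupted by the switch.
await sourcePipelineA.StopAsync(); // assumed counterpart of StartAsync

var sourcePipelineB = new MediaBlocksPipeline(true);
var sourceB = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings());
var bridgeSinkB = new BridgeVideoSinkBlock(new BridgeVideoSinkSettings());
sourcePipelineB.Connect(sourceB.Output, bridgeSinkB.Input);
await sourcePipelineB.StartAsync();
```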
- [Bridge Audio Sink](BridgeAudioSinkBlock.md) - [Bridge Video Sink](BridgeVideoSinkBlock.md) - [Bridge Subtitle Sink](BridgeSubtitleSinkBlock.md) - [Bridge Audio Source](BridgeAudioSourceBlock.md) - [Bridge Video Source](BridgeVideoSourceBlock.md) - [Bridge Subtitle Source](BridgeSubtitleSourceBlock.md) - [Proxy Sink](ProxySinkBlock.md) - [Proxy Source](ProxySourceBlock.md) ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Bridge\ProxySinkBlock.md --- title: Proxy sink block description: VisioForge Media Blocks SDK .Net - Proxy sink block sidebar_label: Proxy sink --- # Proxy sink Please check the [ProxySourceBlock](ProxySourceBlock.md) for more information. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Bridge\ProxySourceBlock.md --- title: Proxy source block description: VisioForge Media Blocks SDK .Net - Proxy source block sidebar_label: Proxy source --- # Proxy source The proxy source and proxy sink blocks can be used as a pair to connect different media pipelines and use them independently. ## Block info Name: ProxySourceBlock.
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Output | Any uncompressed | 1 | ## Sample pipelines ### First pipeline with a video source and a proxy video sink ```mermaid graph LR; VirtualVideoSourceBlock-->ProxySinkBlock; ``` ### Second pipeline with a proxy video source and a video renderer ```mermaid graph LR; ProxySourceBlock-->VideoRendererBlock; ``` ## Sample code ```csharp // source pipeline with virtual video source and proxy sink var sourcePipeline = new MediaBlocksPipeline(true); var videoSourceBlock = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings()); var proxyVideoSink = new ProxySinkBlock(); sourcePipeline.Connect(videoSourceBlock.Output, proxyVideoSink.Input); // sink pipeline with proxy video source and video renderer var sinkPipeline = new MediaBlocksPipeline(true); var proxyVideoSource = new ProxySourceBlock(proxyVideoSink); var videoRenderer = new VideoRendererBlock(sinkPipeline, VideoView1); sinkPipeline.Connect(proxyVideoSource.Output, videoRenderer.Input); // start pipelines await sourcePipeline.StartAsync(); await sinkPipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. 
---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Decklink\DecklinkAudioSinkBlock.md --- title: Decklink Audio Sink block description: VisioForge Media Blocks SDK .Net - Decklink Audio Sink block sidebar_label: Decklink Audio Sink --- # Decklink Audio Sink block ## Sample applications - [Decklink Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Decklink%20Demo) ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Decklink\DecklinkAudioSourceBlock.md --- title: Decklink Audio Source block description: VisioForge Media Blocks SDK .Net - Decklink Audio Source block sidebar_label: Decklink Audio Source --- # Decklink Audio Source block ## Sample applications - [Decklink Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Decklink%20Demo) ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Decklink\DecklinkVideoSinkBlock.md --- title: Decklink Video Sink block description: VisioForge Media Blocks SDK .Net - Decklink Video Sink block sidebar_label: Decklink Video Sink --- # Decklink Video Sink block ## Sample applications - [Decklink Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Decklink%20Demo) ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Decklink\DecklinkVideoSourceBlock.md --- title: Decklink Video Source block description: VisioForge Media Blocks SDK .Net - Decklink Video Source block sidebar_label: Decklink Video Source --- # Decklink Video Source block ## Sample applications - [Decklink Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Decklink%20Demo) ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Decklink\index.md --- title: Blackmagic Decklink devices blocks description: VisioForge Media Blocks SDK .Net - Blackmagic Decklink devices blocks sidebar_label: 
Blackmagic Decklink devices --- # Blackmagic Decklink devices blocks - [Decklink Audio Sink](DecklinkAudioSinkBlock.md) - [Decklink Video Sink](DecklinkVideoSinkBlock.md) - [Decklink Audio Source](DecklinkAudioSourceBlock.md) - [Decklink Video Source](DecklinkVideoSourceBlock.md) ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Sinks\ASFSinkBlock.md --- title: ASF Sink block description: VisioForge Media Blocks SDK .Net - ASF Sink block sidebar_label: ASF (WMV/WMA) --- # ASF/WMV/WMA output `ASF (Advanced Systems Format)`: A Microsoft digital container format used to store multimedia data, designed to be platform-independent and to support scalable media types like audio and video. Use the `ASFSinkSettings` class to set the parameters. ## Block info Name: ASFSinkBlock. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input audio | audio/x-raw | one or more | | | audio/mpeg | | | | audio/x-ac3 | | | | audio/x-alaw | | | | audio/x-mulaw | | | | audio/x-wma | | | Input video | video/x-raw | one or more | | | image/jpeg | | | | video/x-divx | | | | video/x-msmpeg | | | | video/mpeg | | | | video/x-h263 | | | | video/x-h264 | | | | video/x-dv | | | | video/x-huffyuv | | | | video/x-wmv | | | | video/x-jpc | | | | video/x-vp8 | | | | image/png | | ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->WMVEncoderBlock; UniversalSourceBlock-->WMAEncoderBlock; WMVEncoderBlock-->ASFSinkBlock; WMAEncoderBlock-->ASFSinkBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var audioEncoderBlock = new WMAEncoderBlock(new WMAEncoderSettings()); pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input); var videoEncoderBlock = new WMVEncoderBlock(new WMVEncoderSettings()); pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input); var sinkBlock = new
ASFSinkBlock(new ASFSinkSettings(@"output.wmv")); pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio)); pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video)); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Sinks\AVISinkBlock.md --- title: AVI Sink block description: VisioForge Media Blocks SDK .Net - AVI Sink block sidebar_label: AVI --- # AVI output AVI sink is used to create AVI files, and it is popular among Windows users. AVI files support many different video, audio, and subtitle formats. Use the `AVISinkSettings` class to set the parameters. ## Block info Name: AVISinkBlock. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input audio | audio/x-raw | one or more | | | audio/mpeg | | | | audio/x-ac3 | | | | audio/x-alaw | | | | audio/x-mulaw | | | | audio/x-wma | | | Input video | video/x-raw | one or more | | | image/jpeg | | | | video/x-divx | | | | video/x-msmpeg | | | | video/mpeg | | | | video/x-h263 | | | | video/x-h264 | | | | video/x-dv | | | | video/x-huffyuv | | | | video/x-wmv | | | | video/x-jpc | | | | video/x-vp8 | | | | image/png | | ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->H264EncoderBlock; UniversalSourceBlock-->MP3EncoderBlock; H264EncoderBlock-->AVISinkBlock; MP3EncoderBlock-->AVISinkBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var mp3EncoderBlock = new MP3EncoderBlock(new MP3EncoderSettings() { Bitrate = 192 }); pipeline.Connect(fileSource.AudioOutput, mp3EncoderBlock.Input); var h264EncoderBlock = new H264EncoderBlock(new OpenH264EncoderSettings()); pipeline.Connect(fileSource.VideoOutput, h264EncoderBlock.Input); var aviSinkBlock = 
new AVISinkBlock(new AVISinkSettings(@"output.avi")); pipeline.Connect(h264EncoderBlock.Output, aviSinkBlock.CreateNewInput(MediaBlockPadMediaType.Video)); pipeline.Connect(mp3EncoderBlock.Output, aviSinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio)); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Sinks\FacebookLiveSinkBlock.md --- title: Facebook Live Sink block description: VisioForge Media Blocks SDK .Net - Facebook Live Sink block sidebar_label: Facebook Live --- # Facebook Live streaming Facebook Live supports RTMP (Real-Time Messaging Protocol), a protocol for streaming audio, video, and data over the Internet. RTMP is used to maintain low-latency connections and deliver high-quality live broadcasts on Facebook, making it a popular choice for streaming live events and interactions. Use the `FacebookLiveSinkSettings` class to set the parameters. ## Block info Name: FacebookLiveSinkBlock. 
| Pin direction | Media type | Pins count | | --- |:------------:|:-----------:| | Input audio | audio/x-aac | one | | | audio/x-mp3 | | | Input video | video/x-h264 | one | | | video/x-h265 | | ## The sample pipeline ```mermaid graph LR; VirtualVideoSourceBlock-->H264EncoderBlock; VirtualAudioSourceBlock-->AACEncoderBlock; H264EncoderBlock-->FacebookLiveSinkBlock; AACEncoderBlock-->FacebookLiveSinkBlock; ``` ## Sample code ```csharp // Pipeline var pipeline = new MediaBlocksPipeline(true); // video and audio sources var virtualVideoSource = new VirtualVideoSourceSettings { Width = 1280, Height = 720, FrameRate = VideoFrameRate.FPS_25, }; var videoSource = new VirtualVideoSourceBlock(virtualVideoSource); var virtualAudioSource = new VirtualAudioSourceSettings { Channels = 2, SampleRate = 44100, }; var audioSource = new VirtualAudioSourceBlock(virtualAudioSource); // H264/AAC encoders var h264Encoder = new H264EncoderBlock(new OpenH264EncoderSettings()); var aacEncoder = new AACEncoderBlock(); pipeline.Connect(videoSource.Output, h264Encoder.Input); pipeline.Connect(audioSource.Output, aacEncoder.Input); // Facebook Live sink var facebookSink = new FacebookLiveSinkBlock(new FacebookLiveSinkSettings("long streaming key")); pipeline.Connect(h264Encoder.Output, facebookSink.CreateNewInput(MediaBlockPadMediaType.Video)); pipeline.Connect(aacEncoder.Output, facebookSink.CreateNewInput(MediaBlockPadMediaType.Audio)); // Start await pipeline.StartAsync(); ``` ### Sample applications - [Network Streamer Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Networks%20Streamer%20Demo) ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Sinks\FileSinkBlock.md --- title: File Sink block description: VisioForge Media Blocks SDK .Net - File Sink block sidebar_label: File --- # File sink Universal output to a file. 
This sink is used internally by other higher-level sinks, e.g. the MP4 sink. It can also be used to write raw video or audio data to a file. ## Block info Name: FileSinkBlock. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input | Any stream format | 1 | ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->MP3EncoderBlock; MP3EncoderBlock-->FileSinkBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var mp3EncoderBlock = new MP3EncoderBlock(new MP3EncoderSettings() { Bitrate = 192 }); pipeline.Connect(fileSource.AudioOutput, mp3EncoderBlock.Input); var fileSinkBlock = new FileSinkBlock(@"output.mp3"); pipeline.Connect(mp3EncoderBlock.Output, fileSinkBlock.Input); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Sinks\HLSSinkBlock.md --- title: HLS Sink block description: VisioForge Media Blocks SDK .Net - HLS Sink block sidebar_label: HLS --- # HLS output The HLS sink block can be used to create an HLS server from any video/audio source. The SDK includes an optional HTTP server. Alternatively, you can use IIS, Nginx, Apache, or any other web server. Use the `HLSSinkSettings` class to set the parameters. ## Block info Name: HLSSinkBlock.
| Pin direction | Media type | Pins count | | --- |:------------:|:-----------:| | Input audio | audio/x-raw | one | | | audio/x-ac3 | | | | audio/x-aac | | | | audio/x-mp3 | | | Input video | video/x-raw | one | | | video/x-h264 | | | | video/x-h265 | | ## The sample pipeline ```mermaid graph LR; VirtualVideoSourceBlock-->H264EncoderBlock; VirtualAudioSourceBlock-->AACEncoderBlock; H264EncoderBlock-->HLSSinkBlock; AACEncoderBlock-->HLSSinkBlock; ``` ## Sample code ```csharp // Pipeline var pipeline = new MediaBlocksPipeline(true); pipeline.OnError += Pipeline_OnError; // video and audio sources var virtualVideoSource = new VirtualVideoSourceSettings { Width = 1280, Height = 720, FrameRate = VideoFrameRate.FPS_25, }; var videoSource = new VirtualVideoSourceBlock(virtualVideoSource); var virtualAudioSource = new VirtualAudioSourceSettings { Channels = 2, SampleRate = 44100, }; var audioSource = new VirtualAudioSourceBlock(virtualAudioSource); // H264/AAC encoders var h264Settings = new OpenH264EncoderSettings(); var h264Encoder = new H264EncoderBlock(h264Settings); var aacEncoder = new AACEncoderBlock(); // HLS sink var settings = new HLSSinkSettings { Location = Path.Combine(AppContext.BaseDirectory, "segment_%05d.ts"), MaxFiles = 10, PlaylistLength = 5, PlaylistLocation = Path.Combine(AppContext.BaseDirectory, "playlist.m3u8"), PlaylistRoot = "http://localhost:8088/", SendKeyframeRequests = true, TargetDuration = 5, Custom_HTTP_Server_Enabled = true, Custom_HTTP_Server_Port = 8088 }; var hlsSink = new HLSSinkBlock(settings); // Connect everything pipeline.Connect(videoSource.Output, h264Encoder.Input); pipeline.Connect(audioSource.Output, aacEncoder.Input); pipeline.Connect(h264Encoder.Output, hlsSink.CreateNewInput(MediaBlockPadMediaType.Video)); pipeline.Connect(aacEncoder.Output, hlsSink.CreateNewInput(MediaBlockPadMediaType.Audio)); // Start
await pipeline.StartAsync(); ``` ### Sample applications - [Network Streamer Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Networks%20Streamer%20Demo) ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Sinks\HTTPMJPEGLiveSinkBlock.md --- title: HTTP MJPEG Live Sink block description: VisioForge Media Blocks SDK .Net - HTTP MJPEG Live Sink block sidebar_label: HTTP MJPEG Live --- # HTTP MJPEG live streaming output `MJPEG over HTTP`: MJPEG (Motion JPEG) is a video format where each video frame or interlaced field of a digital video sequence is compressed separately as a JPEG image. Streaming it over HTTP allows for easy transmission of live or recorded video over the Internet without the need for complex player or server software, making it suitable for surveillance and webcam applications. Use the class constructor to set the network port. ## Block info Name: HTTPMJPEGLiveSinkBlock. | Pin direction | Media type | Pins count | | --- |:------------:|:-----------:| | Input video | UYVY, I420, NV12, NV21, YV12 | one | | | BGRA, BGRx, RGBA, RGBx | | ## The sample pipeline ```mermaid graph LR; VirtualVideoSourceBlock-->HTTPMJPEGLiveSinkBlock; ``` ## Sample code The following code shows how to create a pipeline that streams an MJPEG video stream over HTTP.
```csharp // Pipeline var pipeline = new MediaBlocksPipeline(true); // video source var virtualVideoSource = new VirtualVideoSourceSettings { Width = 1280, Height = 720, FrameRate = VideoFrameRate.FPS_25, }; var videoSource = new VirtualVideoSourceBlock(virtualVideoSource); // MJPEG HTTP sink var sink = new HTTPMJPEGLiveSinkBlock(8080); pipeline.Connect(videoSource.Output, sink.Input); // Start await pipeline.StartAsync(); ``` ### Sample applications - [Network Streamer Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Networks%20Streamer%20Demo) ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Sinks\index.md --- title: Sink blocks description: VisioForge Media Blocks SDK .Net - Sink blocks sidebar_label: Sinks --- # Sinks Sinks are blocks that save or stream data. They are the last blocks in the pipeline. Optionally, some sinks can have output pins to pass data to the next block in the pipeline. The SDK provides many different sinks for various purposes.
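Muxer-style sinks (MP4, MKV, AVI, and others) share one recurring wiring pattern: request an input pad per stream with `CreateNewInput` and connect each encoder output to it. A generic sketch, assuming `MP4SinkBlock`/`MP4SinkSettings` follow the same constructor convention as the MKV and AVI sinks shown on their pages:

```csharp
var pipeline = new MediaBlocksPipeline(false);

var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri("test.mp4")));

var aacEncoder = new AACEncoderBlock();
pipeline.Connect(fileSource.AudioOutput, aacEncoder.Input);

var h264Encoder = new H264EncoderBlock(new OpenH264EncoderSettings());
pipeline.Connect(fileSource.VideoOutput, h264Encoder.Input);

// One CreateNewInput call per stream; the same pattern applies to other muxer sinks.
var sink = new MP4SinkBlock(new MP4SinkSettings(@"output.mp4")); // constructor assumed analogous to MKVSinkSettings
pipeline.Connect(h264Encoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(aacEncoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Audio));

await pipeline.StartAsync();
```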
## File sinks The following file sinks are available: - [ASF](ASFSinkBlock.md) - [AVI](AVISinkBlock.md) - [File](FileSinkBlock.md) - [MKV](MKVSinkBlock.md) - [MOV](MOVSinkBlock.md) - [MP4](MP4SinkBlock.md) - [MPEG-PS](MPEGPSSinkBlock.md) - [MPEG-TS](MPEGTSSinkBlock.md) - [MXF](MXFSinkBlock.md) - [OGG](OGGSinkBlock.md) - [WAV](WAVSinkBlock.md) - [WebM](WebMSinkBlock.md) ## Network streaming The following network streaming sinks are available: - [Facebook Live](FacebookLiveSinkBlock.md) - [HLS](HLSSinkBlock.md) - [MJPEG over HTTP](HTTPMJPEGLiveSinkBlock.md) - [NDI](NDISinkBlock.md) - [SRT](SRTSinkBlock.md) - [SRT MPEG-TS](SRTMPEGTSSinkBlock.md) - [RTMP](RTMPSinkBlock.md) - [YouTube Live](YouTubeSinkBlock.md) ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Sinks\MKVSinkBlock.md --- title: MKV Sink block description: VisioForge Media Blocks SDK .Net - MKV Sink block sidebar_label: MKV --- # MKV output The MKV format can be used as an alternative to the MP4 format and offers wider video and audio codec support. Use the `MKVSinkSettings` class to set the parameters. ## Block info Name: MKVSinkBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | audio/mpeg | one or more |
| | audio/x-ac3 | |
| | audio/x-eac3 | |
| | audio/x-dts | |
| | audio/x-vorbis | |
| | audio/x-flac | |
| | audio/x-opus | |
| | audio/x-speex | |
| | U8, S16BE, S16LE | |
| | S24BE, S24LE | |
| | S32BE, S32LE | |
| | F32LE, F64LE | |
| | audio/x-wma | |
| | audio/x-alaw | |
| | audio/x-mulaw | |
| | audio/x-adpcm | |
| | audio/G722 | |
| | audio/G726 | |
| Input video | YUY2, I420, YV12, UYVY, AYUV, GRAY8, BGR, RGB | one or more |
| | video/mpeg | |
| | video/x-h264 | |
| | video/x-h265 | |
| | video/x-divx | |
| | video/x-huffyuv | |
| | video/x-dv | |
| | video/x-h263 | |
| | video/x-msmpeg | |
| | image/jpeg | |
| | video/x-theora | |
| | video/x-dirac | |
| | video/x-vp8 | |
| | video/x-vp9 | |
| | video/x-prores | |
| | video/x-wmv | |
| | video/x-av1 | |
| | video/x-ffv | |
| Input subtitle | text/utf8 | one or more |
| | subtitle/x-kate | |
| | application/x-ssa | |
| | application/x-ass | |
| | application/x-usf | |
| | subpicture/x-dvd | |

## The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->H264EncoderBlock;
UniversalSourceBlock-->AACEncoderBlock;
H264EncoderBlock-->MKVSinkBlock;
AACEncoderBlock-->MKVSinkBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var aacEncoderBlock = new AACEncoderBlock(new MFAACEncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, aacEncoderBlock.Input);

var h264EncoderBlock = new H264EncoderBlock(new OpenH264EncoderSettings());
pipeline.Connect(fileSource.VideoOutput, h264EncoderBlock.Input);

var mkvSinkBlock = new MKVSinkBlock(new MKVSinkSettings(@"output.mkv"));
pipeline.Connect(h264EncoderBlock.Output, mkvSinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(aacEncoderBlock.Output, mkvSinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sinks\MOVSinkBlock.md

---
title: MOV Sink block
description: VisioForge Media Blocks SDK .Net - MOV Sink block
sidebar_label: MOV
---

# MOV output

The MOV format is popular on Apple devices. Typically, AAC is used to encode the audio stream, and the H264 or HEVC codec is used to encode the video stream.

Use the `MOVSinkSettings` class to set the parameters.

## Block info

Name: MOVSinkBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | S32LE, S32BE, S24LE, S24BE | one or more |
| | S16LE, S16BE, S8, U8 | |
| | audio/mpeg | |
| | audio/x-ac3 | |
| | audio/x-adpcm | |
| | audio/x-alaw | |
| | audio/x-mulaw | |
| | audio/AMR | |
| | audio/AMR-WB | |
| | audio/x-alac | |
| | audio/x-opus | |
| Input video | RGB, UYVY, v210 | one or more |
| | video/mpeg | |
| | video/x-divx | |
| | video/x-prores | |
| | video/x-cineform | |
| | video/x-h263 | |
| | video/x-h264 | |
| | video/x-h265 | |
| | video/x-svq | |
| | video/x-dv | |
| | image/jpeg | |
| | image/png | |
| | video/x-vp8 | |
| | video/x-vp9 | |
| | video/x-dirac | |
| | video/x-av1 | |
| Input subtitle | text/utf8 | one or more |

## The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->H264EncoderBlock;
UniversalSourceBlock-->AACEncoderBlock;
H264EncoderBlock-->MOVSinkBlock;
AACEncoderBlock-->MOVSinkBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var aacEncoderBlock = new AACEncoderBlock(new MFAACEncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, aacEncoderBlock.Input);

var h264EncoderBlock = new H264EncoderBlock(new NVENCH264EncoderSettings());
pipeline.Connect(fileSource.VideoOutput, h264EncoderBlock.Input);

var movSinkBlock = new MOVSinkBlock(new MOVSinkSettings(@"output.mov"));
pipeline.Connect(h264EncoderBlock.Output, movSinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(aacEncoderBlock.Output, movSinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sinks\MP4SinkBlock.md

---
title: MP4 Sink block
description: VisioForge Media Blocks SDK .Net - MP4 Sink block
sidebar_label: MP4
---

# MP4 output

MP4 is the most popular video format, available on all platforms. Typically, AAC is used to encode the audio stream, and the H264 or HEVC codec is used to encode the video stream.

Use the `MP4SinkSettings` class to set the parameters.

## Block info

Name: MP4SinkBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | audio/mpeg | one or more |
| | audio/x-ac3 | |
| | audio/x-alac | |
| | audio/x-opus | |
| Input video | video/mpeg | one or more |
| | video/x-divx | |
| | video/x-h264 | |
| | video/x-h265 | |
| | video/x-av1 | |
| Input subtitle | text/utf8 | one or more |

## The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->H264EncoderBlock;
UniversalSourceBlock-->AACEncoderBlock;
H264EncoderBlock-->MP4SinkBlock;
AACEncoderBlock-->MP4SinkBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var aacEncoderBlock = new AACEncoderBlock(new MFAACEncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, aacEncoderBlock.Input);

var h264EncoderBlock = new H264EncoderBlock(new OpenH264EncoderSettings());
pipeline.Connect(fileSource.VideoOutput, h264EncoderBlock.Input);

var mp4SinkBlock = new MP4SinkBlock(new MP4SinkSettings("output.mp4"));
pipeline.Connect(h264EncoderBlock.Output, mp4SinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(aacEncoderBlock.Output, mp4SinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));

await pipeline.StartAsync();
```

### Sample applications

- [Simple Capture Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Simple%20Capture%20Demo)
- [Screen Capture Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Screen%20Capture)

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sinks\MPEGPSSinkBlock.md

---
title: MPEG-PS Sink block
description: VisioForge Media Blocks SDK .Net - MPEG-PS Sink block
sidebar_label: MPEG-PS
---

# MPEG-PS output

`MPEG-PS (MPEG Program Stream)`: A standard format for storing video, audio, and metadata multiplexed into a single stream. It is widely used in systems and applications requiring synchronized audio and video playback, such as DVDs.

Use the constructor to set the output file name.

## Block info

Name: MPEGPSSinkBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | audio/mpeg [1,2,4] | one or more |
| | audio/x-lpcm | |
| Input video | video/mpeg [1,2,4] | one or more |
| | video/x-dirac | |
| | video/x-h264 | |

## The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->H264EncoderBlock;
UniversalSourceBlock-->AACEncoderBlock;
H264EncoderBlock-->MPEGPSSinkBlock;
AACEncoderBlock-->MPEGPSSinkBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var aacEncoderBlock = new AACEncoderBlock(new MFAACEncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, aacEncoderBlock.Input);

var h264EncoderBlock = new H264EncoderBlock(new OpenH264EncoderSettings());
pipeline.Connect(fileSource.VideoOutput, h264EncoderBlock.Input);

var psSinkBlock = new MPEGPSSinkBlock(@"output.mpg");
pipeline.Connect(h264EncoderBlock.Output, psSinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(aacEncoderBlock.Output, psSinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sinks\MPEGTSSinkBlock.md

---
title: MPEG-TS Sink block
description: VisioForge Media Blocks SDK .Net - MPEG-TS Sink block
sidebar_label: MPEG-TS
---

# MPEG-TS output

MPEG transport stream is a standard digital container format for the transmission and storage of audio, video, and PSIP data. It is used in broadcast systems such as DVB, ATSC, and IPTV.

Use the `MPEGTSSinkSettings` class to set the parameters.

## Block info

Name: MPEGTSSinkBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | audio/mpeg [1,2,4] | one or more |
| | audio/x-lpcm | |
| | audio/x-ac3 | |
| | audio/x-dts | |
| | audio/x-opus | |
| Input video | video/mpeg [1,2,4] | one or more |
| | video/x-dirac | |
| | video/x-h264 | |
| | video/x-h265 | |
| Input subtitle | meta/x-klv | one or more |
| | subpicture/x-dvb | |
| | application/x-teletext | |

## The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->H264EncoderBlock;
UniversalSourceBlock-->AACEncoderBlock;
H264EncoderBlock-->MPEGTSSinkBlock;
AACEncoderBlock-->MPEGTSSinkBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var aacEncoderBlock = new AACEncoderBlock(new MFAACEncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, aacEncoderBlock.Input);

var h264EncoderBlock = new H264EncoderBlock(new OpenH264EncoderSettings());
pipeline.Connect(fileSource.VideoOutput, h264EncoderBlock.Input);

var tsSinkBlock = new MPEGTSSinkBlock(new MPEGTSSinkSettings(@"output.ts"));
pipeline.Connect(h264EncoderBlock.Output, tsSinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(aacEncoderBlock.Output, tsSinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sinks\MXFSinkBlock.md

---
title: MXF Sink block
description: VisioForge Media Blocks SDK .Net - MXF Sink block
sidebar_label: MXF
---

# MXF output

`MXF (Material Exchange Format)`: A container format designed for professional digital video and audio media, defined by SMPTE standards. It is used in the broadcasting industry to support stream-based workflows with full metadata and timecode support.
Use the `MXFSinkSettings` class to set the parameters.

## Block info

Name: MXFSinkBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | PCM (S16, S24, S32, U8) | one or more |
| | audio/x-alaw | |
| | audio/x-ac3 | |
| | audio/mpeg [1,2] | |
| Input video | video/mpeg [1,2,4] | one or more |
| | video/x-dv | |
| | video/x-h264 | |
| | video/x-dnxhd | |
| | RGB/RGBA/YUV | |
| | image/x-jpc | |

## The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->H264EncoderBlock;
UniversalSourceBlock-->MP2EncoderBlock;
H264EncoderBlock-->MXFSinkBlock;
MP2EncoderBlock-->MXFSinkBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var audioEncoderBlock = new MP2EncoderBlock(new MP2EncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input);

var h264EncoderBlock = new H264EncoderBlock(new OpenH264EncoderSettings());
pipeline.Connect(fileSource.VideoOutput, h264EncoderBlock.Input);

var sinkBlock = new MXFSinkBlock(new MXFSinkSettings(@"output.mxf", MXFVideoStreamType.H264, MXFAudioStreamType.MPEG));
pipeline.Connect(h264EncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sinks\NDISinkBlock.md

---
title: NDI Sink block
description: VisioForge Media Blocks SDK .Net - NDI Sink block
sidebar_label: NDI
---

# NDI streaming output

`NDI (Network Device Interface)`: Developed by NewTek, NDI is a royalty-free technology that allows video-compatible products to communicate, deliver, and receive high-quality, low-latency video and audio over IP networks.
This makes it ideal for live video production environments.

Use the `NDISinkSettings` class to set the parameters.

## Block info

Name: NDISinkBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | F32LE | one |
| Input video | UYVY, I420, NV12, NV21, YV12 | one |
| | BGRA, BGRx, RGBA, RGBx | |

## The sample pipeline

```mermaid
graph LR;
VirtualVideoSourceBlock-->NDISinkBlock;
VirtualAudioSourceBlock-->NDISinkBlock;
```

## Sample code

The following code shows how to create a pipeline that streams video and audio to an NDI sink.

```csharp
// Pipeline
var pipeline = new MediaBlocksPipeline(true);

// video and audio sources
var virtualVideoSource = new VirtualVideoSourceSettings
{
    Width = 1280,
    Height = 720,
    FrameRate = VideoFrameRate.FPS_25,
};

var videoSource = new VirtualVideoSourceBlock(virtualVideoSource);

var virtualAudioSource = new VirtualAudioSourceSettings
{
    Channels = 2,
    SampleRate = 44100,
};

var audioSource = new VirtualAudioSourceBlock(virtualAudioSource);

// NDI sink
var ndiSink = new NDISinkBlock(new NDISinkSettings("NDITestOutput"));
pipeline.Connect(videoSource.Output, ndiSink.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(audioSource.Output, ndiSink.CreateNewInput(MediaBlockPadMediaType.Audio));

// Start
await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sinks\OGGSinkBlock.md

---
title: OGG Sink block
description: VisioForge Media Blocks SDK .Net - OGG Sink block
sidebar_label: OGG
---

# OGG output

`OGG`: An open container format that is free of software patents and is designed to efficiently stream and manipulate high-quality digital multimedia. It encompasses a range of codecs, with Vorbis being the most commonly used for audio compression.

Use the `OGGSinkSettings` class to set the parameters.

## Block info

Name: OGGSinkBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | audio/x-vorbis | one or more |
| | audio/x-flac | |
| | audio/x-speex | |
| | audio/x-celt | |
| | application/x-ogm-audio | |
| | audio/x-opus | |
| Input video | video/x-theora | one or more |
| | application/x-ogm-video | |
| | video/x-dirac | |
| | video/x-smoke | |
| | video/x-vp8 | |
| | video/x-daala | |
| Input subtitle | text/x-cmml | one or more |
| | subtitle/x-kate | |
| | application/x-kate | |

## The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->VorbisEncoderBlock;
VorbisEncoderBlock-->OGGSinkBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp3";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var vorbisEncoderBlock = new VorbisEncoderBlock(new VorbisEncoderSettings());
pipeline.Connect(fileSource.AudioOutput, vorbisEncoderBlock.Input);

var oggSinkBlock = new OGGSinkBlock(new OGGSinkSettings(@"output.ogg"));
pipeline.Connect(vorbisEncoderBlock.Output, oggSinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sinks\RTMPSinkBlock.md

---
title: RTMP Sink block
description: VisioForge Media Blocks SDK .Net - RTMP Sink block
sidebar_label: RTMP
---

# RTMP streaming

`RTMP (Real-Time Messaging Protocol)`: Developed by Adobe, RTMP is a protocol used for streaming audio, video, and data over the Internet, optimized for high-performance transmission. It enables efficient, low-latency communication, commonly used in live broadcasting such as sports events and concerts.

Use the `RTMPSinkSettings` class to set the parameters.

## Block info

Name: RTMPSinkBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | audio/mpeg [1,2,4] | one |
| | audio/x-adpcm | |
| | PCM [U8, S16LE] | |
| | audio/x-speex | |
| | audio/x-mulaw | |
| | audio/x-alaw | |
| | audio/x-nellymoser | |
| Input video | video/x-h264 | one |

## The sample pipeline

```mermaid
graph LR;
VirtualVideoSourceBlock-->H264EncoderBlock;
VirtualAudioSourceBlock-->AACEncoderBlock;
H264EncoderBlock-->RTMPSinkBlock;
AACEncoderBlock-->RTMPSinkBlock;
```

## Sample code

```csharp
// Pipeline
var pipeline = new MediaBlocksPipeline(true);

// video and audio sources
var virtualVideoSource = new VirtualVideoSourceSettings
{
    Width = 1280,
    Height = 720,
    FrameRate = VideoFrameRate.FPS_25,
};

var videoSource = new VirtualVideoSourceBlock(virtualVideoSource);

var virtualAudioSource = new VirtualAudioSourceSettings
{
    Channels = 2,
    SampleRate = 44100,
};

var audioSource = new VirtualAudioSourceBlock(virtualAudioSource);

// H264/AAC encoders
var h264Encoder = new H264EncoderBlock(new OpenH264EncoderSettings());
var aacEncoder = new AACEncoderBlock();
pipeline.Connect(videoSource.Output, h264Encoder.Input);
pipeline.Connect(audioSource.Output, aacEncoder.Input);

// RTMP sink
var sink = new RTMPSinkBlock(new RTMPSinkSettings());
pipeline.Connect(h264Encoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(aacEncoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Audio));

// Start
await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sinks\SRTMPEGTSSinkBlock.md

---
title: SRT MPEG-TS Sink block
description: VisioForge Media Blocks SDK .Net - SRT MPEG-TS Sink block
sidebar_label: SRT MPEG-TS
---

# SRT (MPEG-TS) streaming

`Secure Reliable Transport (SRT)` is an open-source streaming protocol designed to deliver low-latency video across unpredictable networks like the Internet.
Developed by Haivision, SRT uses encryption and error-correction mechanisms to ensure secure and reliable data transmission. By adapting to changing network conditions in real time, it minimizes jitter, packet loss, and bandwidth fluctuations. SRT supports features like packet retransmission and congestion control, making it ideal for live video streaming, remote contribution, and other latency-sensitive applications in broadcasting and media.

Video and audio streams will be muxed into an MPEG-TS container and sent over the SRT protocol.

Use the `SRTSinkSettings` class to set the parameters.

## Block info

Name: SRTMPEGTSSinkBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | audio/mpeg [1,2,4] | one or more |
| | audio/x-lpcm | |
| | audio/x-ac3 | |
| | audio/x-dts | |
| | audio/x-opus | |
| Input video | video/mpeg [1,2,4] | one or more |
| | video/x-dirac | |
| | video/x-h264 | |
| | video/x-h265 | |
| Input subtitle | meta/x-klv | one or more |
| | subpicture/x-dvb | |
| | application/x-teletext | |

## The sample pipeline

```mermaid
graph LR;
VirtualVideoSourceBlock-->H264EncoderBlock;
VirtualAudioSourceBlock-->AACEncoderBlock;
H264EncoderBlock-->SRTMPEGTSSinkBlock;
AACEncoderBlock-->SRTMPEGTSSinkBlock;
```

## Sample code

```csharp
// Pipeline
var pipeline = new MediaBlocksPipeline(true);

// video and audio sources
var virtualVideoSource = new VirtualVideoSourceSettings
{
    Width = 1280,
    Height = 720,
    FrameRate = VideoFrameRate.FPS_25,
};

var videoSource = new VirtualVideoSourceBlock(virtualVideoSource);

var virtualAudioSource = new VirtualAudioSourceSettings
{
    Channels = 2,
    SampleRate = 44100,
};

var audioSource = new VirtualAudioSourceBlock(virtualAudioSource);

// H264/AAC encoders
var h264Encoder = new H264EncoderBlock(new OpenH264EncoderSettings());
h264Encoder.Settings.ParseStream = false; // parsing must be disabled for SRT with H264 and HEVC encoders
var aacEncoder = new AACEncoderBlock();

pipeline.Connect(videoSource.Output, h264Encoder.Input);
pipeline.Connect(audioSource.Output, aacEncoder.Input);

// Sink
var sink = new SRTMPEGTSSinkBlock(new SRTSinkSettings() { Uri = "srt://:8888" });
pipeline.Connect(h264Encoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(aacEncoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Audio));

// Start
await pipeline.StartAsync();
```

### Sample applications

- [Network Streamer Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Networks%20Streamer%20Demo)

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sinks\SRTSinkBlock.md

---
title: SRT Sink block
description: VisioForge Media Blocks SDK .Net - SRT Sink block
sidebar_label: SRT
---

# SRT streaming

The Secure Reliable Transport (SRT) protocol is an open-source communication protocol that facilitates high-quality, secure video streaming over unreliable networks like the internet. It is designed to handle video transport in scenarios where low latency and secure data transfer are crucial, making it ideal for broadcasting and streaming applications. SRT optimizes streaming performance by adapting to varying network conditions, providing packet loss recovery, and ensuring content security through end-to-end encryption. This protocol supports point-to-point and point-to-multipoint transmissions, offering a robust solution for modern streaming needs.

Using this block, you can send video/audio streams in most of the available formats over the SRT protocol. You can use the [SRTMPEGTSSinkBlock](SRTMPEGTSSinkBlock.md) to stream muxed video and audio data over the SRT protocol.

Use the `SRTSinkSettings` class to set the parameters.

## Block info

Name: SRTSinkBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Any | one |

## The sample pipeline

```mermaid
graph LR;
VirtualVideoSourceBlock-->H264EncoderBlock;
H264EncoderBlock-->SRTSinkBlock;
```

## Sample code

```csharp
// Pipeline
var pipeline = new MediaBlocksPipeline(true);

// video source
var virtualVideoSource = new VirtualVideoSourceSettings
{
    Width = 1280,
    Height = 720,
    FrameRate = VideoFrameRate.FPS_25,
};

var videoSource = new VirtualVideoSourceBlock(virtualVideoSource);

// H264 encoder
var h264Encoder = new H264EncoderBlock(new OpenH264EncoderSettings());
h264Encoder.Settings.ParseStream = false; // parsing must be disabled for SRT with H264 and HEVC encoders
pipeline.Connect(videoSource.Output, h264Encoder.Input);

// SRT sink
var sink = new SRTSinkBlock(new SRTSinkSettings() { Uri = "srt://:8888" });
pipeline.Connect(h264Encoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Video));

// Start
await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sinks\WAVSinkBlock.md

---
title: WAV Sink block
description: VisioForge Media Blocks SDK .Net - WAV Sink block
sidebar_label: WAV
---

# WAV output

`WAV (Waveform Audio File Format)`: Developed by IBM and Microsoft, this audio file format is a standard for storing an audio bitstream on PCs. It is uncompressed and thus retains high quality, making it ideal for professional audio recording and editing.

Use the constructor to specify the file name.

## Block info

Name: WAVSinkBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | PCM [S32LE, S24LE, S16LE, U8] | one or more |
| | IEEE [F32LE, F64LE] | |
| | audio/x-mulaw | |
| | audio/x-alaw | |

## The sample pipeline

```mermaid
graph LR;
VirtualAudioSourceBlock-->WAVSinkBlock;
```

## Sample code

The following code snippet shows how to create a pipeline with a WAV sink block.
```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp3";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var wavSinkBlock = new WAVSinkBlock(@"output.wav");
pipeline.Connect(fileSource.AudioOutput, wavSinkBlock.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sinks\WebMSinkBlock.md

---
title: WebM Sink block
description: VisioForge Media Blocks SDK .Net - WebM Sink block
sidebar_label: WebM
---

# WebM output

WebM is used on the Internet as a free alternative to the MP4 format, using the VP8/VP9 codec for video and Vorbis for audio.

Use the `WebMSinkSettings` class to set the parameters.

## Block info

Name: WebMSinkBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | audio/x-vorbis | one or more |
| | audio/x-opus | |
| Input video | video/x-vp8 | one or more |
| | video/x-vp9 | |
| | video/x-av1 | |
| Input subtitle | text/utf8 | one or more |
| | subtitle/x-kate | |
| | application/x-ssa | |
| | application/x-ass | |
| | application/x-usf | |
| | subpicture/x-dvd | |

## The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->VPXEncoderBlock;
UniversalSourceBlock-->VorbisEncoderBlock;
VPXEncoderBlock-->WebMSinkBlock;
VorbisEncoderBlock-->WebMSinkBlock;
```

## Sample code

The following sample code converts a source video file to a WebM file with a VP8 video stream and a Vorbis audio stream.
```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var vp8EncoderBlock = new VPXEncoderBlock(new VP8EncoderSettings());
pipeline.Connect(fileSource.VideoOutput, vp8EncoderBlock.Input);

var vorbisEncoderBlock = new VorbisEncoderBlock(new VorbisEncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, vorbisEncoderBlock.Input);

var webmSinkBlock = new WebMSinkBlock(new WebMSinkSettings(@"output.webm"));
pipeline.Connect(vp8EncoderBlock.Output, webmSinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(vorbisEncoderBlock.Output, webmSinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sinks\YouTubeSinkBlock.md

---
title: YouTube Live Sink block
description: VisioForge Media Blocks SDK .Net - YouTube Live Sink block
sidebar_label: YouTube Live
---

# YouTube Live streaming

YouTube supports RTMP (Real-Time Messaging Protocol) for live streaming, allowing content creators to broadcast live video and audio streams over the internet with minimal delay. This protocol is vital for delivering smooth, high-quality live feeds, suitable for everything from personal vlogging to professional broadcasts.

Use the `YouTubeSinkSettings` class to set the parameters.

## Block info

Name: YouTubeSinkBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | audio/x-aac | one |
| | audio/x-mp3 | |
| Input video | video/x-h264 | one |
| | video/x-h265 | |

## The sample pipeline

```mermaid
graph LR;
VirtualVideoSourceBlock-->H264EncoderBlock;
VirtualAudioSourceBlock-->AACEncoderBlock;
H264EncoderBlock-->YouTubeSinkBlock;
AACEncoderBlock-->YouTubeSinkBlock;
```

## Sample code

```csharp
// Pipeline
var pipeline = new MediaBlocksPipeline(true);

// video and audio sources
var virtualVideoSource = new VirtualVideoSourceSettings
{
    Width = 1280,
    Height = 720,
    FrameRate = VideoFrameRate.FPS_25,
};

var videoSource = new VirtualVideoSourceBlock(virtualVideoSource);

var virtualAudioSource = new VirtualAudioSourceSettings
{
    Channels = 2,
    SampleRate = 44100,
};

var audioSource = new VirtualAudioSourceBlock(virtualAudioSource);

// H264/AAC encoders
var h264Encoder = new H264EncoderBlock(new OpenH264EncoderSettings());
var aacEncoder = new AACEncoderBlock();
pipeline.Connect(videoSource.Output, h264Encoder.Input);
pipeline.Connect(audioSource.Output, aacEncoder.Input);

// YouTube Live sink
var sink = new YouTubeSinkBlock(new YouTubeSinkSettings("long streaming key"));
pipeline.Connect(h264Encoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(aacEncoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Audio));

// Start
await pipeline.StartAsync();
```

### Sample applications

- [Network Streamer Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Networks%20Streamer%20Demo)

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sources\BaslerSourceBlock.md

---
title: Basler source block
description: VisioForge Media Blocks SDK .Net - Basler source block
sidebar_label: Basler source
---

# Basler source block

The Basler source block supports Basler USB3 Vision and GigE cameras.
The Basler Pylon SDK or Runtime must be installed to use the camera source.

## Block info

Name: BaslerSourceBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Output video | Uncompressed | 1 |

## The sample pipeline

```mermaid
graph LR;
BaslerSourceBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(true);

// get Basler source info by enumerating sources
var sources = await DeviceEnumerator.Shared.BaslerSourcesAsync();
var sourceInfo = sources[0];

// create Basler source
var source = new BaslerSourceBlock(new BaslerSourceSettings(sourceInfo));

// create video renderer for VideoView
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);

// connect
pipeline.Connect(source.Output, videoRenderer.Input);

// start
await pipeline.StartAsync();
```

### Sample applications

- [Basler Source Demo (WPF)](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Basler%20Source%20Demo)

## Platforms

Windows, Linux.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sources\GenICamSourceBlock.md

---
title: GenICam source block
description: VisioForge Media Blocks SDK .Net - GenICam source block
sidebar_label: GenICam source
---

# GenICam source

The GenICam source block supports connection to GigE and USB3 Vision cameras that support the GenICam protocol.

## Block info

Name: GenICamSourceBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Output video | various | one or more |

## The sample pipeline

```mermaid
graph LR;
GenICamSourceBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(true);

// cbCamera.Text: camera name selected in the UI
var sourceSettings = new GenICamSourceSettings(cbCamera.Text, new VisioForge.Core.Types.Rect(0, 0, 512, 512), 15, GenICamPixelFormat.Mono8);
var source = new GenICamSourceBlock(sourceSettings);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(source.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

### Sample applications

- [GenICam Source Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/GenICam%20Source%20Demo)

## Prerequisites

### macOS

Install the `Aravis` package using Homebrew:

```bash
brew install aravis
```

### Linux

Install the `Aravis` package using the package manager:

```bash
sudo apt-get install libaravis-0.8-dev
```

### Windows

Install the `VisioForge.CrossPlatform.GenICam.Windows.x64` package to your project using NuGet.

## Platforms

Windows, macOS, Linux.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sources\HTTPSourceBlock.md

---
title: HTTP source block
description: VisioForge Media Blocks SDK .Net - HTTP source block
sidebar_label: HTTP source
---

# HTTP source block

The HTTP source block allows data to be retrieved using HTTP/HTTPS protocols. It can be used to read data from MJPEG IP cameras, MP4 network files, or other sources.

## Block info

Name: HTTPSourceBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Output | Data | 1 |

## The sample pipeline

The sample pipeline reads data from an MJPEG camera and displays it using VideoView.
```mermaid graph LR; HTTPSourceBlock-->JPEGDecoderBlock; JPEGDecoderBlock-->VideoRendererBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(true); var settings = new HTTPSourceSettings(new Uri("http://mjpegcamera:8080")) { UserID = "username", UserPassword = "password" }; var source = new HTTPSourceBlock(settings); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); var jpegDecoder = new JPEGDecoderBlock(); pipeline.Connect(source.Output, jpegDecoder.Input); pipeline.Connect(jpegDecoder.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` ### Sample applications - [HTTP MJPEG Source Demo](https://github.com/visioforge/.Net-SDK-s-samples/blob/master/Media%20Blocks%20SDK/WPF/CSharp/HTTP%20MJPEG%20Source%20Demo/) ## Platforms Windows, macOS, Linux. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Sources\index.md --- title: Source blocks description: VisioForge Media Blocks SDK .Net - Source blocks sidebar_label: Sources --- # Sources Sources are blocks that provide data to the pipeline. They are the first blocks in the pipeline. SDK provides a lot of different sources for different purposes. 
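Before diving into the individual blocks, here is a minimal sketch of a pipeline whose first block is a source. It uses `VirtualVideoSourceBlock` and `VideoRendererBlock`, both documented in this section; `VideoView1` is assumed to be a video view control in your UI.

```csharp
// Minimal sketch: a source is always the first block in a pipeline.
var pipeline = new MediaBlocksPipeline(true);

// test video source (no hardware required)
var source = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings());

// renderer that draws into the VideoView control
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);

// sources expose outputs only; connect the source output to the renderer input
pipeline.Connect(source.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

The same pattern applies to every source below: create the block from its settings class, connect its output(s) to downstream blocks, and start the pipeline.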
## Hardware sources

- [System video source](SystemVideoSourceBlock.md)
- [System audio source](SystemAudioSourceBlock.md)
- [Decklink](../Decklink/index.md)
- [Basler camera source](BaslerSourceBlock.md)
- [GenICam source](GenICamSourceBlock.md)
- [Spinnaker/FLIR source](SpinnakerSourceBlock.md)

## File sources

- [Universal source](UniversalSourceBlock.md) (can be used as a file or a network source)

## Network sources

- [RTSP source](RTSPSourceBlock.md)
- [HTTP source](HTTPSourceBlock.md)
- [NDI source](NDISourceBlock.md)
- [SRT source](SRTSourceBlock.md)
- [SRT RAW source](SRTRAWSourceBlock.md)

## Other sources

- [Screen source](ScreenSourceBlock.md)
- [Virtual video source](VirtualVideoSourceBlock.md)
- [Virtual audio source](VirtualAudioSourceBlock.md)

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sources\NDISourceBlock.md
---
title: NDI source block
description: VisioForge Media Blocks SDK .Net - NDI source block
sidebar_label: NDI source
---

# NDI source block

The NDI source block supports connecting to NDI software sources and devices that support the NDI protocol.

## Block info

Name: NDISourceBlock.
| Pin direction | Media type | Pins count |
|---------------|:------------:|:----------:|
| Output audio | Uncompressed | 1 |
| Output video | Uncompressed | 1 |

## The sample pipeline

```mermaid
graph LR;
    NDISourceBlock-->VideoRendererBlock;
    NDISourceBlock-->AudioRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(true);

// get NDI source info by enumerating sources
var ndiSources = await DeviceEnumerator.Shared.NDISourcesAsync();
var ndiSourceInfo = ndiSources[0];

// create NDI source settings
var ndiSettings = await NDISourceSettings.CreateAsync(ndiSourceInfo);
var ndiSource = new NDISourceBlock(ndiSettings);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(ndiSource.VideoOutput, videoRenderer.Input);

var audioRenderer = new AudioRendererBlock();
pipeline.Connect(ndiSource.AudioOutput, audioRenderer.Input);

await pipeline.StartAsync();
```

### Sample applications

- [NDI Source Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/NDI%20Source%20Demo)

## Platforms

Windows, macOS, Linux.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sources\RTSPSourceBlock.md
---
title: RTSP source block
description: VisioForge Media Blocks SDK .Net - RTSP source block
sidebar_label: RTSP source
---

# RTSP source

The RTSP source block supports connecting to IP cameras and other devices that support the RTSP protocol.

Supported video codecs: H264, HEVC, MJPEG. Supported audio codecs: AAC, MP3, PCM, G726, G711, and some others if the FFMPEG redist is installed.

## Block info

Name: RTSPSourceBlock.
| Pin direction | Media type | Pins count |
|---------------|:------------:|:----------:|
| Output audio | Depends on the decoder | one or more |
| Output video | Depends on the decoder | one or more |
| Output subtitle | Depends on the decoder | one or more |

## The sample pipeline

`RTSPSourceBlock:VideoOutput` → `VideoRendererBlock`

`RTSPSourceBlock:AudioOutput` → `AudioRendererBlock`

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(true);

var rtspSettings = new RTSPSourceSettings(new Uri("rtsp://login:pwd@192.168.1.64:554/Streaming/Channels/101?transportmode=unicast&profile=Profile_1"), true)
{
    Login = "login",
    Password = "pwd"
};

var rtspSource = new RTSPSourceBlock(rtspSettings);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(rtspSource.VideoOutput, videoRenderer.Input);

var audioRenderer = new AudioRendererBlock();
pipeline.Connect(rtspSource.AudioOutput, audioRenderer.Input);

await pipeline.StartAsync();
```

### Sample applications

- [RTSP Preview Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/RTSP%20Preview%20Demo)
- [RTSP MultiViewSync Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/RTSP%20MultiViewSync%20Demo)

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sources\ScreenSourceBlock.md
---
title: Screen source block
description: VisioForge Media Blocks SDK .Net - Screen source block
sidebar_label: Screen source
---

# Screen source

The screen source supports recording video from the screen. You can select the display (if there is more than one), the part of the screen to be recorded, and optional mouse cursor recording.

## Settings

### [Windows] ScreenCaptureDX9SourceSettings

Use `DirectX 9` for screen recording.

### [Windows] ScreenCaptureD3D11SourceSettings

Use `Direct3D 11` for screen recording.

### [Windows] ScreenCaptureGDISourceSettings

Use `GDI` for screen recording.
### [macOS] ScreenCaptureMacOSSourceSettings

Use `AVFoundation` for screen recording.

### [Linux] ScreenCaptureXDisplaySourceSettings

Use `X11` for screen recording.

## Block info

Name: ScreenSourceBlock.

| Pin direction | Media type | Pins count |
|---------------|:------------:|:----------:|
| Output video | uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    ScreenSourceBlock-->VideoRendererBlock;
```

## [Windows] Window capture

You can capture a specific window by using the `ScreenCaptureD3D11SourceSettings` class.

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(true);

var screenSourceBlock = new ScreenSourceBlock(new ScreenCaptureDX9SourceSettings() { FrameRate = 15 });

var h264EncoderBlock = new H264EncoderBlock(new OpenH264EncoderSettings());
pipeline.Connect(screenSourceBlock.Output, h264EncoderBlock.Input);

var mp4SinkBlock = new MP4SinkBlock(new MP4SinkSettings(@"output.mp4"));
pipeline.Connect(h264EncoderBlock.Output, mp4SinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();
```

### Sample applications

- [Screen Capture Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Screen%20Capture)

## Platforms

Windows, macOS, Linux.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sources\SpinnakerSourceBlock.md
---
title: Spinnaker source block
description: VisioForge Media Blocks SDK .Net - Spinnaker source block
sidebar_label: Spinnaker source
---

# Spinnaker/FLIR source

The Spinnaker/FLIR source supports connecting to FLIR cameras using the Spinnaker SDK.

## Block info

Name: SpinnakerSourceBlock.
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Output video | various | one or more | ## The sample pipeline `SpinnakerSourceBlock:Output` → `VideoRendererBlock` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(true); var sources = await DeviceEnumerator.Shared.SpinnakerSourcesAsync(); var sourceSettings = new SpinnakerSourceSettings(sources[0].Name, new VisioForge.Core.Types.Rect(0, 0, 1280, 720), new VideoFrameRate(10)); var source = new SpinnakerSourceBlock(sourceSettings); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(source.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` ## Requirements - Spinnaker SDK installed. ## Platforms Windows ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Sources\SRTRAWSourceBlock.md --- title: SRT RAW source block description: VisioForge Media Blocks SDK .Net - SRT RAW source block sidebar_label: SRT RAW source --- # SRT RAW source `The Secure Reliable Transport (SRT)` is a streaming protocol that optimizes video data delivery over unpredictable networks, like the Internet. It is open-source and designed to handle high-performance video and audio streaming. SRT provides security through end-to-end encryption, reliability by recovering lost packets, and low latency, which is suitable for live broadcasts. It adapts to varying network conditions by dynamically managing bandwidth, ensuring high-quality streams even under suboptimal conditions. Widely used in broadcasting and streaming applications, SRT supports interoperability and is ideal for remote production and content distribution. The SRT source supports connection to SRT sources and provides a data stream. You can connect this block to `DecodeBinBlock` to decode the stream. ## Block info Name: SRTRAWSourceBlock. 
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Output data | Any | one | ## The sample pipeline ```mermaid graph LR; SRTRAWSourceBlock-->DecodeBinBlock; DecodeBinBlock-->VideoRendererBlock; DecodeBinBlock-->AudioRendererBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(true); var source = new SRTRAWSourceBlock(new SRTSourceSettings() { Uri = edURL.Text }); var decodeBin = new DecodeBinBlock(); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(source.Output, decodeBin.Input); pipeline.Connect(decodeBin.VideoOutput, videoRenderer.Input); pipeline.Connect(decodeBin.AudioOutput, audioRenderer.Input); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Sources\SRTSourceBlock.md --- title: SRT source block description: VisioForge Media Blocks SDK .Net - SRT source block sidebar_label: SRT source --- # SRT source (with decoding) The `Secure Reliable Transport (SRT)` is an open-source video streaming protocol designed for secure and low-latency delivery over unpredictable networks, like the public internet. Developed by Haivision, SRT optimizes streaming performance by dynamically adapting to varying bandwidths and minimizing the effects of packet loss. It incorporates AES encryption for secure content transmission. Primarily used in broadcasting and online streaming, SRT is crucial for delivering high-quality video feeds in real-time applications, enhancing viewer experiences even in challenging network conditions. It supports point-to-point and multicast streaming, making it versatile for diverse setups. The SRT source block provides decoded video and audio streams from an SRT source. ## Block info Name: SRTSourceBlock. 
| Pin direction | Media type | Pins count |
|---------------|:------------:|:----------:|
| Output video | Uncompressed | 0+ |
| Output audio | Uncompressed | 0+ |

## The sample pipeline

```mermaid
graph LR;
    SRTSourceBlock-->VideoRendererBlock;
    SRTSourceBlock-->AudioRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(true);

var source = new SRTSourceBlock(new SRTSourceSettings() { Uri = edURL.Text });
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
var audioRenderer = new AudioRendererBlock();

pipeline.Connect(source.VideoOutput, videoRenderer.Input);
pipeline.Connect(source.AudioOutput, audioRenderer.Input);

await pipeline.StartAsync();
```

### Sample applications

- [SRT Source Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/SRT%20Source%20Demo)

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sources\SystemAudioSourceBlock.md
---
title: System audio source block
description: VisioForge Media Blocks SDK .Net - System audio source block
sidebar_label: System audio source
---

# System audio source

SystemAudioSourceBlock is used to access microphones and other audio capture devices.

## Block info

Name: SystemAudioSourceBlock.

| Pin direction | Media type | Pins count |
|---------------|:------------:|:----------:|
| Output audio | uncompressed audio | 1 |

## Enumerate available devices

Use the `DeviceEnumerator.Shared.AudioSourcesAsync()` method to get a list of available devices and their specifications. Select a device and one of its formats to create the source settings.
## The sample pipeline ```mermaid graph LR; SystemAudioSourceBlock-->AudioRendererBlock; ``` ## Sample code ```csharp // create pipeline var pipeline = new MediaBlocksPipeline(true); // create audio source block IAudioCaptureDeviceSourceSettings audioSourceSettings = null; // select first device var device = (await DeviceEnumerator.Shared.AudioSourcesAsync())[0]; if (device != null) { // select first format var formatItem = device.Formats[0]; if (formatItem != null) { audioSourceSettings = device.CreateSourceSettings(formatItem.ToFormat()); } } // create audio source block using selected device and format var audioSource = new SystemAudioSourceBlock(audioSourceSettings); // create audio renderer block var audioRenderer = new AudioRendererBlock(); // connect blocks pipeline.Connect(audioSource.Output, audioRenderer.Input); // start pipeline await pipeline.StartAsync(); ``` ## Capture audio from speakers (loopback) Currently, loopback audio capture is supported only on Windows. Use the `LoopbackAudioCaptureDeviceSourceSettings` class to create the source settings for loopback audio capture. WASAPI2 is used as the default API for loopback audio capture. You can specify the API to use during device enumeration. 
```csharp // create pipeline var pipeline = new MediaBlocksPipeline(true); // create audio source block var deviceItem = (await DeviceEnumerator.Shared.AudioOutputsAsync(AudioOutputDeviceAPI.WASAPI2))[0]; if (deviceItem == null) { return; } var audioSourceSettings = new LoopbackAudioCaptureDeviceSourceSettings(deviceItem); var audioSource = new SystemAudioSourceBlock(audioSourceSettings); // create audio renderer block var audioRenderer = new AudioRendererBlock(); // connect blocks pipeline.Connect(audioSource.Output, audioRenderer.Input); // start pipeline await pipeline.StartAsync(); ``` ### Sample applications - [Audio Capture Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Audio%20Capture%20Demo) - [Simple Capture Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Simple%20Capture%20Demo) ## Remarks You can specify an API to use during the device enumeration. Android and iOS platforms have only one API, while Windows and Linux have multiple APIs. ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Sources\SystemVideoSourceBlock.md --- title: System video source block description: VisioForge Media Blocks SDK .Net - System video source block sidebar_label: System video source --- # System video source SystemVideoSourceBlock is used to access webcams and other video capture devices. ## Block info Name: SystemVideoSourceBlock. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Output video | uncompressed video | 1 | ## Enumerate available devices Use the `DeviceEnumerator.Shared.VideoSourcesAsync()` method to get a list of available devices and their specifications: available resolutions, frame rates, and video formats. 
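The enumeration call can also be used to inspect every available device and format before picking one. A short sketch (the `Name` property on the device item is an assumption for display purposes; `VideoFormats` and `FrameRateList` are used as in the sample below):

```csharp
// List all video capture devices with their formats and frame rates.
var devices = await DeviceEnumerator.Shared.VideoSourcesAsync();
foreach (var device in devices)
{
    // device name (assumed property, for display only)
    Console.WriteLine(device.Name);

    foreach (var format in device.VideoFormats)
    {
        // each format lists its supported frame rates
        Console.WriteLine($"  {format}: {string.Join(", ", format.FrameRateList)} fps");
    }
}
```

This makes it easier to select a specific resolution and frame rate instead of taking the first entry, as the sample below does for brevity.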
## The sample pipeline ```mermaid graph LR; SystemVideoSourceBlock-->VideoRendererBlock; ``` ## Sample code ```csharp // create pipeline var pipeline = new MediaBlocksPipeline(true); // create video source VideoCaptureDeviceSourceSettings videoSourceSettings = null; // select the first device var device = (await DeviceEnumerator.Shared.VideoSourcesAsync())[0]; if (device != null) { // select the first format (maybe not the best, but it is just a sample) var formatItem = device.VideoFormats[0]; if (formatItem != null) { videoSourceSettings = new VideoCaptureDeviceSourceSettings(device) { Format = formatItem.ToFormat() }; // select the first frame rate videoSourceSettings.Format.FrameRate = formatItem.FrameRateList[0]; } } // create video source block using the selected device and format var videoSource = new SystemVideoSourceBlock(videoSourceSettings); // create video renderer block var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // connect blocks pipeline.Connect(videoSource.Output, videoRenderer.Input); // start pipeline await pipeline.StartAsync(); ``` ### Sample applications - [Simple Video Capture Demo (WPF)](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Simple%20Capture%20Demo) ## Remarks You can specify an API to use when enumerating devices. Windows and Linux platforms have multiple APIs, while Android and iOS platforms have only one API. ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Sources\UniversalSourceBlock.md --- title: Universal source block description: VisioForge Media Blocks SDK .Net - Universal source block sidebar_label: Universal source --- # Universal source block A universal source that decodes video and audio files/network streams and provides uncompressed data to the connected blocks. Block supports MP4, WebM, AVI, TS, MKV, MP3, AAC, M4A, and many other formats. 
If the FFMPEG redist is available, all decoders available in FFMPEG will also be supported.

## Block info

Name: UniversalSourceBlock.

| Pin direction | Media type | Pins count |
|---------------|:------------:|:----------:|
| Output audio | Depends on the decoder | one or more |
| Output video | Depends on the decoder | one or more |
| Output subtitle | Depends on the decoder | one or more |

## Sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->VideoRendererBlock;
    UniversalSourceBlock-->AudioRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var fileSource = new UniversalSourceBlock();
fileSource.Filename = "test.mp4";

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(fileSource.VideoOutput, videoRenderer.Input);

var audioRenderer = new AudioRendererBlock();
pipeline.Connect(fileSource.AudioOutput, audioRenderer.Input);

await pipeline.StartAsync();
```

### Sample applications

- [Simple Player Demo (WPF)](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Simple%20Player%20Demo%20WPF)

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Sources\VirtualAudioSourceBlock.md
---
title: Virtual audio source block
description: VisioForge Media Blocks SDK .Net - Virtual audio source block
sidebar_label: Virtual audio source
---

# Virtual audio source

VirtualAudioSourceBlock is used to produce test audio data in a wide variety of audio formats. The type of test data is controlled by the settings.

## Block info

Name: VirtualAudioSourceBlock.
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Output audio | uncompressed audio | 1 | ## The sample pipeline ```mermaid graph LR; VirtualAudioSourceBlock-->AudioRendererBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(true); var audioSourceBlock = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); var videoSourceBlock = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings()); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(videoSourceBlock.Output, videoRenderer.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(audioSourceBlock.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Sources\VirtualVideoSourceBlock.md --- title: Virtual video source block description: VisioForge Media Blocks SDK .Net - Virtual video source block sidebar_label: Virtual video source --- # Virtual video source VirtualVideoSourceBlock is used to produce test video data in a wide variety of video formats. The type of test data is controlled by the settings. ## Block info Name: VirtualVideoSourceBlock. 
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Output video | uncompressed video | 1 | ## The sample pipeline ```mermaid graph LR; VirtualVideoSourceBlock-->VideoRendererBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(true); var audioSourceBlock = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); var videoSourceBlock = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings()); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(videoSourceBlock.Output, videoRenderer.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(audioSourceBlock.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Special\index.md --- title: Special blocks description: VisioForge Media Blocks SDK .Net - Special blocks sidebar_label: Special --- # Special blocks - [Null Renderer](NullRendererBlock.md) - [Tee](TeeBlock.md) ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Special\NullRendererBlock.md --- title: Null renderer block description: VisioForge Media Blocks SDK .Net - Null renderer block sidebar_label: Null renderer --- # Null renderer The null renderer block sends the data to null. This block may be required if your block has outputs you do not want to use. ## Block info Name: NullRendererBlock. 
Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Any | 1

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->NullRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp3";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var nullRenderer = new NullRendererBlock();
pipeline.Connect(fileSource.AudioOutput, nullRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\Special\SuperMediaBlock.md
---
title: Super MediaBlock
description: VisioForge Media Blocks SDK .Net - Super MediaBlock
sidebar_label: Super MediaBlock
---

# Super MediaBlock

The Super MediaBlock combines several blocks into a single composite block with one input and one output, so a chain of blocks can be used in a pipeline as a single unit.

## Block info

Name: SuperMediaBlock.

Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Any | 1
Output | Any | 1

## The sample pipeline

```mermaid
graph LR;
    VirtualVideoSourceBlock-->SuperMediaBlock;
    SuperMediaBlock-->NullRendererBlock;
```

Inside the SuperMediaBlock:

```mermaid
graph LR;
    FishEyeBlock-->ColorEffectsBlock;
```

Final pipeline:

```mermaid
graph LR;
    VirtualVideoSourceBlock-->FishEyeBlock;
    subgraph SuperMediaBlock
    FishEyeBlock-->ColorEffectsBlock;
    end
    ColorEffectsBlock-->NullRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var videoViewBlock = new VideoRendererBlock(pipeline, VideoView1);

var videoSource = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings());

var colorEffectsBlock = new ColorEffectsBlock(VisioForge.Core.Types.X.VideoEffects.ColorEffectsPreset.Sepia);
var fishEyeBlock = new FishEyeBlock();

var superBlock = new SuperMediaBlock();
superBlock.Blocks.Add(fishEyeBlock);
superBlock.Blocks.Add(colorEffectsBlock);
superBlock.Configure(pipeline);
pipeline.Connect(videoSource.Output, superBlock.Input); pipeline.Connect(superBlock.Output, videoViewBlock.Input); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\Special\TeeBlock.md --- title: Tee block description: VisioForge Media Blocks SDK .Net - Tee block sidebar_label: Tee --- # Tee The tee block splits the video or audio data stream into multiple streams that completely copy the original stream. ## Block info Name: TeeBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Any | 1 Output | Same as input | 2 or more ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->TeeBlock; TeeBlock-->VideoRendererBlock; TeeBlock-->H264EncoderBlock; H264EncoderBlock-->MP4SinkBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var videoTee = new TeeBlock(2); var h264Encoder = new H264EncoderBlock(new OpenH264EncoderSettings()); var mp4Muxer = new MP4SinkBlock(new MP4SinkSettings(@"output.mp4")); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(fileSource.VideoOutput, videoTee.Input); pipeline.Connect(videoTee.Outputs[0], videoRenderer.Input); pipeline.Connect(videoTee.Outputs[1], h264Encoder.Input); pipeline.Connect(h264Encoder.Output, mp4Muxer.CreateNewInput(MediaBlockPadMediaType.Video)); await pipeline.StartAsync(); ``` ### Sample applications - [Simple Capture Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Simple%20Capture%20Demo) ## Platforms Windows, macOS, Linux, iOS, Android. 
---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoEncoders\AV1EncoderBlock.md
---
title: AV1 encoder block
description: VisioForge Media Blocks SDK .Net - AV1 encoder block
sidebar_label: AV1 encoder
---

# AV1 encoder

`AV1 (AOMedia Video 1)`: Developed by the Alliance for Open Media, AV1 is an open, royalty-free video coding format designed for video transmissions over the Internet. It is known for its high compression efficiency and better quality at lower bit rates compared to its predecessors, making it well-suited for high-resolution video streaming applications.

Use classes that implement the `IAV1EncoderSettings` interface to set the parameters.

## Settings

### AOMAV1EncoderSettings

AOM AV1 encoder settings. CPU encoder.

**Platforms:** Windows, Linux, macOS.

### QSVAV1EncoderSettings

Intel GPU AV1 video encoder.

**Platforms:** Windows, Linux, macOS.

### RAV1EEncoderSettings

RAV1E AV1 encoder settings.

**Platforms:** Windows, Linux, macOS.

## Block info

Name: AV1EncoderBlock.

Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | AV1 | 1

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->AV1EncoderBlock;
    AV1EncoderBlock-->MP4SinkBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var videoEncoderBlock = new AV1EncoderBlock(new QSVAV1EncoderSettings());
pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input);

var mp4SinkBlock = new MP4SinkBlock(new MP4SinkSettings(@"output.mp4"));
pipeline.Connect(videoEncoderBlock.Output, mp4SinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.
---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoEncoders\DVEncoderBlock.md --- title: DV encoder block description: VisioForge Media Blocks SDK .Net - DV encoder block sidebar_label: DV encoder --- # DV encoder `DV (Digital Video)`: A format for storing digital video introduced in the 1990s, primarily used in consumer digital camcorders. DV employs intra-frame compression to deliver high-quality video on digital tapes, making it suitable for home videos as well as semi-professional productions. ## Block info Name: DVEncoderBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed video | 1 Output | video/x-dv | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->DVEncoderBlock; DVEncoderBlock-->AVISinkBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var videoEncoderBlock = new DVEncoderBlock(new DVVideoEncoderSettings()); pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input); var sinkBlock = new AVISinkBlock(new AVISinkSettings(@"output.avi")); pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video)); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoEncoders\H264EncoderBlock.md --- title: H264 encoder block description: VisioForge Media Blocks SDK .Net - H264 encoder block sidebar_label: H264 encoder --- # H264 encoder The H264 encoder block is used for encoding files in MP4, MKV, and some other formats, as well as for network streaming using RTSP and HLS. Use classes that implement the IH264EncoderSettings interface to set the parameters. ## Settings ### NVENCH264EncoderSettings Nvidia GPUs H264 video encoder. **Platforms:** Windows, Linux, macOS. 
### AMFH264EncoderSettings

AMD/ATI GPUs H264 video encoder.

**Platforms:** Windows, Linux, macOS.

### QSVH264EncoderSettings

Intel GPU H264 video encoder.

**Platforms:** Windows, Linux, macOS.

### OpenH264EncoderSettings

Software CPU H264 encoder.

**Platforms:** Windows, macOS, Linux, iOS, Android.

## Block info

Name: H264EncoderBlock.

Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | H264 | 1

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->H264EncoderBlock;
    H264EncoderBlock-->MP4SinkBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var h264EncoderBlock = new H264EncoderBlock(new NVENCH264EncoderSettings());
pipeline.Connect(fileSource.VideoOutput, h264EncoderBlock.Input);

var mp4SinkBlock = new MP4SinkBlock(new MP4SinkSettings(@"output.mp4"));
pipeline.Connect(h264EncoderBlock.Output, mp4SinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();
```

### Sample applications

- [Simple Capture Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Simple%20Capture%20Demo)
- [Screen Capture Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Screen%20Capture)

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoEncoders\HEVCEncoderBlock.md
---
title: HEVC/H265 encoder block
description: VisioForge Media Blocks SDK .Net - HEVC/H265 encoder block
sidebar_label: HEVC/H265 encoder
---

# HEVC/H265 encoder

The HEVC encoder is used for encoding files in MP4, MKV, and some other formats, as well as for network streaming using RTSP and HLS.

Use classes that implement the `IHEVCEncoderSettings` interface to set the parameters.
## Settings ### MFHEVCEncoderSettings Microsoft Media Foundation HEVC encoder. CPU encoder. **Platforms:** Windows. ### NVENCHEVCEncoderSettings Nvidia GPUs HEVC video encoder. **Platforms:** Windows, Linux, macOS. ### AMFHEVCEncoderSettings AMD/ATI GPUs HEVC video encoder. **Platforms:** Windows, Linux, macOS. ### QSVHEVCEncoderSettings Intel GPU HEVC video encoder. **Platforms:** Windows, Linux, macOS. ## Block info Name: HEVCEncoderBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed video | 1 Output | HEVC | 1 ## The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->HEVCEncoderBlock; HEVCEncoderBlock-->MP4SinkBlock; ``` ## Sample code ```csharp var pipeline = new MediaBlocksPipeline(false); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var hevcEncoderBlock = new HEVCEncoderBlock(new NVENCHEVCEncoderSettings()); pipeline.Connect(fileSource.VideoOutput, hevcEncoderBlock.Input); var mp4SinkBlock = new MP4SinkBlock(new MP4SinkSettings(@"output.mp4")); pipeline.Connect(hevcEncoderBlock.Output, mp4SinkBlock.CreateNewInput(MediaBlockPadMediaType.Video)); await pipeline.StartAsync(); ``` ## Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoEncoders\index.md --- title: Video encoder blocks description: VisioForge Media Blocks SDK .Net - Video encoder blocks sidebar_label: Video encoders --- # Video encoding Video encoding is the process of converting raw video data into a compressed format. This process is essential for reducing the size of video files, making them easier to store and stream over the internet. VisioForge Media Blocks SDK provides a wide range of video encoders that support various formats and codecs. For some video encoders, SDK can use GPU acceleration to speed up the encoding process. 
This feature is especially useful when working with high-resolution video files or when encoding multiple videos simultaneously. Nvidia, Intel, and AMD GPUs are supported for hardware acceleration.

## Video encoder blocks

- [AV1 encoder](AV1EncoderBlock.md)
- [DNxHD encoder](DNxHDEncoderBlock.md)
- [DV encoder](DVEncoderBlock.md)
- [H264 encoder](H264EncoderBlock.md)
- [H265/HEVC encoder](HEVCEncoderBlock.md)
- [MJPEG encoder](MJPEGEncoderBlock.md)
- [Theora encoder](TheoraEncoderBlock.md)
- [VP8/VP9 encoder](VPXEncoderBlock.md)

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoEncoders\MJPEGEncoderBlock.md

---
title: MJPEG encoder block
description: VisioForge Media Blocks SDK .Net - MJPEG encoder block
sidebar_label: MJPEG encoder
---

# MJPEG encoder

`MJPEG (Motion JPEG)`: a video compression format in which each video frame is separately compressed into a JPEG image. This technique is straightforward and uses no interframe compression, making it ideal for situations where frame-specific editing or access is required, such as surveillance and medical imaging.

Use classes that implement the IMJPEGEncoderSettings interface to set the parameters.

## Settings

### MJPEGEncoderSettings

Default MJPEG encoder. CPU encoder.

**Platforms:** Windows, Linux, macOS, iOS, Android.

### QSVMJPEGEncoderSettings

Intel GPU MJPEG encoder.

**Platforms:** Windows, Linux, macOS.

## Block info

Name: MJPEGEncoderBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | MJPEG | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->MJPEGEncoderBlock;
    MJPEGEncoderBlock-->AVISinkBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var videoEncoderBlock = new MJPEGEncoderBlock(new MJPEGEncoderSettings());
pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input);

var aviSinkBlock = new AVISinkBlock(new AVISinkSettings(@"output.avi"));
pipeline.Connect(videoEncoderBlock.Output, aviSinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoEncoders\TheoraEncoderBlock.md

---
title: Theora encoder block
description: VisioForge Media Blocks SDK .Net - Theora encoder block
sidebar_label: Theora encoder
---

# Theora encoder

The [Theora](https://www.theora.org/) encoder is used to encode video files in WebM format.

## Block info

Name: TheoraEncoderBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | video/x-theora | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->TheoraEncoderBlock;
    TheoraEncoderBlock-->WebMSinkBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var theoraEncoderBlock = new TheoraEncoderBlock(new TheoraEncoderSettings());
pipeline.Connect(fileSource.VideoOutput, theoraEncoderBlock.Input);

var webmSinkBlock = new WebMSinkBlock(new WebMSinkSettings(@"output.webm"));
pipeline.Connect(theoraEncoderBlock.Output, webmSinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoEncoders\VPXEncoderBlock.md

---
title: VP8/VP9 encoder block
description: VisioForge Media Blocks SDK .Net - VP8/VP9 encoder block
sidebar_label: VP8/VP9 encoder
---

# VPX encoder

The VPX encoder block is used for encoding files in WebM, MKV, or OGG format. VPX is a set of video codecs for encoding in VP8 and VP9 formats.

Use classes that implement the IVPXEncoderSettings interface to set the parameters.

## Settings

### VP8EncoderSettings

VP8 CPU encoder.

### VP9EncoderSettings

VP9 CPU encoder.

## Block info

Name: VPXEncoderBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | VP8/VP9 | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->VPXEncoderBlock;
    VPXEncoderBlock-->WebMSinkBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var vp8EncoderBlock = new VPXEncoderBlock(new VP8EncoderSettings());
pipeline.Connect(fileSource.VideoOutput, vp8EncoderBlock.Input);

var webmSinkBlock = new WebMSinkBlock(new WebMSinkSettings(@"output.webm"));
pipeline.Connect(vp8EncoderBlock.Output, webmSinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoEncoders\WMVEncoderBlock.md

---
title: WMV encoder block
description: VisioForge Media Blocks SDK .Net - WMV encoder block
sidebar_label: WMV encoder
---

# WMV encoder

## Overview

The WMV encoder block encodes video in WMV format.

## Block info

Name: WMVEncoderBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | video/x-wmv | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->WMVEncoderBlock;
    WMVEncoderBlock-->ASFSinkBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var wmvEncoderBlock = new WMVEncoderBlock(new WMVEncoderSettings());
pipeline.Connect(fileSource.VideoOutput, wmvEncoderBlock.Input);

var asfSinkBlock = new ASFSinkBlock(new ASFSinkSettings(@"output.wmv"));
pipeline.Connect(wmvEncoderBlock.Output, asfSinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux.
---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\ColorEffectsBlock.md

---
title: Color effects block
description: VisioForge Media Blocks SDK .Net - Color effects block
sidebar_label: Color effects
---

# Color effects

The block performs basic video frame color processing: fake heat camera toning, sepia toning, invert and slightly shade to blue, cross processing toning, and a yellow foreground/blue background color filter.

## Block info

Name: ColorEffectsBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->ColorEffectsBlock;
    ColorEffectsBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

// Sepia
var colorEffects = new ColorEffectsBlock(ColorEffectsPreset.Sepia);
pipeline.Connect(fileSource.VideoOutput, colorEffects.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(colorEffects.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\DeinterlaceBlock.md

---
title: Deinterlace block
description: VisioForge Media Blocks SDK .Net - Deinterlace block
sidebar_label: Deinterlace
---

# Deinterlace

The block deinterlaces interlaced video frames into progressive video frames. Several processing methods are available. Use the DeinterlaceSettings class to configure the block.

## Block info

Name: DeinterlaceBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->DeinterlaceBlock;
    DeinterlaceBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var deinterlace = new DeinterlaceBlock(new DeinterlaceSettings());
pipeline.Connect(fileSource.VideoOutput, deinterlace.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(deinterlace.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\FishEyeBlock.md

---
title: Fish eye block
description: VisioForge Media Blocks SDK .Net - Fish eye block
sidebar_label: Fish eye
---

# Fish eye

The fisheye block simulates a fisheye lens by zooming in on the center of the image and compressing the edges.

## Block info

Name: FishEyeBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->FishEyeBlock;
    FishEyeBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var fishEye = new FishEyeBlock();
pipeline.Connect(fileSource.VideoOutput, fishEye.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(fishEye.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.
---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\FlipRotateBlock.md

---
title: Flip/Rotate block
description: VisioForge Media Blocks SDK .Net - Flip/Rotate block
sidebar_label: Flip/Rotate
---

# Flip/Rotate

The block flips and rotates the video stream. Use the VideoFlipRotateMethod enumeration to configure.

## Block info

Name: FlipRotateBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->FlipRotateBlock;
    FlipRotateBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

// 90 degree rotation
var flipRotate = new FlipRotateBlock(VideoFlipRotateMethod.Method90R);
pipeline.Connect(fileSource.VideoOutput, flipRotate.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(flipRotate.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\GammaBlock.md

---
title: Gamma block
description: VisioForge Media Blocks SDK .Net - Gamma block
sidebar_label: Gamma
---

# Gamma

The block performs gamma correction on a video stream.

## Block info

Name: GammaBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->GammaBlock;
    GammaBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var gamma = new GammaBlock(2.0);
pipeline.Connect(fileSource.VideoOutput, gamma.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(gamma.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\GaussianBlurBlock.md

---
title: Gaussian blur block
description: VisioForge Media Blocks SDK .Net - Gaussian blur block
sidebar_label: Gaussian blur
---

# Gaussian blur

The block blurs the video stream using the Gaussian function.

## Block info

Name: GaussianBlurBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->GaussianBlurBlock;
    GaussianBlurBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var gaussianBlur = new GaussianBlurBlock();
pipeline.Connect(fileSource.VideoOutput, gaussianBlur.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(gaussianBlur.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.
---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\GrayscaleBlock.md

---
title: Grayscale block
description: VisioForge Media Blocks SDK .Net - Grayscale block
sidebar_label: Grayscale
---

# Grayscale

The block processes the video stream and makes it black and white.

## Block info

Name: GrayscaleBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->GrayscaleBlock;
    GrayscaleBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var grayscale = new GrayscaleBlock();
pipeline.Connect(fileSource.VideoOutput, grayscale.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(grayscale.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\ImageOverlayBlock.md

---
title: Image overlay block
description: VisioForge Media Blocks SDK .Net - Image overlay block
sidebar_label: Image overlay
---

# Image overlay

The block overlays an image loaded from a file onto a video stream. You can set the image position and an optional alpha value. 32-bit images with an alpha channel are supported.

## Block info

Name: ImageOverlayBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->ImageOverlayBlock;
    ImageOverlayBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var imageOverlay = new ImageOverlayBlock(@"logo.png");
pipeline.Connect(fileSource.VideoOutput, imageOverlay.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(imageOverlay.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\index.md

---
title: Video processing blocks
description: VisioForge Media Blocks SDK .Net - Video processing blocks
sidebar_label: Video processing and effects
---

# Video processing blocks

- [Color effects](ColorEffectsBlock.md)
- [Deinterlace](DeinterlaceBlock.md)
- [Fish eye](FishEyeBlock.md)
- [Flip/Rotate](FlipRotateBlock.md)
- [Gamma](GammaBlock.md)
- [Gaussian blur](GaussianBlurBlock.md)
- [Image overlay](ImageOverlayBlock.md)
- [Mirror](MirrorBlock.md)
- [Perspective](PerspectiveBlock.md)
- [Pinch](PinchBlock.md)
- [Resize](VideoResizeBlock.md)
- [Rotate](RotateBlock.md)
- [Video sample grabber](VideoSampleGrabberBlock.md)
- [Sphere](SphereBlock.md)
- [Square](SquareBlock.md)
- [Stretch](StretchBlock.md)
- [Text overlay](TextOverlayBlock.md)
- [Tunnel](TunnelBlock.md)
- [Twirl](TwirlBlock.md)
- [Video balance](VideoBalanceBlock.md)
- [Video mixer](VideoMixerBlock.md)
- [Water ripple](WaterRippleBlock.md)

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\MirrorBlock.md

---
title: Mirror block
description: VisioForge Media Blocks SDK .Net - Mirror block
sidebar_label: Mirror
---

# Mirror

The mirror block splits the image into two halves and reflects one over the other.

## Block info

Name: MirrorBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->MirrorBlock;
    MirrorBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var mirrorBlock = new MirrorBlock(MirrorMode.Top);
pipeline.Connect(fileSource.VideoOutput, mirrorBlock.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(mirrorBlock.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\PerspectiveBlock.md

---
title: Perspective block
description: VisioForge Media Blocks SDK .Net - Perspective block
sidebar_label: Perspective
---

# Perspective

The perspective block applies a 2D perspective transform.

## Block info

Name: PerspectiveBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->PerspectiveBlock;
    PerspectiveBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

// 3x3 perspective transform matrix, in row-major order
var persBlock = new PerspectiveBlock(new int[] { 1, 2, 3, 4, 5, 6, 7, 8, 9 });
pipeline.Connect(fileSource.VideoOutput, persBlock.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(persBlock.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\PinchBlock.md

---
title: Pinch block
description: VisioForge Media Blocks SDK .Net - Pinch block
sidebar_label: Pinch
---

# Pinch

The block performs the pinch geometric transform of the image.

## Block info

Name: PinchBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->PinchBlock;
    PinchBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var pinchBlock = new PinchBlock();
pipeline.Connect(fileSource.VideoOutput, pinchBlock.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(pinchBlock.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.
---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\RotateBlock.md

---
title: Rotate block
description: VisioForge Media Blocks SDK .Net - Rotate block
sidebar_label: Rotate
---

# Rotate

The block rotates the image by a specified angle.

## Block info

Name: RotateBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->RotateBlock;
    RotateBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var rotateBlock = new RotateBlock(0.7);
pipeline.Connect(fileSource.VideoOutput, rotateBlock.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(rotateBlock.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\SphereBlock.md

---
title: Sphere block
description: VisioForge Media Blocks SDK .Net - Sphere block
sidebar_label: Sphere
---

# Sphere

The sphere block applies a sphere geometric transform to the video.

## Block info

Name: SphereBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->SphereBlock;
    SphereBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var sphereBlock = new SphereBlock();
pipeline.Connect(fileSource.VideoOutput, sphereBlock.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(sphereBlock.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\SquareBlock.md

---
title: Square block
description: VisioForge Media Blocks SDK .Net - Square block
sidebar_label: Square
---

# Square

The square block distorts the center part of the video into a square.

## Block info

Name: SquareBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->SquareBlock;
    SquareBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var squareBlock = new SquareBlock(new SquareVideoEffect());
pipeline.Connect(fileSource.VideoOutput, squareBlock.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(squareBlock.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.
---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\StretchBlock.md

---
title: Stretch block
description: VisioForge Media Blocks SDK .Net - Stretch block
sidebar_label: Stretch
---

# Stretch

The stretch block stretches the video in a circle around the center point.

## Block info

Name: StretchBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->StretchBlock;
    StretchBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var stretchBlock = new StretchBlock();
pipeline.Connect(fileSource.VideoOutput, stretchBlock.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(stretchBlock.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\TextOverlayBlock.md

---
title: Text overlay block
description: VisioForge Media Blocks SDK .Net - Text overlay block
sidebar_label: Text overlay
---

# Text overlay

The block adds a text overlay on top of the video stream.

## Block info

Name: TextOverlayBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->TextOverlayBlock;
    TextOverlayBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var textOverlay = new TextOverlayBlock(new TextOverlaySettings("Hello world!"));
pipeline.Connect(fileSource.VideoOutput, textOverlay.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(textOverlay.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\TunnelBlock.md

---
title: Tunnel block
description: VisioForge Media Blocks SDK .Net - Tunnel block
sidebar_label: Tunnel
---

# Tunnel

The block applies a light tunnel effect to a video stream.

## Block info

Name: TunnelBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->TunnelBlock;
    TunnelBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var tunnelBlock = new TunnelBlock();
pipeline.Connect(fileSource.VideoOutput, tunnelBlock.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(tunnelBlock.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.
---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\TwirlBlock.md

---
title: Twirl block
description: VisioForge Media Blocks SDK .Net - Twirl block
sidebar_label: Twirl
---

# Twirl

The twirl block twists the video frame from the center out.

## Block info

Name: TwirlBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->TwirlBlock;
    TwirlBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var twirlBlock = new TwirlBlock();
pipeline.Connect(fileSource.VideoOutput, twirlBlock.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(twirlBlock.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\VideoBalanceBlock.md

---
title: Video balance block
description: VisioForge Media Blocks SDK .Net - Video balance block
sidebar_label: Video balance
---

# Video balance

The block processes the video stream and allows you to change brightness, contrast, hue, and saturation. Use the VideoBalanceVideoEffect class to configure the block settings.

## Block info

Name: VideoBalanceBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->VideoBalanceBlock;
    VideoBalanceBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var videoBalance = new VideoBalanceBlock(new VideoBalanceVideoEffect() { Brightness = 0.25 });
pipeline.Connect(fileSource.VideoOutput, videoBalance.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(videoBalance.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\VideoMixerBlock.md

---
title: Video mixer block
description: VisioForge Media Blocks SDK .Net - Video mixer block
sidebar_label: Video mixer
---

# Video mixer

The video mixer block has several inputs and one output. The block draws the inputs in the selected order at the selected positions. You can also set the desired level of transparency for each stream.

## Block info

Name: VideoMixerBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 or more |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock#1-->VideoMixerBlock;
    UniversalSourceBlock#2-->VideoMixerBlock;
    VideoMixerBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename1 = "test.mp4";
var fileSource1 = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename1)));

var filename2 = "test2.mp4";
var fileSource2 = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename2)));

var mixerSettings = new VideoMixerSettings();
mixerSettings.AddStream(new VideoMixerStream(new Rect(0, 0, 1280, 720), 0));
mixerSettings.AddStream(new VideoMixerStream(new Rect(100, 100, 420, 340), 1));

var videoMixer = new VideoMixerBlock(mixerSettings);
pipeline.Connect(fileSource1.VideoOutput, videoMixer.Inputs[0]);
pipeline.Connect(fileSource2.VideoOutput, videoMixer.Inputs[1]);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(videoMixer.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\VideoResizeBlock.md

---
title: Video resize block
description: VisioForge Media Blocks SDK .Net - Video resize block
sidebar_label: Video resize
---

# Video resize

The block resizes the video stream. You can configure the resize method, the letterbox flag, and many other options. Use the ResizeVideoEffect class to configure.

## Block info

Name: VideoResizeBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->VideoResizeBlock;
    VideoResizeBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var videoResize = new VideoResizeBlock(new ResizeVideoEffect(1280, 720) { Letterbox = false });
pipeline.Connect(fileSource.VideoOutput, videoResize.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(videoResize.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\VideoSampleGrabberBlock.md

---
title: Video sample grabber block
description: VisioForge Media Blocks SDK .Net - Video sample grabber block
sidebar_label: Video sample grabber
---

# Video sample grabber

The video sample grabber calls an event for each video frame. You can save or process the received video frame.

## Block info

Name: VideoSampleGrabberBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->VideoSampleGrabberBlock;
    VideoSampleGrabberBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var videoSG = new VideoSampleGrabberBlock();
videoSG.OnVideoFrameBuffer += VideoSG_OnVideoFrameBuffer;
pipeline.Connect(fileSource.VideoOutput, videoSG.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(videoSG.Output, videoRenderer.Input);

await pipeline.StartAsync();

// Event handler, called once for every decoded video frame.
private void VideoSG_OnVideoFrameBuffer(object sender, VideoFrameBufferEventArgs e)
{
    // save or process the video frame
}
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoProcessing\WaterRippleBlock.md

---
title: Water ripple block
description: VisioForge Media Blocks SDK .Net - Water ripple block
sidebar_label: Water ripple
---

# Water ripple

The water ripple block creates a water ripple effect on the video stream. Use the `WaterRippleVideoEffect` class to configure it.

## Block info

Name: WaterRippleBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 |
| Output | Uncompressed video | 1 |

## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->WaterRippleBlock;
    WaterRippleBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var wrBlock = new WaterRippleBlock(new WaterRippleVideoEffect());
pipeline.Connect(fileSource.VideoOutput, wrBlock.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(wrBlock.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoRendering\index.md

---
title: Video rendering blocks
description: VisioForge Media Blocks SDK .Net - Video rendering blocks
sidebar_label: Video rendering
---

# Video rendering

- [Video renderer block](VideoRendererBlock.md)

---END OF PAGE---

# Local File: .\codebase\VisioForge.Core\MediaBlocks\VideoRendering\VideoRendererBlock.md

---
title: Video renderer block
description: VisioForge Media Blocks SDK .Net - Video renderer block
sidebar_label: Video renderer
---

# Video renderer

The Video Renderer block displays the video stream in the associated `VideoView` control.

## Block info

Name: VideoRendererBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Uncompressed video | 1 or more |

## Video view

A special platform-specific visual control, `VideoView`, is used for rendering. On Windows it uses DirectX; on most other platforms, the SDK uses OpenGL.
## The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->VideoRendererBlock;
```

## Sample code

```csharp
var pipeline = new MediaBlocksPipeline(false);

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(fileSource.VideoOutput, videoRenderer.Input);

await pipeline.StartAsync();
```

## Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\Android\MediaPlayer\readme.es.md

# Media Blocks SDK .Net - Android Simple Player Demo

Explore the capabilities of VisioForge's Media Blocks technology with our Android SDK sample, designed to demonstrate advanced media playback features. This sample app showcases seamless integration of video and audio playback, dynamic source management, and real-time user interaction controls. Developers can easily navigate through code examples for picking media files, handling playback controls, and implementing event-based notifications for a complete understanding of the SDK's potential. Perfect for those looking to enhance their Android applications with robust media processing capabilities.

## Features

- Play media files
- Play network streams
- Seeking

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\Android\MediaPlayer\readme.md

# Media Blocks SDK .Net - Android Simple Player Demo

Explore the capabilities of VisioForge's Media Blocks technology with our Android SDK sample, designed to demonstrate advanced media playback features.
This sample app showcases seamless integration of video and audio rendering, dynamic source management, and real-time user interaction controls. Developers can easily navigate through code examples for picking media files, handling playback controls, and implementing event-based notifications for a comprehensive understanding of the SDK's potential. Perfect for those looking to enhance their Android applications with robust media processing capabilities.

## Features

- Play media files
- Play network streams
- Seeking

## Used blocks

- [UniversalSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/UniversalSourceBlock/) - decodes media files
- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\Android\RTSP Client\readme.es.md

# Media Blocks SDK .Net - RTSP Client Demo

This SDK sample demonstrates how to build an RTSP client for Android using the VisioForge Media Blocks SDK. The application showcases the ability to stream video from an RTSP URL, providing features to start, pause, and stop playback. It integrates VisioForge's Media Blocks for video rendering and RTSP source handling within an Android activity, using a user interface that includes buttons for control and text fields for URL, login, and password input. Additionally, it handles permissions for camera, internet, and audio recording, ensuring the app has the necessary access for its operations.
## Features

- Play RTSP streams

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\Android\RTSP Client\readme.md

# Media Blocks SDK .Net - RTSP Client Demo

This SDK sample demonstrates how to build an RTSP client for Android using the VisioForge Media Blocks SDK. The application showcases the ability to stream video from an RTSP URL, providing features to start, pause, and stop playback. It integrates VisioForge's Media Blocks for video rendering and RTSP source handling within an Android activity, using a user interface that includes buttons for control and text fields for URL, login, and password input. Additionally, it handles permissions for camera, internet, and audio recording, ensuring the app has the necessary access for its operations.

## Features

- Play RTSP streams

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [RTSPSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/RTSPSourceBlock/) - captures video from an RTSP stream

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\Android\Simple Video Capture\readme.es.md

# Media Blocks SDK .Net - Simple Video Capture Demo

This SDK sample demonstrates how to create a simple video capture application on Android using the VisioForge Media Blocks SDK. It includes the setup of video and audio sources from the device's camera and microphone, respectively, and the rendering of live video previews.
Additionally, it showcases the capability to start and stop recording, switch between different cameras, and handle permissions dynamically. The application leverages several blocks from the VisioForge framework, such as video and audio encoders, a multiplexer for creating MP4 files, and specialized media blocks for audio and video processing, rendering, and source management.

## Features

- Preview camera video
- Capture video and audio to an MP4 file

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\Android\Simple Video Capture\readme.md

# Media Blocks SDK .Net - Simple Video Capture Demo

This SDK sample demonstrates how to create a simple video capture application on Android using the VisioForge Media Blocks SDK. It includes the setup of video and audio sources from the device's camera and microphone, respectively, and the rendering of live video previews. Additionally, it showcases the capability to start and stop recording, switch between different cameras, and handle permissions dynamically. The application leverages various blocks from the VisioForge framework, such as encoders for video and audio, a multiplexer for creating MP4 files, and specialized media blocks for audio and video processing, rendering, and source management.
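The capture path described above can be sketched with blocks covered elsewhere in these docs. This is a minimal illustration, not code from the demo: the settings objects (`videoSourceSettings`, `audioSourceSettings`, `MP4SinkSettings`) and the sink's `Inputs[...]` pads are assumptions, and real constructor arguments differ per platform.

```csharp
var pipeline = new MediaBlocksPipeline(false);

// Camera and microphone sources; the settings objects are placeholders.
var videoSource = new SystemVideoSourceBlock(videoSourceSettings);
var audioSource = new SystemAudioSourceBlock(audioSourceSettings);

// Split the camera stream: one branch for on-screen preview, one for recording.
var videoTee = new TeeBlock(2);
pipeline.Connect(videoSource.Output, videoTee.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(videoTee.Outputs[0], videoRenderer.Input);

// Recording branch: H264 video + AAC audio muxed into an MP4 file.
var h264Encoder = new H264EncoderBlock();
var aacEncoder = new AACEncoderBlock();
var mp4Sink = new MP4SinkBlock(new MP4SinkSettings("output.mp4"));

pipeline.Connect(videoTee.Outputs[1], h264Encoder.Input);
pipeline.Connect(audioSource.Output, aacEncoder.Input);
pipeline.Connect(h264Encoder.Output, mp4Sink.Inputs[0]);
pipeline.Connect(aacEncoder.Output, mp4Sink.Inputs[1]);

await pipeline.StartAsync();
```

Stopping a recording amounts to stopping (or rebuilding) the pipeline; the demo additionally requests camera and microphone permissions before starting.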
## Features

- Preview camera video
- Capture video and audio to MP4 file

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [SystemVideoSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemVideoSourceBlock/) - captures video from the webcam
- [SystemAudioSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemAudioSourceBlock/) - captures audio from the system
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- [TeeBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Special/TeeBlock/) - splits the media stream into two paths
- [H264EncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoEncoders/H264EncoderBlock/) - encodes the video stream using H264
- [AACEncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioEncoders/AACEncoderBlock/) - encodes the audio stream using AAC
- [MP4SinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/MP4SinkBlock/) - saves video to an MP4 file

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\Avalonia\Simple Player\readme.es.md

# Media Blocks SDK .Net - Avalonia Simple Player Demo

The provided code snippet outlines the implementation of a simple media player using the Avalonia UI framework and VisioForge's media handling capabilities. This application demonstrates the initialization, configuration, and control of media playback, including video and audio streams, through a simple user interface.
It showcases key features such as selecting a media file, playing, pausing, resuming, and stopping it, as well as adjusting volume and playback speed. The application leverages the VisioForge Media Blocks SDK for media operations, including audio and video rendering, media source handling, and playback timeline management, providing a complete example of integrating complex media functionality into an Avalonia-based application.

## Features

- Play media files
- Play network streams
- Seeking

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\Avalonia\Simple Player\readme.md

# Media Blocks SDK .Net - Avalonia Simple Player Demo

The provided code snippet outlines the implementation of a simple media player using the Avalonia UI framework and VisioForge's media handling capabilities. This application demonstrates the initialization, configuration, and control of media playback, including video and audio streams, through a user-friendly interface. It showcases key functionalities such as selecting a media file, playing, pausing, resuming, and stopping the media, as well as adjusting volume and playback speed. The application leverages the VisioForge Media Blocks SDK for media operations, including rendering audio and video, handling media sources, and managing playback timelines, providing a comprehensive example of integrating complex media functionalities within an Avalonia-based application.
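At its core, the player wires three documented blocks together. The sketch below follows the sample-code pattern used in the block reference pages of this documentation; the parameterless `AudioRendererBlock` constructor, the `AudioOutput` pad, and the pause/resume/stop method names are assumptions.

```csharp
var pipeline = new MediaBlocksPipeline(false);

// Decode the selected file or network URL into video and audio streams.
var source = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri("test.mp4")));

// Render video into the Avalonia VideoView control.
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(source.VideoOutput, videoRenderer.Input);

// Render audio on the default output device.
var audioRenderer = new AudioRendererBlock();
pipeline.Connect(source.AudioOutput, audioRenderer.Input);

await pipeline.StartAsync();

// Transport controls map to pipeline calls (method names assumed).
await pipeline.PauseAsync();
await pipeline.ResumeAsync();
await pipeline.StopAsync();
```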
## Features

- Play media files
- Play network streams
- Seeking

## Used blocks

- [UniversalSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/UniversalSourceBlock/) - decodes media files
- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\Avalonia\Simple Video Capture\readme.es.md

# VisioForge Video Capture SDK .Net

## Simple Video Capture Demo Avalonia (C#/AvaloniaUI)

This SDK sample demonstrates the integration of the VisioForge Video Capture SDK .Net with an Avalonia-based GUI application for capturing video and audio streams. It showcases the setup and configuration of video and audio input devices, the selection of input formats and frame rates, and the management of device events. The application also features real-time video effects, audio volume adjustment, recording controls (start, pause, resume, stop), and snapshot functionality. It leverages the VisioForge SDK's capabilities for video capture, processing, and rendering within a cross-platform Avalonia UI framework, providing a comprehensive example for developers looking to implement media capture and processing functionality in their .NET applications.
## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\Avalonia\Simple Video Capture\readme.md

# VisioForge Video Capture SDK .Net

## Simple Video Capture Demo Avalonia (C#/AvaloniaUI)

This SDK sample demonstrates the integration of the VisioForge Video Capture SDK .Net with an Avalonia-based GUI application for capturing video and audio streams. It showcases the setup and configuration of video and audio input devices, the selection of input formats and frame rates, and the management of device events. The application also features real-time video effects, audio volume adjustment, recording controls (start, pause, resume, stop), and snapshot functionality. It leverages the VisioForge SDK's capabilities for video capture, processing, and rendering within a cross-platform Avalonia UI framework, providing a comprehensive example for developers looking to implement media capture and processing functionalities in their .NET applications.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\Console\HLS Streamer\readme.es.md

# Media Blocks SDK .Net - HLS Streaming Demo

The HLS Streamer app, built with VisioForge's Media Blocks SDK, showcases a straightforward way to stream video and audio content via the HLS (HTTP Live Streaming) protocol. It uses H264 and AAC encoders for video and audio compression to create a virtual streaming pipeline that outputs the media to an HLS sink.
The application serves the streamed content on a local HTTP server accessible at http://localhost:8088/, demonstrating the integration of video and audio sources, encoders, and streaming output within a .NET environment. This example is ideal for developers looking to implement HLS streaming in their applications, providing a template for source-to-sink media processing and streaming.

## Features

- HLS streaming

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\Console\HLS Streamer\readme.md

# Media Blocks SDK .Net - HLS Streaming Demo

The HLS Streamer app, built using VisioForge's Media Blocks SDK, showcases a straightforward way to stream video and audio content via the HLS (HTTP Live Streaming) protocol. It uses H264 and AAC encoders for video and audio compression to create a virtual streaming pipeline that outputs the media to an HLS sink. The application serves the streamed content on a local HTTP server accessible through http://localhost:8088/, demonstrating the integration of video and audio sources, encoders, and streaming output within a .NET environment. This example is ideal for developers looking to implement HLS streaming in their applications, providing a template for source-to-sink media processing and streaming.
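The source-to-sink chain described above can be sketched as follows. This is an illustration only: the settings classes (`VirtualVideoSourceSettings`, `VirtualAudioSourceSettings`, `HLSSinkSettings`) and the sink's `Inputs[...]` pads are assumptions; consult the HLSSinkBlock reference page for the real configuration, including the HTTP server port.

```csharp
var pipeline = new MediaBlocksPipeline(false);

// Generated test video and audio, stand-ins for real capture sources.
var videoSource = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings());
var audioSource = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings());

// HLS requires compressed streams: H264 for video, AAC for audio.
var h264Encoder = new H264EncoderBlock();
var aacEncoder = new AACEncoderBlock();
pipeline.Connect(videoSource.Output, h264Encoder.Input);
pipeline.Connect(audioSource.Output, aacEncoder.Input);

// The HLS sink segments the streams and serves the playlist over HTTP.
var hlsSink = new HLSSinkBlock(new HLSSinkSettings());
pipeline.Connect(h264Encoder.Output, hlsSink.Inputs[0]);
pipeline.Connect(aacEncoder.Output, hlsSink.Inputs[1]);

await pipeline.StartAsync();
```

Once the pipeline is running, the playlist is reachable at the local HTTP address mentioned above.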
## Features

- HLS streaming

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- [HLSSinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/HLSSinkBlock/) - streams video using the HLS protocol
- [H264EncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoEncoders/H264EncoderBlock/) - encodes a video stream using H264
- [AACEncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioEncoders/AACEncoderBlock/) - encodes an audio stream using AAC
- [VirtualVideoSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/VirtualVideoSourceBlock/) - generates a video stream
- [VirtualAudioSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/VirtualAudioSourceBlock/) - generates an audio stream

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\Console\Media Info CLI\readme.es.md

# Media Blocks SDK .Net - Media Info CLI demo

The Media Info CLI SDK sample demonstrates a simple console application that uses the VisioForge MediaInfoReaderX library to read and display media file information. The program expects a file path as an input argument and prints the media file's details to the console. If no argument is provided, it prompts the user to specify an input file. The application demonstrates basic error handling by notifying the user when the specified media file cannot be read.
This example serves as a quick-start guide for developers looking to integrate media file analysis features into their .NET applications.

## Features

- Get information about media files
- Get information about network streams

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\Console\Media Info CLI\readme.md

# Media Blocks SDK .Net - Media Info CLI demo

The Media Info CLI SDK sample demonstrates a simple console application that uses the VisioForge MediaInfoReaderX library to read and display media file information. The program expects a file path as an input argument and prints the media file's details to the console. If no argument is provided, it prompts the user to specify an input file. The application showcases basic error handling by notifying the user when the specified media file cannot be read. This example serves as a quick start guide for developers looking to integrate media file analysis functionalities into their .NET applications.

## Features

- Get information about media files
- Get information about network streams

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\Console\RTSPView\readme.es.md

# Media Blocks SDK .Net - RTSP View CLI demo

This SDK sample demonstrates how to build a simple RTSP viewer using VisioForge's Media Blocks API in C#. It initializes a media pipeline, sets up an RTSP source with user authentication, and renders the video stream. The program accepts three command-line arguments for the RTSP stream URL, username, and password.
Audio support is optional and can be enabled by uncommenting the relevant sections. The application demonstrates error handling and clean resource management, with a simple interface for stopping the stream and disposing of the pipeline.

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\Console\RTSPView\readme.md

# Media Blocks SDK .Net - RTSP View CLI demo

This SDK sample demonstrates how to build a simple RTSP viewer using VisioForge's Media Blocks API in C#. It initializes a media pipeline, sets up an RTSP source with user authentication, and renders the video stream. The program accepts three command-line arguments for the RTSP stream URL, username, and password. Audio support is optional and can be enabled by uncommenting the relevant sections. The application showcases error handling and clean resource management with a straightforward interface for stopping the stream and disposing of the pipeline.

## Used blocks

- [RTSPSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/RTSPSourceBlock/) - captures video from an RTSP source
- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\Console\RTSPViewCV\readme.es.md

# Media Blocks SDK .Net - RTSP View CV CLI demo

This SDK sample demonstrates how to create a real-time face detection application using VisioForge's media blocks.
The program initializes a face detector and configures a media pipeline to process video from an RTSP stream. It shows how to set up an RTSP source, a video renderer, and a sample grabber block to capture video frames for face detection. Users can start the application with command-line arguments specifying the RTSP stream URL, username, and password. The sample also includes event handlers for detected faces and pipeline errors, providing a complete example of integrating real-time video processing and face detection into .NET applications.

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\Console\RTSPViewCV\readme.md

# Media Blocks SDK .Net - RTSP View CV CLI demo

This SDK sample demonstrates how to create a real-time face detection application using the VisioForge Media Blocks. The program initializes a face detector and configures a media pipeline to process video from an RTSP stream. It showcases how to set up an RTSP source, a video renderer, and a sample grabber block to capture video frames for face detection. Users can start the application with command-line arguments specifying the RTSP stream URL, username, and password. The sample also includes event handlers for detected faces and pipeline errors, providing a comprehensive example of integrating real-time video processing and face detection in .NET applications.
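The frame-tap part of this pipeline follows the Video sample grabber pattern documented elsewhere in these docs. In the sketch below, the RTSP settings object, the source's `VideoOutput` pad, and the face-detector call are placeholders:

```csharp
var pipeline = new MediaBlocksPipeline(false);

// RTSP source; rtspSettings (URL, login, password) is a placeholder.
var rtspSource = new RTSPSourceBlock(rtspSettings);

// Tap decoded frames for face detection without interrupting rendering.
var sampleGrabber = new VideoSampleGrabberBlock();
sampleGrabber.OnVideoFrameBuffer += (sender, e) =>
{
    // Hand the uncompressed frame to the face detector here.
    // faceDetector.Process(e.Frame); // placeholder call
};

pipeline.Connect(rtspSource.VideoOutput, sampleGrabber.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(sampleGrabber.Output, videoRenderer.Input);

await pipeline.StartAsync();
```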
## Used blocks

- [RTSPSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/RTSPSourceBlock/) - captures video from an RTSP source
- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [VideoSampleGrabberBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoProcessing/VideoSampleGrabberBlock/) - captures video frames for processing

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\iOS\SimpleVideoCapture\readme.es.md

# Media Blocks SDK .Net - iOS Simple Video Capture Demo

This sample demonstrates the implementation of a simple video capture and processing application using the VisioForge Media Blocks SDK. It shows how to enumerate video sources, capture video and audio from a selected camera and microphone, apply video effects such as grayscale, render the video on-screen, and optionally encode and save the video to a file. The code also includes functionality for switching between cameras, stopping the capture, and saving the captured video to the iOS photo library. Advanced features such as audio and video sample grabbers are used to process frames, and custom UI elements are added to control the capture process. The application leverages the VisioForge SDK's MediaBlocks architecture for modular media processing, showcasing a practical example of real-time video capture and manipulation on iOS devices.
## Features

- Preview camera video
- Capture video and audio to an MP4 file
- Add sample video effects
- Switch between cameras
- Add sample grabbers for audio and video

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\iOS\SimpleVideoCapture\readme.md

# Media Blocks SDK .Net - iOS Simple Video Capture Demo

This sample demonstrates the implementation of a simple video capture and processing application using the VisioForge Media Blocks SDK. It showcases how to enumerate video sources, capture video and audio from a selected camera and microphone, apply video effects like grayscale, render the video on-screen, and optionally encode and save the video to a file. The code also includes functionality for switching between cameras, stopping the capture, and saving the captured video to the iOS photo library. Advanced features such as audio and video sample grabbers are utilized to process frames, and custom UI elements are added to control the capture process. The application leverages the VisioForge SDK's MediaBlocks architecture for modular media processing, demonstrating a practical example of real-time video capture and manipulation on iOS devices.
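The sample-grabber taps mentioned above sit inline between a source and its renderer, as on the Video sample grabber reference page. The fragment below assumes `pipeline`, the source blocks, and the renderer blocks already exist; the audio grabber's class and event names are assumptions mirroring the documented video grabber:

```csharp
// Video tap: inspect or save frames (e.g. convert to a UIImage for the photo library).
var videoSG = new VideoSampleGrabberBlock();
videoSG.OnVideoFrameBuffer += (sender, e) =>
{
    // Process the uncompressed video frame here.
};
pipeline.Connect(videoSource.Output, videoSG.Input);
pipeline.Connect(videoSG.Output, videoRenderer.Input);

// Audio tap: block and event names assumed by analogy with the video grabber.
var audioSG = new AudioSampleGrabberBlock();
audioSG.OnAudioFrameBuffer += (sender, e) =>
{
    // Inspect PCM samples here, e.g. for level metering.
};
pipeline.Connect(audioSource.Output, audioSG.Input);
pipeline.Connect(audioSG.Output, audioRenderer.Input);
```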
## Features

- Preview camera video
- Capture video and audio to MP4 file
- Add sample video effects
- Switch between cameras
- Add sample grabbers for audio and video

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [SystemVideoSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemVideoSourceBlock/) - captures video from the webcam
- [SystemAudioSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemAudioSourceBlock/) - captures audio from the system
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- [TeeBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Special/TeeBlock/) - splits the media stream into two paths
- [H264EncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoEncoders/H264EncoderBlock/) - encodes the video stream using H264
- [AACEncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioEncoders/AACEncoderBlock/) - encodes the audio stream using AAC
- [MP4SinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/MP4SinkBlock/) - saves video to an MP4 file

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\macOS\GenICam Viewer\readme.es.md

# Media Blocks SDK .Net - GenICam Source Demo (macOS)

The GenICam Source Demo is an application that uses the Media Blocks SDK .Net to preview or capture video. It works with cameras that support the GenICam protocol and are connected via USB 3 or Gigabit Ethernet (GigE).
## Features

- Play video from a GenICam source

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\macOS\GenICam Viewer\readme.md

# Media Blocks SDK .Net - GenICam Source Demo (macOS)

GenICam Source Demo is an application that leverages the Media Blocks SDK .Net for previewing or capturing video. It works with cameras that support the GenICam protocol and are connected via USB 3 or Gigabit Ethernet (GigE).

## Features

- Play video from GenICam source

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [GenICamSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/GenICamSourceBlock/) - captures video from GenICam source

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\macOS\ScreenCaptureMB\readme.es.md

# Media Blocks SDK .Net - Screen Capture Demo (macOS)

This SDK sample illustrates how to create a screen capture and recording application using the Media Blocks SDK .Net within a WPF framework. The application demonstrates how to configure a media block pipeline to capture screen content and system audio, displaying them and encoding them to a file. It highlights the integration of screen and audio source blocks, video and audio renderer blocks, and encoding blocks for H264 video and AAC audio, culminating in saving the final output as an MP4 file.
Furthermore, the sample includes options for selecting audio input and output devices through device enumeration, incorporates error handling mechanisms, and provides the ability to switch between preview and recording modes.

## Features

- Capture screen video to an MP4 file
- Video preview

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\macOS\ScreenCaptureMB\readme.md

# Media Blocks SDK .Net - Screen Capture Demo (macOS)

This SDK sample illustrates how to create a screen capture and recording application using the VisioForge Media Blocks SDK .Net. The application demonstrates how to configure a media block pipeline to capture screen content and system audio, displaying and encoding them into a file. It features the integration of screen and audio source blocks, video and audio renderer blocks, and encoding blocks for H264 video and AAC audio, culminating in the saving of the final output as an MP4 file. Furthermore, the sample includes options for selecting audio input and output devices through device enumeration, incorporates error handling mechanisms, and provides the ability to switch between preview and recording modes.
## Features

- Capture video from screen to MP4 file
- Video preview

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [ScreenSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/ScreenSourceBlock/) - captures video from the screen
- [SystemAudioSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemAudioSourceBlock/) - captures audio from the system
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- [TeeBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Special/TeeBlock/) - splits the media stream into two paths
- [H264EncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoEncoders/H264EncoderBlock/) - encodes the video stream using H264
- [AACEncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioEncoders/AACEncoderBlock/) - encodes the audio stream using AAC
- [MP4SinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/MP4SinkBlock/) - saves video to an MP4 file

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\macOS\SimpleMediaPlayerMBMac\readme.es.md

# Media Blocks SDK .Net - macOS Simple Player Demo

The SimpleMediaPlayerMBMac SDK sample demonstrates the integration of VisioForge's Media Blocks SDK for creating a media player application on macOS. It showcases how to initialize and manage a media playback pipeline, including video and audio rendering, using `MediaBlocksPipeline`, `VideoRendererBlock`, and `AudioRendererBlock`.
The application supports loading and playing various media formats, updating the playback position with a slider, and displaying video within a custom OpenGL view. Essential UI interactions for starting, stopping, and opening media files are also implemented, demonstrating asynchronous task handling and UI updates on macOS.

## Features

- Play media files
- Play network streams
- Seeking

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\macOS\SimpleMediaPlayerMBMac\readme.md

# Media Blocks SDK .Net - macOS Simple Player Demo

The SimpleMediaPlayerMBMac SDK sample demonstrates the integration of VisioForge's Media Blocks SDK for creating a media player application on macOS. It showcases how to initialize and manage a media playback pipeline, including video and audio rendering, using `MediaBlocksPipeline`, `VideoRendererBlock`, and `AudioRendererBlock`. The application supports loading and playing various media formats, updating a playback position with a slider, and displaying video within a custom OpenGL view. Essential UI interactions for starting, stopping, and opening media files are also implemented, demonstrating asynchronous task handling and UI updates on macOS.
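As a rough illustration of the playback pipeline this demo describes, the sketch below wires a source block to video and audio renderers. It is a minimal sketch only: the connection and startup method signatures are assumptions inferred from the block names mentioned above, not the SDK's verified API, so consult the SDK reference before using them.

```csharp
// Hypothetical sketch of the playback pipeline (signatures are assumptions).
var pipeline = new MediaBlocksPipeline();

// Decode a media file (or network URL) into video and audio streams.
var source = new UniversalSourceBlock(new Uri("video.mp4"));

// Render video into the app's video view and audio to the default output.
var videoRenderer = new VideoRendererBlock(pipeline, videoView);
var audioRenderer = new AudioRendererBlock();

// Connect the decoded outputs to the renderers and start playback.
pipeline.Connect(source.VideoOutput, videoRenderer.Input);
pipeline.Connect(source.AudioOutput, audioRenderer.Input);
await pipeline.StartAsync();
```

The same source-to-renderer wiring pattern recurs in the other player demos in this collection; only the source block changes.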
## Features

- Play media files
- Play network streams
- Seeking

## Used blocks

- [UniversalSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/UniversalSourceBlock/) - decodes media files
- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\macOS\SimpleVideoCaptureMBMac\readme.es.md

# Media Blocks SDK .Net - macOS Simple Video Capture Demo

The provided code is a sample for creating a simple video capture application using the VisioForge Media Blocks SDK on macOS. It demonstrates how to set up a media pipeline to capture video and audio from system devices, render them in real time, and manage device permissions and selections. Key features include requesting camera access, enumerating video and audio sources, selecting formats and frame rates, and integrating with the macOS UI to display video. The code leverages asynchronous programming to handle device operations and updates the UI based on the current capture state. This sample serves as a foundation for developing more complex media applications on macOS using VisioForge's MediaBlocks SDK.
## Features

- Preview camera video
- Capture video and audio to an MP4 file

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\macOS\SimpleVideoCaptureMBMac\readme.md

# Media Blocks SDK .Net - macOS Simple Video Capture Demo

The provided code is a sample for creating a simple video capture application using the VisioForge Media Blocks SDK on macOS. It demonstrates how to set up a media pipeline for capturing video and audio from system devices, render them in real-time, and manage device permissions and selections. Key features include requesting camera access, enumerating video and audio sources, selecting formats and frame rates, and integrating with the macOS UI for displaying video. The code leverages asynchronous programming to handle device operations and updates the UI based on the current capture state. This sample serves as a foundation for developing more complex media applications on macOS using VisioForge's MediaBlocks SDK.
## Features

- Preview camera video
- Capture video and audio to MP4 file

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [SystemVideoSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemVideoSourceBlock/) - captures video from the webcam
- [SystemAudioSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemAudioSourceBlock/) - captures audio from the system
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- [TeeBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Special/TeeBlock/) - splits the media stream into two paths
- [H264EncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoEncoders/H264EncoderBlock/) - encodes the video stream using H264
- [AACEncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioEncoders/AACEncoderBlock/) - encodes the audio stream using AAC
- [MP4SinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/MP4SinkBlock/) - saves video to an MP4 file

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\MAUI\SimpleCapture\readme.es.md

# Media Blocks SDK .Net - MAUI Simple Video Capture Demo

This SDK sample demonstrates how to create a simple cross-platform MAUI media capture application using the VisioForge Media Blocks SDK .Net. The application is capable of capturing video and audio from system devices, encoding them in real time, and saving the output to an MP4 file.
It features device selection for video cameras, microphones, and audio output devices, along with basic controls for starting and stopping the preview and capture processes. Additionally, it handles permission requests for camera and microphone access, ensuring compliance with platform-specific privacy requirements. The use of a media block pipeline facilitates the flexible configuration and dynamic management of media sources, renderers, encoders, and sinks within the application.

## Features

- Preview camera video
- Capture video and audio to MP4 files

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\MAUI\SimpleCapture\readme.md

# Media Blocks SDK .Net - MAUI Simple Video Capture Demo

This SDK sample demonstrates how to create a cross-platform MAUI simple media capture application using the VisioForge Media Blocks SDK .Net. The application is capable of capturing video and audio from system devices, encoding them in real-time, and saving the output to an MP4 file. It features device selection for video cameras, microphones, and audio output devices, along with basic controls for starting and stopping the preview and capture processes. Additionally, it handles permission requests for camera and microphone access, ensuring compliance with platform-specific privacy requirements. The use of a media block pipeline facilitates the flexible configuration and dynamic management of media sources, renderers, encoders, and sinks within the application.
## Features

- Preview camera video
- Capture video and audio to MP4 file

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [SystemVideoSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemVideoSourceBlock/) - captures video from the webcam
- [SystemAudioSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemAudioSourceBlock/) - captures audio from the system
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- [TeeBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Special/TeeBlock/) - splits the media stream into two paths
- [H264EncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoEncoders/H264EncoderBlock/) - encodes the video stream using H264
- [AACEncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioEncoders/AACEncoderBlock/) - encodes the audio stream using AAC
- [MP4SinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/MP4SinkBlock/) - saves video to an MP4 file
- [OPUSEncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioEncoders/OPUSEncoderBlock/) - encodes the audio stream using OPUS

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\MAUI\SimplePlayer\readme.es.md

# Media Blocks SDK .Net - MAUI Simple Player Demo

The project showcases a cross-platform media player application built using the VisioForge Media Blocks SDK, targeting the MAUI framework. It demonstrates the setup and use of a media playback pipeline, including the creation of source, video renderer, and audio renderer blocks.
The application handles basic media control functions such as play, pause, stop, and playback-speed adjustment, alongside displaying the media position and duration. It also features UI controls for selecting media files, adjusting the volume, and seeking through the media. This example is designed for cross-platform compatibility, with specific adjustments for Android's default media path.

## Features

- Play media files
- Play network streams
- Seeking

## Supported .Net versions

- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\MAUI\SimplePlayer\readme.md

# Media Blocks SDK .Net - MAUI Simple Player Demo

The project showcases a cross-platform media player application built using the VisioForge Media Blocks SDK, targeting the MAUI framework. It demonstrates the setup and use of a media playback pipeline, including the creation of source, video renderer, and audio renderer blocks. The application handles basic media control functionalities such as play, pause, stop, and adjusting playback speed, alongside displaying media position and duration. It also features UI controls for selecting media files, adjusting volume, and seeking through the media. This example is designed for cross-platform compatibility, with specific adjustments for Android's default media path.
## Features

- Play media files
- Play network streams
- Seeking

## Used blocks

- [UniversalSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/UniversalSourceBlock/) - decodes media files
- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio

## Supported frameworks

- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\MAUI\VR360 Player\readme.es.md

# Media Blocks SDK .Net - MAUI VR360 Player Demo

The project showcases a cross-platform media player application built using the VisioForge Media Blocks SDK, targeting the MAUI framework. It demonstrates the setup and use of a media playback pipeline, including the creation of source, video renderer, and audio renderer blocks. The application handles basic media control functions such as play, pause, stop, and playback-speed adjustment, alongside displaying the media position and duration. It also features UI controls for selecting media files, adjusting the volume, and seeking through the media. This example is designed for cross-platform compatibility, with specific adjustments for Android's default media path.
## Features

- Play media files
- Play network streams
- Seeking

## Supported .Net versions

- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\MAUI\VR360 Player\readme.md

# Media Blocks SDK .Net - MAUI VR360 Player Demo

The project showcases a cross-platform media player application built using the VisioForge Media Blocks SDK, targeting the MAUI framework. It demonstrates the setup and use of a media playback pipeline, including the creation of source, video renderer, and audio renderer blocks. The application handles basic media control functionalities such as play, pause, stop, and adjusting playback speed, alongside displaying media position and duration. It also features UI controls for selecting media files, adjusting volume, and seeking through the media. This example is designed for cross-platform compatibility, with specific adjustments for Android's default media path.

## Features

- Play media files
- Play network streams
- Seeking

## Used blocks

- [UniversalSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/UniversalSourceBlock/) - decodes media files
- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio

## Supported frameworks

- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WinForms\CSharp\Karaoke Demo\readme.es.md

# VisioForge Media Player SDK .Net - Karaoke demo (C#/WinForms)

The Karaoke_Demo SDK sample demonstrates how to build a karaoke application using VisioForge's Media Blocks SDK in a Windows Forms application.
This sample initializes the SDK, sets up a media pipeline for karaoke playback, including CD+G file support for lyrics display, and controls audio output devices. Users can load audio tracks, adjust the volume, and navigate through the song with a timeline. It also handles media playback events such as start, stop, and errors, providing a solid foundation for developing full-featured karaoke software.

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WinForms\CSharp\Karaoke Demo\readme.md

# VisioForge Media Player SDK .Net - Karaoke demo (C#/WinForms)

The Karaoke_Demo SDK sample demonstrates how to build a karaoke application using VisioForge's Media Blocks SDK in a Windows Forms application. This sample initializes the SDK, sets up a media pipeline for karaoke playback, including CD+G file support for lyrics display, and controls audio output devices. Users can load audio tracks, adjust the volume, and navigate through the song with a timeline. It also handles media playback events such as start, stop, and error handling, providing a robust foundation for developing full-featured karaoke software.
## Used blocks

- `CDGSourceBlock` - reads and decodes CD+G files
- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WinForms\CSharp\RTSP MultiView Demo\readme.es.md

# Media Blocks SDK .Net - RTSP MultiView Demo (WinForms)

This SDK sample demonstrates how to create a multi-view RTSP streaming application using the VisioForge Media Blocks SDK. The application supports real-time playback and recording of RTSP streams, with the ability to select different camera sources and adjust parameters such as the URL, login credentials, and whether to use hardware or software decoding. Users can also customize GPU decoding options, toggle audio playback, and log the reception of video or audio frames. The UI offers controls for starting and stopping stream playback and recording, with options for re-encoding audio and choosing output formats. Additionally, the sample includes features for reading media information and discovering ONVIF devices, illustrating the SDK's versatility in handling various media processing tasks.
## Features

- Play multiple RTSP streams
- Capture original streams to disk
- Capture re-encoded streams to disk
- Access to RAW video and audio frames

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WinForms\CSharp\RTSP MultiView Demo\readme.md

# Media Blocks SDK .Net - RTSP MultiView Demo (WinForms)

This SDK sample demonstrates how to create a multi-view RTSP streaming application using the VisioForge Media Blocks SDK. The application supports real-time playback and recording of RTSP streams, with the ability to select different camera feeds and adjust settings such as URL, login credentials, and whether to use hardware or software decoding. Users can also customize GPU decoding options, toggle audio playback, and log video or audio frame reception. The UI provides controls for starting and stopping stream playback and recording, with options for re-encoding audio and choosing output formats. Additionally, the sample includes features for reading media information and discovering ONVIF devices, illustrating the SDK's versatility in handling various media processing tasks.
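One view cell of the multi-view grid described above boils down to an RTSP source feeding a pair of renderers. The sketch below is a hypothetical outline only: `RTSPSourceSettings.CreateAsync` and the credential parameters are assumptions based on the block names this demo lists, not verified SDK signatures.

```csharp
// Hypothetical sketch of a single RTSP view cell (signatures are assumptions).
var pipeline = new MediaBlocksPipeline();

// RTSP source configured with stream URL and login credentials.
var settings = await RTSPSourceSettings.CreateAsync(
    new Uri("rtsp://192.168.1.10/stream1"), "login", "password", audioEnabled: true);
var source = new RTSPSourceBlock(settings);

// Render the decoded video into this cell's view, and audio if enabled.
var videoRenderer = new VideoRendererBlock(pipeline, videoView);
var audioRenderer = new AudioRendererBlock();

pipeline.Connect(source.VideoOutput, videoRenderer.Input);
pipeline.Connect(source.AudioOutput, audioRenderer.Input);
await pipeline.StartAsync();
```

For multi-view, the demo presumably repeats this wiring once per camera, each with its own pipeline and view; capturing the original stream to disk would branch the source output through a tee before decoding.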
## Features

- Play multiple RTSP streams
- Capture original streams to disk
- Capture reencoded streams to disk
- RAW video and audio frame access

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video frames
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio frames
- [RTSPSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/RTSPSourceBlock/) - reads RTSP streams

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WinForms\CSharp\Simple Player Demo\readme.es.md

# Media Blocks SDK .Net - Simple Player Demo (WinForms)

The sample project demonstrates the capabilities of the VisioForge Media Blocks SDK for creating a multimedia player application in C#. It shows how to initialize the media pipeline, handle media files, and perform operations such as playing, pausing, resuming, and stopping media playback. The application includes features such as video and audio rendering, adjusting playback speed and volume, and navigating the media timeline. It also provides debugging tools and error-handling mechanisms to ensure a smooth user experience. This example is a comprehensive guide for developers who want to integrate media playback features into their .NET applications using VisioForge's powerful SDK.
## Features

- Play media files
- Play network streams

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WinForms\CSharp\Simple Player Demo\readme.md

# Media Blocks SDK .Net - Simple Player Demo (WinForms)

The sample project demonstrates the capabilities of the VisioForge Media Blocks SDK for creating a multimedia player application in C#. It showcases how to initialize the media pipeline, handle media files, and perform operations like play, pause, resume, and stop media playback. The application includes features such as video and audio rendering, adjusting playback speed and volume, and navigating through the media timeline. It also provides debugging tools and error-handling mechanisms to ensure a smooth user experience. This example is a comprehensive guide for developers looking to integrate media playback functionalities into their .NET applications using VisioForge's powerful SDK.
## Features

- Play media files
- Play network streams

## Used blocks

- [UniversalSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/UniversalSourceBlock/) - decodes media files
- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WinForms\CSharp\Simple Video Capture Demo\readme.es.md

# Media Blocks SDK .Net - Simple Video Capture Demo (WinForms)

The provided code is a comprehensive example of how to create a simple video capture application using the VisioForge Media Blocks SDK. The application initializes the SDK, enumerates video and audio devices, and lets the user select input sources and configure their settings. It includes a GUI for device selection and configuration, real-time video and audio capture, encoding to H.264/AAC formats, and multiplexing into an MP4 container. The code also demonstrates handling video and audio rendering, as well as using sample grabbers for frame manipulation. This sample is designed to showcase the SDK's capabilities in building multimedia applications with custom capture and processing workflows.
## Features

- Capture video from webcams to MP4 files
- Video preview

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WinForms\CSharp\Simple Video Capture Demo\readme.md

# Media Blocks SDK .Net - Simple Video Capture Demo (WinForms)

The provided code is a comprehensive example of how to create a simple video capture application using the VisioForge Media Blocks SDK. This application initializes the SDK, enumerates video and audio devices, and allows the user to select input sources and configure their settings. It features a GUI for device selection and configuration, real-time video and audio capture, encoding to H.264/AAC formats, and multiplexing into an MP4 container. The code also demonstrates handling of video and audio rendering, as well as using sample grabbers for frame manipulation. This sample is designed to showcase the SDK's capabilities in building multimedia applications with custom capture and processing workflows.
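The capture-and-encode topology described above can be sketched as follows: each captured stream is split by a tee so one branch feeds the on-screen preview while the other is encoded and muxed into MP4. This is a minimal sketch under assumed method names (the `Connect`, `Outputs[...]`, and sink-input calls are inferred from the block names this demo lists, not verified signatures).

```csharp
// Hypothetical sketch of the capture pipeline (signatures are assumptions).
var pipeline = new MediaBlocksPipeline();

var videoSource = new SystemVideoSourceBlock(videoSourceSettings); // webcam
var audioSource = new SystemAudioSourceBlock(audioSourceSettings); // microphone

// Tee each stream: branch 0 goes to preview, branch 1 to the encoder.
var videoTee = new TeeBlock(2);
var audioTee = new TeeBlock(2);

var videoRenderer = new VideoRendererBlock(pipeline, videoView);
var audioRenderer = new AudioRendererBlock();

var h264 = new H264EncoderBlock();
var aac = new AACEncoderBlock();
var mp4 = new MP4SinkBlock(new MP4SinkSettings("output.mp4"));

pipeline.Connect(videoSource.Output, videoTee.Input);
pipeline.Connect(audioSource.Output, audioTee.Input);
pipeline.Connect(videoTee.Outputs[0], videoRenderer.Input);
pipeline.Connect(audioTee.Outputs[0], audioRenderer.Input);
pipeline.Connect(videoTee.Outputs[1], h264.Input);
pipeline.Connect(audioTee.Outputs[1], aac.Input);
pipeline.Connect(h264.Output, mp4.Inputs[0]);
pipeline.Connect(aac.Output, mp4.Inputs[1]);

await pipeline.StartAsync();
```

The sample grabber blocks listed below would slot into either tee branch when per-frame access is needed.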
## Features

- Capture video from webcams to MP4 file
- Video preview
- Video and audio sample grabbers

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [SystemVideoSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemVideoSourceBlock/) - captures video from the webcam
- [SystemAudioSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemAudioSourceBlock/) - captures audio from the system
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- [TeeBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Special/TeeBlock/) - splits the media stream into two paths
- [H264EncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoEncoders/H264EncoderBlock/) - encodes the video stream using H264
- [AACEncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioEncoders/AACEncoderBlock/) - encodes the audio stream using AAC
- [MP4SinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/MP4SinkBlock/) - saves video to an MP4 file
- [VideoSampleGrabberBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoProcessing/VideoSampleGrabberBlock/) - grabs video frames
- [AudioSampleGrabberBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioProcessing/AudioSampleGrabberBlock/) - grabs audio frames

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WinForms\CSharp\Stream Player Demo\readme.es.md

# Media Blocks SDK .Net - Memory Player Demo (WinForms)

The sample demonstrates the use of the VisioForge Media Blocks SDK to create a custom media player.
It shows initializing the media pipeline, loading media from a file into memory, and then playing it back with video and audio rendering capabilities. Users can interact with media playback through UI controls to start, stop, pause, and resume playback, as well as adjust the volume and seek within the media timeline. The sample also handles SDK initialization and cleanup, error handling, and dynamically updates the UI to reflect the current playback position and media duration.

## Features

- Play media files from memory

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WinForms\CSharp\Stream Player Demo\readme.md

# Media Blocks SDK .Net - Memory Player Demo (WinForms)

The sample demonstrates the use of the VisioForge Media Blocks SDK to build a custom media player. It showcases initializing the media pipeline, loading media from a file into memory, and then playing it with video and audio rendering capabilities. Users can interact with the media playback through UI controls to start, stop, pause, and resume playback, as well as adjust volume and seek within the media timeline. The example also handles SDK initialization and cleanup, error handling, and dynamically updates the UI to reflect the current playback position and media duration.
## Features

- Play media files from memory

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- `StreamSourceBlock` - reads media from memory
- `DecodeBinBlock` - decodes media files

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WinForms\CSharp\Video Mixer Demo\readme.es.md

# Media Blocks SDK .Net - Video Mixer Demo (WinForms)

The provided SDK sample shows the implementation of a video mixing application using the VisioForge Media Blocks SDK .Net. The application allows users to mix two video streams into a single output. It features a CPUMixerEngine class that handles the mixing process, including stream addition and error handling. The main form, Form1, offers a GUI where users can select video files, adjust the position and size of the streams, and start or stop the mixing process. The sample demonstrates initializing the SDK, setting up video streams with user-defined coordinates and dimensions, and managing the video mixing process, including start, stop, and error-logging functions.
## Features

- Play multiple video streams from video files
- Mix video streams using the CPU or GPU

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WinForms\CSharp\Video Mixer Demo\readme.md

# Media Blocks SDK .Net - Video Mixer Demo (WinForms)

The provided SDK sample showcases the implementation of a video mixing application using the VisioForge Media Blocks SDK .Net. The application allows users to mix two video streams into one output. It features a CPUMixerEngine class that handles the mixing process, including stream addition and error handling. The main form, Form1, offers a GUI where users can select video files, adjust the position and size of the streams, and start or stop the mixing process. The sample demonstrates initializing the SDK, setting up video streams with user-defined coordinates and dimensions, and managing the video mixing pipeline, including starting, stopping, and error-logging functionalities.
## Features

- Play multiple video streams from video files
- Mix video streams using CPU or GPU

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [VideoMixerBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoProcessing/VideoMixerBlock/) - mixes video streams
- [NullRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Special/NullRendererBlock/) - discards video and audio data
- [UniversalSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/UniversalSourceBlock/) - reads video files

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WinForms\CSharp\YouTube Player Demo\readme.es.md

# Media Player SDK .Net - YouTube Player Demo (C#/WinForms)

This SDK sample demonstrates how to create a simple YouTube video player using C#. It leverages the VisioForge Media Player SDK .Net and the YoutubeExplode library to fetch, select, and play video and audio streams from YouTube. Users can select their preferred video and audio formats, start and stop playback, and navigate through the video timeline. The sample includes error handling to improve reliability and user experience.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WinForms\CSharp\YouTube Player Demo\readme.md

# Media Player SDK .Net - YouTube Player Demo (C#/WinForms)

This SDK sample demonstrates how to create a simple YouTube video player using C#.
It leverages the VisioForge Media Player SDK .Net and the YoutubeExplode library to fetch, select, and play video and audio streams from YouTube. Users can select their preferred video and audio formats, start and stop playback, and navigate through the video timeline. The sample includes error handling to improve reliability and user experience.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\AlliedVision Source Demo\readme.es.md

# Media Blocks SDK .Net - Allied Vision Source Demo (WPF)

Allied Vision Source Demo is an application that uses the Media Blocks SDK .Net to preview or capture video from Allied Vision GigE/USB3/GenICam cameras.

## Features

- Play video from Allied Vision camera source

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\AlliedVision Source Demo\readme.md

# Media Blocks SDK .Net - Allied Vision Source Demo (WPF)

Allied Vision Source Demo is an application that uses the Media Blocks SDK .Net to preview or capture video from Allied Vision GigE/USB3/GenICam cameras.
## Features

- Play video from Allied Vision camera source

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Audio Capture Demo\readme.es.md

# VisioForge Media Blocks SDK .Net

## Audio Capture Demo (C#/WPF, cross-platform)

The sample, powered by the VisioForge Media Blocks SDK .Net, showcases a comprehensive example of implementing audio recording functionality within a WPF application.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-blocks-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Audio Capture Demo\readme.md

# VisioForge Media Blocks SDK .Net

## Audio Capture Demo (C#/WPF, cross-platform)

The sample, powered by the VisioForge Media Blocks SDK .Net, showcases a comprehensive example of implementing audio recording functionality within a WPF application.
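The recording flow splits the captured audio into a preview branch and an MP3 encoding branch. Below is a hypothetical sketch assuming the block names from the list that follows; the settings objects (`selectedDeviceSettings`, `mp3Settings`), pad names, and constructor signatures are assumptions to check against the SDK documentation.

```csharp
// Hypothetical sketch of the audio capture pipeline (not the exact SDK API).
var pipeline = new MediaBlocksPipeline();

// Capture from the selected system audio input device.
var source = new SystemAudioSourceBlock(selectedDeviceSettings);

// TeeBlock duplicates the stream: one branch to the speakers, one to the encoder.
var tee = new TeeBlock(2);
var preview = new AudioRendererBlock();
var mp3Output = new MP3OutputBlock("output.mp3", mp3Settings);

pipeline.Connect(source.Output, tee.Input);
pipeline.Connect(tee.Outputs[0], preview.Input);
pipeline.Connect(tee.Outputs[1], mp3Output.Input);

await pipeline.StartAsync();
```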
## Used blocks

- [SystemAudioSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemAudioSourceBlock/) - captures audio from the system audio input device
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- [TeeBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Special/TeeBlock/) - duplicates the audio stream for recording and previewing
- `MP3OutputBlock` - encodes and saves audio to an MP3 file

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-blocks-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Audio Mixer\readme.es.md

# VisioForge Media Blocks SDK .Net

## Audio Capture Demo (C#/WPF, cross-platform)

The sample, powered by the VisioForge Media Blocks SDK .Net, showcases a comprehensive example of implementing audio recording functionality within a WPF application.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-blocks-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Audio Mixer\readme.md

# VisioForge Media Blocks SDK .Net

## Audio Capture Demo (C#/WPF, cross-platform)

The sample, powered by the VisioForge Media Blocks SDK .Net, showcases a comprehensive example of implementing audio recording functionality within a WPF application.
## Used blocks

- [SystemAudioSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemAudioSourceBlock/) - captures audio from the system audio input device
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- [TeeBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Special/TeeBlock/) - duplicates the audio stream for recording and previewing
- `MP3OutputBlock` - encodes and saves audio to an MP3 file

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-blocks-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Basler Source Demo\readme.es.md

# Media Blocks SDK .Net - Basler Source Demo (WPF)

Basler Source Demo is an application that uses the Media Blocks SDK .Net to preview or capture video from Basler GigE/USB3/GenICam cameras.

## Features

- Play video from Basler camera source

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Basler Source Demo\readme.md

# Media Blocks SDK .Net - Basler Source Demo (WPF)

Basler Source Demo is an application that uses the Media Blocks SDK .Net to preview or capture video from Basler GigE/USB3/GenICam cameras.
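A camera preview with this SDK is essentially a two-block pipeline: source into renderer. A minimal, hypothetical sketch, assuming the block names from the list below; the `BaslerSourceSettings` type, the `cameraName` device lookup, and the `VideoView1` control are illustrative placeholders:

```csharp
// Hypothetical sketch of the Basler preview pipeline (not the exact SDK API).
var pipeline = new MediaBlocksPipeline();

// cameraName is a placeholder for a device enumerated by the SDK.
var source = new BaslerSourceBlock(new BaslerSourceSettings(cameraName));
var renderer = new VideoRendererBlock(pipeline, VideoView1);

pipeline.Connect(source.Output, renderer.Input);
await pipeline.StartAsync();
```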
## Features

- Play video from Basler camera source

## Used blocks

- [BaslerSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/BaslerSourceBlock/) - captures video from a Basler camera
- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Bridge Demo\readme.md

# Media Blocks SDK .Net - Bridge Demo (WPF)

The SDK has bridges that allow video/audio streams to be sent from one pipeline to another. This demo shows how to use them.

In the demo, you can see two pipelines: source and file output. The source pipeline has video/audio sources, tees, renderers, and bridge sinks.

```mermaid
graph LR;
VirtualVideoSourceBlock-->TeeBlock-1;
TeeBlock-1-->VideoRendererBlock;
TeeBlock-1-->BridgeVideoSinkBlock;
VirtualAudioSourceBlock-->TeeBlock-2;
TeeBlock-2-->AudioRendererBlock;
TeeBlock-2-->BridgeAudioSinkBlock;
```

The output pipeline has bridge sources, video/audio encoders, and a muxer.
```mermaid
graph LR;
BridgeVideoSourceBlock-->MP4OutputBlock;
BridgeAudioSourceBlock-->MP4OutputBlock;
```

## Features

- Generate video/audio streams and save them as an MP4 file independently from the preview

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- [TeeBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Special/TeeBlock/) - splits the pipeline into multiple branches
- [BridgeVideoSinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Bridge/BridgeVideoSinkBlock/) - sends video to another pipeline
- [BridgeAudioSinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Bridge/BridgeAudioSinkBlock/) - sends audio to another pipeline
- [BridgeVideoSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Bridge/BridgeVideoSourceBlock/) - receives video from another pipeline
- [BridgeAudioSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Bridge/BridgeAudioSourceBlock/) - receives audio from another pipeline
- `MP4OutputBlock` - saves video/audio streams as an MP4 file

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Decklink Demo\readme.es.md

# Media Blocks SDK .Net - Decklink Demo (WPF)

The provided code is for a demo application using the VisioForge Media Blocks SDK, focusing specifically on media processing and Decklink device integration.
The application showcases the creation of a media processing pipeline, including video and audio source selection (either from Decklink devices or files), real-time video and audio rendering, video effects, and encoding to various formats such as MP4, WebM, MXF, and MPEG2. It uses a wide range of VisioForge Media Blocks to handle different media processing tasks, such as video resizing, adding text or image overlays, and capturing from or outputting to Decklink devices. The GUI allows dynamic configuration of input and output settings, including device selection, video mode, and output file format, demonstrating the SDK's versatility in multimedia applications.

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Decklink Demo\readme.md

# Media Blocks SDK .Net - Decklink Demo (WPF)

The provided code is for a demo application using the VisioForge Media Blocks SDK, specifically focusing on media processing and Decklink device integration. The application showcases the creation of a media processing pipeline, including video and audio source selection (either from Decklink devices or files), real-time video and audio rendering, video effects, and encoding to various formats such as MP4, WebM, MXF, and MPEG2. It utilizes a wide array of VisioForge's Media Blocks for handling different media processing tasks, such as video resizing, adding text or image overlays, and capturing from or outputting to Decklink devices. The GUI allows for dynamic configuration of input and output settings, including device selection, video mode, and output file format, demonstrating the SDK's versatility in multimedia applications.
## Used blocks

### Decklink source and output

- [DecklinkVideoSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Decklink/DecklinkVideoSourceBlock/) - captures video from a Decklink device
- [DecklinkAudioSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Decklink/DecklinkAudioSourceBlock/) - captures audio from a Decklink device
- [DecklinkAudioSinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Decklink/DecklinkAudioSinkBlock/) - outputs audio to a Decklink device
- [DecklinkVideoSinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Decklink/DecklinkVideoSinkBlock/) - outputs video to a Decklink device

### MP4 file output

- [MP4SinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/MP4SinkBlock/) - muxes compressed video and audio to MP4
- [H264EncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoEncoders/H264EncoderBlock/) - encodes video to H264
- [AACEncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioEncoders/AACEncoderBlock/) - encodes audio to AAC

### WebM file output

- [WebMSinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/WebMSinkBlock/) - muxes compressed video and audio to WebM
- [VorbisEncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioEncoders/VorbisEncoderBlock/) - encodes audio to Vorbis
- [VPXEncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoEncoders/VPXEncoderBlock/) - encodes video to VP8/VP9

### MPEG-TS file output

- [MPEGTSSinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/MPEGTSSinkBlock/) - muxes compressed video and audio to MPEG-TS
- `MPEG2EncoderBlock` - encodes video to MPEG2
- [MP2EncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioEncoders/MP2EncoderBlock/) - encodes audio to MP2

### MXF file output

- [MXFSinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/MXFSinkBlock/) - muxes compressed video and audio to MXF
- [DNxHDEncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoEncoders/DNxHDEncoderBlock/) - encodes video to DNxHD

### Other blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- [VideoResizeBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoProcessing/VideoResizeBlock/) - resizes video
- `VideoEffectsWinBlock` - applies video effects
- [TeeBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Special/TeeBlock/) - duplicates video and audio streams
- [UniversalSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/UniversalSourceBlock/) - reads video and audio from a file

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Decklink MultiOutput\readme.es.md

# Media Blocks SDK .Net - Decklink MultiOutput Demo (WPF)

This sample demonstrates how to use the Media Blocks SDK .Net to create a simple app streaming generated video to multiple Decklink outputs. Decklink cards are professional video capture and playback cards from Blackmagic Design. They are widely used in the broadcast industry.
## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Decklink MultiOutput\readme.md

# Media Blocks SDK .Net - Decklink MultiOutput Demo (WPF)

This sample demonstrates how to use the Media Blocks SDK .Net to create a simple app streaming generated video to multiple Decklink outputs. Decklink cards are professional video capture and playback cards from Blackmagic Design. They are widely used in the broadcast industry.

## Used blocks

- [VirtualVideoSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/VirtualVideoSourceBlock/) - generates video
- [ScreenSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/ScreenSourceBlock/) - captures a screen or window
- [DecklinkVideoSinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Decklink/DecklinkVideoSinkBlock/) - outputs video to a Decklink device
- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [TeeBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Special/TeeBlock/) - duplicates video and audio streams

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Face AI MB Demo\readme.es.md

# Media Blocks SDK .Net - Face AI Demo (WPF)

The provided code is a comprehensive example of integrating face recognition capabilities within a WPF application using the VisioForge SDK.
This application allows users to select folders containing images of known and unknown persons, detect faces within these images, video files, or webcam streams, and then match these faces against a database of known persons. Users can initiate face detection and recognition through a user-friendly interface, with results displayed in real time. Additionally, the application supports loading and saving known persons' data to and from a file, enhancing its usability for repetitive tasks and continuous learning scenarios.

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Face AI MB Demo\readme.md

# Media Blocks SDK .Net - Face AI Demo (WPF)

The provided code is a comprehensive example of integrating face recognition capabilities within a WPF application using the VisioForge SDK. This application allows users to select folders containing known and unknown persons' images, detect faces within these images, video files, or webcam streams, and then match these faces against a database of known persons. Users can initiate face detection and recognition processes through a user-friendly interface, with results displayed in real-time. Additionally, the application supports loading and saving known persons' data to and from a file, enhancing its usability for repetitive tasks and continuous learning scenarios.
## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Face Detector Live\readme.es.md

# Media Blocks SDK .Net - Face Detector Live Demo (WPF)

This SDK sample demonstrates the implementation of a live face detection and blurring application using the VisioForge Media Blocks SDK .Net. The application initializes a media pipeline, capturing video input from a selected device, and offers the user the choice between detecting faces or blurring them in real time. It features a responsive UI that updates with information about detected faces and provides controls for selecting video input devices, formats, and frame rates. Additionally, it includes error handling and logging, illustrating a robust approach to integrating VisioForge's media processing technologies into a WPF application for enhanced video processing tasks.

## Features

- Face detection
- Video preview

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Face Detector Live\readme.md

# Media Blocks SDK .Net - Face Detector Live Demo (WPF)

This SDK sample demonstrates the implementation of a live face detection and blurring application using the VisioForge Media Blocks SDK .Net. The application initializes a media pipeline, capturing video input from a selected device, and offers the user the choice between detecting faces or blurring them in real-time.
It features a responsive UI that updates with information about detected faces and provides controls for selecting video input devices, formats, and frame rates. Additionally, it includes error handling and logging capabilities, illustrating a robust approach to integrating VisioForge's media processing technologies into a WPF application for enhanced video processing tasks.

## Features

- Face detection
- Video preview

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [SystemVideoSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemVideoSourceBlock/) - captures video from a video capture device
- `CVFaceDetectBlock` - detects faces in video frames
- `CVFaceBlurBlock` - blurs detected faces in video frames

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\GenICam Source Demo\readme.es.md

# Media Blocks SDK .Net - GenICam Source Demo (WPF)

GenICam Source Demo is an application that uses the Media Blocks SDK .Net to preview or capture video from cameras that support the GenICam protocol and are connected via USB 3 or GigE.

## Features

- Play video from a GenICam source

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\GenICam Source Demo\readme.md

# Media Blocks SDK .Net - GenICam Source Demo (WPF)

GenICam Source Demo is an application that uses the Media Blocks SDK .Net to preview or capture video from cameras that support the GenICam protocol and are connected via USB 3 or GigE.
## Features

- Play video from a GenICam source

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [GenICamSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/GenICamSourceBlock/) - captures video from a GenICam source

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\HTTP MJPEG Source Demo\readme.es.md

# Media Blocks SDK .Net - HTTP MJPEG Source Demo (WPF)

The provided sample demonstrates how to build an application using the VisioForge Media Blocks SDK .Net for streaming MJPEG video from a URL into a WPF application. It initializes a media pipeline with HTTP source, JPEG decoder, and video renderer blocks, offering functionality to start and stop video streaming. It also includes error handling for debugging issues within the media pipeline, showcasing the SDK's capabilities for real-time video processing and rendering in a Windows environment.

## Features

- Play video from a network MJPEG source

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\HTTP MJPEG Source Demo\readme.md

# Media Blocks SDK .Net - HTTP MJPEG Source Demo (WPF)

The provided sample demonstrates how to build an application using the VisioForge Media Blocks SDK .Net for streaming MJPEG video from a URL to a WPF application. It initializes a media pipeline with HTTP source, JPEG decoder, and video renderer blocks, offering functionality to start and stop video streaming.
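The three-block MJPEG pipeline (HTTP source, JPEG decoder, video renderer) can be sketched as follows. This is a hypothetical sketch: the camera URL is a placeholder, and the `HTTPSourceSettings` class, pad names, and constructor signatures are assumptions to verify against the SDK documentation.

```csharp
// Hypothetical sketch of the HTTP MJPEG pipeline (not the exact SDK API).
var pipeline = new MediaBlocksPipeline();

// Placeholder URL for an MJPEG-streaming IP camera.
var source = new HTTPSourceBlock(new HTTPSourceSettings(new Uri("http://192.168.0.10/mjpeg")));
var decoder = new JPEGDecoderBlock();     // decodes each incoming JPEG frame
var renderer = new VideoRendererBlock(pipeline, VideoView1);

pipeline.Connect(source.Output, decoder.Input);
pipeline.Connect(decoder.Output, renderer.Input);

await pipeline.StartAsync();
```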
Additionally, it includes error handling to debug issues within the media pipeline, showcasing the SDK's capabilities for real-time video processing and rendering in a Windows environment.

## Features

- Play video from a network MJPEG source

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [HTTPSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/HTTPSourceBlock/) - reads HTTP data from the network
- `JPEGDecoderBlock` - decodes JPEG frames

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\HTTP Source Demo\readme.es.md

# Media Blocks SDK .Net - HTTP Source Demo (WPF)

The provided sample demonstrates how to build an application using the VisioForge Media Blocks SDK .Net for playing video from an HTTP URL in a WPF application. It initializes a media pipeline with HTTP source, universal demuxer+decoder, and video renderer blocks, offering functionality to start and stop video streaming.

## Features

- Play video from a network HTTP source

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\HTTP Source Demo\readme.md

# Media Blocks SDK .Net - HTTP Source Demo (WPF)

The provided sample demonstrates how to build an application using the VisioForge Media Blocks SDK .Net for playing video from an HTTP URL in a WPF application. It initializes a media pipeline with HTTP source, universal demuxer+decoder, and video renderer blocks, offering functionality to start and stop video streaming.
## Features

- Play video from a network HTTP source

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [HTTPSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/HTTPSourceBlock/) - reads HTTP data from the network
- `DecodeBinBlock` - demuxes and decodes video data

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\KLV Demo\readme.es.md

# Media Blocks SDK .Net - KLV Demo (WPF)

This SDK sample demonstrates the integration and usage of the VisioForge Media Blocks SDK .Net within a WPF application to process video files. Specifically, it focuses on extracting KLV (Key-Length-Value) metadata from MPEG-TS (Transport Stream) files. The application initializes the media pipeline, sets up source blocks for file input, demultiplexes the MPEG-TS stream to extract the metadata, and then directs the metadata to a KLV file output block. It also includes UI elements for selecting input TS files and saving the extracted KLV data to a file, providing a practical example of handling media files and extracting metadata in a .NET environment.

## Features

- Demux and mux KLV data in MPEG-TS files

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\KLV Demo\readme.md

# Media Blocks SDK .Net - KLV Demo (WPF)

This SDK sample demonstrates the integration and usage of the VisioForge Media Blocks SDK .Net within a WPF application to process video files.
Specifically, it focuses on extracting KLV (Key-Length-Value) metadata from MPEG-TS (Transport Stream) files. The application initializes the media pipeline, sets up source blocks for file input, demultiplexes the MPEG-TS stream to extract metadata, and then directs the metadata to a KLV file sink block for output. Additionally, it includes UI elements for selecting input TS files and saving the extracted KLV data to a file, providing a practical example of handling media files and extracting metadata in a .NET environment.

## Features

- Demux and mux KLV data into MPEG-TS files

## Used blocks

- `BasicFileSourceBlock` - reads data from a file without decoding
- `MPEGTSDemuxBlock` - demultiplexes MPEG-TS streams
- `KLVFileSinkBlock` - writes KLV data to a file

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Live Video Compositor Demo\readme.es.md

# Media Blocks SDK .Net - Live Video Compositor Demo (WPF)

This SDK sample demonstrates the use of the Live Video Compositor (part of the Media Blocks SDK .Net) to create a live video mixing application. It includes functionality to add and manage video and audio sources, such as cameras, files, and screen captures, as well as outputs like MP4, WebM, MP3, and Decklink devices. The application allows real-time composition of multiple sources into a single output stream, with UI controls for source configuration, output management, and recording control. It showcases the integration of the VisioForge framework within a WPF application, leveraging asynchronous programming to handle media operations efficiently.
## Features

- Live adding and removing of sources
- Live adding and removing of outputs
- Video and audio mixing
- Video preview

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Live Video Compositor Demo\readme.md

# Media Blocks SDK .Net - Live Video Compositor Demo (WPF)

This SDK sample demonstrates the use of the Live Video Compositor (part of Media Blocks SDK .Net) for creating a live video mixing application. It includes functionality to add and manage video and audio sources, such as cameras, files, and screen captures, as well as outputs like MP4, WebM, MP3, and Decklink devices. The application allows for real-time composition of multiple sources into a single output stream, featuring UI controls for source configuration, output management, and recording control. It showcases the integration of the VisioForge framework within a WPF application, leveraging asynchronous programming to handle media operations efficiently.

## Features

- Live adding and removing of sources
- Live adding and removing of outputs
- Video and audio mixing
- Video preview

## Used blocks

- [LiveVideoCompositor](https://www.visioforge.com/help/docs/dotnet/mediablocks/LiveVideoCompositor/) - composes video and audio sources using one pipeline
- [LVCVideoViewOutput](https://www.visioforge.com/help/docs/dotnet/mediablocks/LiveVideoCompositor/LVCVideoViewOutput/) - displays video output
- [LVCAudioOutput](https://www.visioforge.com/help/docs/dotnet/mediablocks/LiveVideoCompositor/LVCAudioOutput/) - outputs audio

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\NDI Source Demo\readme.es.md

# Media Blocks SDK .Net - NDI Source Demo (WPF)

This SDK sample shows how to integrate and manage NDI (Network Device Interface) sources within a WPF application using the VisioForge Media Blocks API. It provides a user interface for selecting NDI sources and starting and stopping the video stream. The application uses a MediaBlocksPipeline to handle video processing and rendering, including error handling and real-time video rendering. It also features a timer to update the recording time shown in the UI, demonstrating how to asynchronously manage and dispose of media resources within a .NET application.
## Features

- Capture video from an NDI source to an MP4 file
- Video preview

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\NDI Source Demo\readme.md

# Media Blocks SDK .Net - NDI Source Demo (WPF)

This SDK sample demonstrates how to integrate and manage NDI (Network Device Interface) sources within a WPF application using the VisioForge Media Blocks API. It provides a user interface for selecting NDI sources and playing the video stream. The application utilizes a MediaBlocksPipeline for handling the video processing and rendering, including error handling and real-time video rendering. Additionally, it features a timer to update the recording time displayed in the UI, showcasing how to asynchronously manage and dispose of media resources within a .NET application.

## Features

- Capture video from an NDI source to an MP4 file
- Video preview

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [NDISourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/NDISourceBlock/) - captures video from an NDI source

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Networks Streamer Demo\readme.es.md

# Media Blocks SDK .Net - Social Networks Streamer Demo (WPF)

This SDK sample demonstrates the integration and use of the VisioForge MediaBlocks SDK in a WPF application to stream video and audio to various platforms, such as YouTube and Facebook Live, or via the HLS protocol.
It can also stream using an HTTP MJPEG stream or to an AWS S3 bucket. It shows how to set up a media pipeline using device enumeration for video and audio sources, video and audio rendering blocks, encoding with H.264 and AAC, and the use of different sink blocks for streaming. The application lets users select their input devices, configure the sources, and stream to the chosen platform with real-time video and audio. It also provides error handling and dynamic streaming status updates.

## Features

- Video streaming to YouTube and Facebook Live
- Video streaming using the HLS and MJPEG over HTTP protocols
- Video streaming to an AWS S3 bucket
- Video preview

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Networks Streamer Demo\readme.md

# Media Blocks SDK .Net - Social Networks Streamer Demo (WPF)

This SDK sample demonstrates the integration and use of the VisioForge MediaBlocks SDK within a WPF application to stream video and audio to various platforms, including YouTube and Facebook Live, or using the HLS protocol. You can also stream using an HTTP MJPEG stream or to an AWS S3 bucket. It shows how to set up a media pipeline using device enumeration for video and audio sources, video and audio rendering blocks, encoding with H.264 and AAC, and using different sink blocks for streaming. The application allows users to select their input devices, configure source settings, and stream to their chosen platform with real-time video and audio. It also provides error handling and dynamic streaming status updates.
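As a rough illustration of how such a streaming pipeline is assembled, here is a minimal C# sketch for the YouTube path. The settings types (`YouTubeSinkSettings`), the `CreateNewInput` pad call, and the stream-key value are assumptions based on typical Media Blocks usage and may differ between SDK versions; consult the block documentation for the exact API.

```csharp
// Sketch: webcam + microphone -> H.264/AAC -> YouTube (assumed API shapes).
var pipeline = new MediaBlocksPipeline();

// Capture devices; the settings objects come from device enumeration in the demo.
var videoSource = new SystemVideoSourceBlock(videoSourceSettings);
var audioSource = new SystemAudioSourceBlock(audioSourceSettings);

// Encode to the codecs the YouTube ingest expects.
var videoEncoder = new H264EncoderBlock();
var audioEncoder = new AACEncoderBlock();

// Sink configured with the stream key ("your-stream-key" is a placeholder).
var youtubeSink = new YouTubeSinkBlock(new YouTubeSinkSettings("your-stream-key"));

pipeline.Connect(videoSource.Output, videoEncoder.Input);
pipeline.Connect(audioSource.Output, audioEncoder.Input);
pipeline.Connect(videoEncoder.Output, youtubeSink.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(audioEncoder.Output, youtubeSink.CreateNewInput(MediaBlockPadMediaType.Audio));

await pipeline.StartAsync();
```

The other sinks listed below (HLS, MJPEG over HTTP, AWS S3, SRT) slot into the same encoder outputs in place of the YouTube sink.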
## Features

- Video streaming to YouTube and Facebook Live
- Video streaming using HLS and MJPEG over HTTP protocols
- Video streaming to an AWS S3 bucket
- Video preview

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- [SystemVideoSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemVideoSourceBlock/) - captures video from a device
- [SystemAudioSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemAudioSourceBlock/) - captures audio from a device
- [YouTubeSinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/YouTubeSinkBlock/) - streams video to YouTube
- [FacebookLiveSinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/FacebookLiveSinkBlock/) - streams video to Facebook Live
- `AWSS3SinkBlock` - streams video to an AWS S3 bucket
- [MP4SinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/MP4SinkBlock/) - saves video to an MP4 file
- [HTTPMJPEGLiveSinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/HTTPMJPEGLiveSinkBlock/) - streams video using the MJPEG over HTTP protocol
- [HLSSinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/HLSSinkBlock/) - streams video using the HLS protocol
- [SRTMPEGTSSinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/SRTMPEGTSSinkBlock/) - streams video using the SRT protocol with the MPEG-TS muxer
- [H264EncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoEncoders/H264EncoderBlock/) - encodes video using H.264

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Overlay Manager Demo\readme.es.md

# Media Blocks SDK .Net - Overlay Manager Demo (WPF)

This SDK sample shows how to create a multimedia application using the VisioForge Media Blocks SDK .Net. The application demonstrates features such as video and audio rendering, source management, and dynamic overlays, including text, images, and shapes. Users can select media files for playback, adjust the playback timeline, and add various overlays such as text, images, lines, rectangles, and circles on top of the video. The code sample includes error and stop event handling and demonstrates asynchronous programming patterns within a WPF application.

## Features

- Play media files
- Add overlays to the video

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Overlay Manager Demo\readme.md

# Media Blocks SDK .Net - Overlay Manager Demo (WPF)

This SDK sample demonstrates how to create a multimedia application using the VisioForge Media Blocks SDK .Net. The application showcases features such as video and audio rendering, source management, and dynamic overlays, including text, images, and shapes. Users can select media files for playback, adjust the playback timeline, and add various overlays like text, images, lines, rectangles, and circles on the video. The code sample includes error and stop event handling and demonstrates asynchronous programming patterns within a WPF application.
## Features

- Play media files
- Add overlays to the video

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- [UniversalSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/UniversalSourceBlock/) - decodes media files
- `OverlayManagerBlock` - manages overlays

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\RTSP Preview Demo\readme.es.md

# VisioForge Media Blocks SDK .Net

## RTSP Preview Demo (C#/WPF, cross-platform engine)

The provided code is a complete example of a Windows application developed using the VisioForge Media Blocks SDK .Net, designed to preview IP camera streams using the RTSP protocol. The application also demonstrates how to enumerate IP cameras on the local network using the ONVIF protocol.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-blocks-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\RTSP Preview Demo\readme.md

# VisioForge Media Blocks SDK .Net

## RTSP Preview Demo (C#/WPF, cross-platform engine)

The provided code is a comprehensive example of a Windows application developed using the VisioForge Media Blocks SDK .Net, which is designed for previewing IP camera streams using the RTSP protocol. The app also demonstrates how to enumerate IP cameras in the local network using the ONVIF protocol.
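A minimal C# sketch of an RTSP preview pipeline built from the blocks listed below. The `RTSPSourceSettings.CreateAsync` signature, the sample URL, and the credentials are assumptions for illustration; verify them against the RTSPSourceBlock documentation for your SDK version.

```csharp
// Sketch: RTSP camera preview (assumed settings factory signature).
var pipeline = new MediaBlocksPipeline();

// Connection details come from the demo's UI fields in practice.
var rtspSettings = await RTSPSourceSettings.CreateAsync(
    new Uri("rtsp://192.168.0.100:554/stream1"), "admin", "password", audioEnabled: false);
var rtspSource = new RTSPSourceBlock(rtspSettings);

// Render into the WPF VideoView control placed on the window.
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);

pipeline.Connect(rtspSource.VideoOutput, videoRenderer.Input);
await pipeline.StartAsync();
```

When audio is enabled, the source's audio output connects to an `AudioRendererBlock` in the same way.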
## Used blocks

- [RTSPSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/RTSPSourceBlock/) - captures video from an RTSP source
- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-blocks-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Screen Capture\readme.es.md

# Media Blocks SDK .Net - Screen Capture Demo (WPF)

This SDK sample shows how to implement a screen capture and recording application using the VisioForge Media Blocks SDK .Net in a WPF environment. The application demonstrates setting up a media block pipeline to capture screen content along with system audio, render both to the user interface, and encode them into a file. It highlights the use of screen and audio source blocks, video and audio renderer blocks, encoding blocks for H264 video and AAC audio, and saving the output to an MP4 file. It also includes device enumeration for selecting audio input and output devices, error handling, and the ability to toggle between preview and recording modes.
## Features

- Capture video from the screen to an MP4 file
- Video preview

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Screen Capture\readme.md

# Media Blocks SDK .Net - Screen Capture Demo (WPF)

This SDK sample demonstrates how to implement a screen capture and recording application using the VisioForge Media Blocks SDK .Net in a WPF environment. The application showcases the setup of a media block pipeline for capturing screen content along with system audio, rendering both to the user interface and encoding them into a file. It highlights the usage of screen and audio source blocks, video and audio renderer blocks, encoding blocks for H264 video and AAC audio, and saving the output to an MP4 file. Additionally, it includes device enumeration for selecting audio input and output devices, error handling, and the capability to toggle between preview and recording modes.
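The video half of that pipeline can be sketched as follows. The screen-capture settings type, the `Outputs` indexing on the tee, and the `CreateNewInput` call on the MP4 sink are assumptions based on common Media Blocks patterns; the demo source and the block documentation have the authoritative shapes.

```csharp
// Sketch: screen -> preview + H.264/MP4 recording (assumed API shapes).
var pipeline = new MediaBlocksPipeline();

// screenCaptureSettings would be built from the demo's capture options.
var screenSource = new ScreenSourceBlock(screenCaptureSettings);

// TeeBlock duplicates the video stream: one branch previews, one records.
var videoTee = new TeeBlock(2);
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
var videoEncoder = new H264EncoderBlock();
var mp4Sink = new MP4SinkBlock(new MP4SinkSettings("output.mp4"));

pipeline.Connect(screenSource.Output, videoTee.Input);
pipeline.Connect(videoTee.Outputs[0], videoRenderer.Input);
pipeline.Connect(videoTee.Outputs[1], videoEncoder.Input);
pipeline.Connect(videoEncoder.Output, mp4Sink.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();
```

The audio branch mirrors this: system audio source into a tee, one leg to the audio renderer and one through the AAC encoder into a second MP4 sink input.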
## Features

- Capture video from the screen to an MP4 file
- Video preview

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [ScreenSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/ScreenSourceBlock/) - captures video from the screen
- [SystemAudioSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemAudioSourceBlock/) - captures audio from the system
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- [TeeBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Special/TeeBlock/) - splits the media stream into two paths
- [H264EncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoEncoders/H264EncoderBlock/) - encodes the video stream using H264
- [AACEncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioEncoders/AACEncoderBlock/) - encodes the audio stream using AAC
- [MP4SinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/MP4SinkBlock/) - saves video to an MP4 file

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Simple Capture Demo\readme.es.md

# Media Blocks SDK .Net - Simple Capture Demo (WPF)

This SDK sample shows how to build a simple video capture application using the VisioForge Media Blocks SDK .Net in a WPF environment. The application initializes a media pipeline to capture video and audio from system devices, render them in real time, and optionally encode and save the output to an MP4 file.
It showcases device enumeration, video and audio source selection, real-time video and audio rendering, and file output capabilities. The sample includes error handling and UI elements for device and format selection, demonstrating an integrated approach to media capture and processing with VisioForge technology.

## Features

- Capture video from webcams to MP4 files
- Video preview

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Simple Capture Demo\readme.md

# Media Blocks SDK .Net - Simple Capture Demo (WPF)

This SDK sample demonstrates how to build a simple video capture application using the VisioForge Media Blocks SDK .Net in a WPF environment. The application initializes a media pipeline for capturing video and audio from system devices, rendering them in real-time, and optionally encoding and saving the output to an MP4 file. It showcases device enumeration, video and audio source selection, real-time video and audio rendering, and file output capabilities. The sample includes error handling and UI elements for device and format selection, demonstrating an integrated approach to media capture and processing with VisioForge's technology.
## Features

- Capture video from webcams to an MP4 file
- Video preview

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [SystemVideoSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemVideoSourceBlock/) - captures video from the webcam
- [SystemAudioSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemAudioSourceBlock/) - captures audio from the system
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- [TeeBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Special/TeeBlock/) - splits the media stream into two paths
- [H264EncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoEncoders/H264EncoderBlock/) - encodes the video stream using H264
- [AACEncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioEncoders/AACEncoderBlock/) - encodes the audio stream using AAC
- [MP4SinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/MP4SinkBlock/) - saves video to an MP4 file

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Simple Player Core Demo\readme.es.md

# Media Blocks SDK .Net - Simple Player Demo (WPF)

The Simple Player Core engine sample, part of the VisioForge Media Blocks SDK .Net, is a comprehensive example showing how to build a media player using the VisioForge framework.
This sample application demonstrates initializing the SDK, managing media playback (including play, pause, stop, and resume actions), handling user interactions for selecting media files, and adjusting playback settings such as volume and stream selection. The code leverages the SimplePlayerCoreX component for video playback, integrates with the Media Blocks for audio playback, and handles asynchronous operations for a responsive UI. It is designed for developers looking to integrate media playback features into their .NET applications, offering insights into event handling, UI updates based on playback progress, and media stream management.

## Features

- Play media files
- Play network streams
- Seeking

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Simple Player Core Demo\readme.md

# Media Blocks SDK .Net - Simple Player Demo (WPF)

The Simple Player Core engine sample, part of the VisioForge Media Blocks SDK .Net, is a comprehensive example showcasing how to build a media player using the VisioForge framework. This sample application demonstrates initializing the SDK, managing media playback (including play, pause, stop, and resume actions), handling user interactions for selecting media files, and adjusting playback settings like volume and stream selection. The code leverages the SimplePlayerCoreX component for video playback, integrates with the Media Blocks for audio rendering, and handles asynchronous operations for a responsive UI.
It's designed for developers looking to integrate media playback functionalities within their .NET applications, offering insights into event handling, UI updates based on playback progress, and media stream management.

## Features

- Play media files
- Play network streams
- Seeking

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Simple Player Demo WPF\readme.es.md

# Media Blocks SDK .Net - Simple Player Demo (WPF)

The sample demonstrates the integration of the VisioForge Media Blocks SDK .Net in a WPF application to create a simple media player. It shows how to initialize the SDK, create a media pipeline with audio and video rendering blocks, and handle media source files. The application includes a UI for selecting a file, controlling playback (start, stop, pause, resume), adjusting the volume, and displaying the current playback time alongside the total duration of the media. This example highlights the SDK's capabilities for custom media player development, including error handling and asynchronous operations for an improved user experience.
## Features

- Play media files
- Play network streams
- Seeking

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Simple Player Demo WPF\readme.md

# Media Blocks SDK .Net - Simple Player Demo (WPF)

The sample demonstrates the integration of the VisioForge Media Blocks SDK .Net within a WPF application to create a simple media player. It showcases how to initialize the SDK, create a media pipeline with audio and video rendering blocks, and handle media source files. The application features a UI for selecting a file, controlling playback (start, stop, pause, resume), adjusting volume, and displaying the current playtime alongside the total duration of the media. This example highlights the SDK's capabilities for custom media player development, including error handling and asynchronous operations for an enhanced user experience.
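The playback pipeline described above can be sketched in a few lines of C#. The `UniversalSourceSettings.CreateAsync` factory and the `VideoView1` control name follow common Media Blocks usage but are assumptions; check the linked block pages for the exact signatures in your SDK version.

```csharp
// Sketch: file playback with video and audio rendering (assumed API shapes).
var pipeline = new MediaBlocksPipeline();

// UniversalSourceBlock demuxes and decodes the selected file or network URL.
var sourceSettings = await UniversalSourceSettings.CreateAsync(new Uri(filename));
var source = new UniversalSourceBlock(sourceSettings);

// Renderers draw video into the WPF VideoView control and play audio
// on the default output device.
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
var audioRenderer = new AudioRendererBlock();

pipeline.Connect(source.VideoOutput, videoRenderer.Input);
pipeline.Connect(source.AudioOutput, audioRenderer.Input);

await pipeline.StartAsync();
```

Stopping playback and seeking go through the same pipeline object (e.g. `StopAsync` and a position-setting call) in the demo's button handlers.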
## Features

- Play media files
- Play network streams
- Seeking

## Used blocks

- [UniversalSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/UniversalSourceBlock/) - decodes media files
- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\SRT Source Demo\readme.es.md

# Media Blocks SDK .Net - SRT Source Demo (WPF)

The provided code sample demonstrates how to integrate VisioForge's MediaBlocks SDK to create an SRT source viewer within a WPF application. The Secure Reliable Transport (SRT) streaming protocol enables the delivery of high-quality, low-latency video and audio streams over the internet. Originally developed by Haivision, SRT optimizes streaming performance across unpredictable networks such as the public internet, addressing challenges such as packet loss, jitter, and fluctuating bandwidth. It uses end-to-end encryption to ensure security and integrates error recovery mechanisms to maintain stream integrity. SRT is an open-source protocol, widely adopted for its ability to maintain the quality and security of media streams in a variety of broadcast and streaming applications. The application sets up a media pipeline and connects to an SRT source based on user-provided settings such as the host, port, and password. It then displays the video output in the user interface.
This application also includes start and stop controls for the SRT viewer and a simple user interface for entering connection details. In addition, the application features error handling within the media pipeline and updates the recording time, demonstrating the SDK's potential for real-time media processing and display.

## Features

- Play video from an SRT source

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\SRT Source Demo\readme.md

# Media Blocks SDK .Net - SRT Source Demo (WPF)

The provided code sample demonstrates how to integrate VisioForge's MediaBlocks SDK to create an SRT source viewer within a WPF application. The Secure Reliable Transport (SRT) streaming protocol enables the delivery of high-quality, low-latency video and audio streams over the internet. Originally developed by Haivision, SRT optimizes streaming performance across unpredictable networks like the public internet by addressing challenges such as packet loss, jitter, and fluctuating bandwidth. It uses end-to-end encryption to ensure security and integrates error recovery mechanisms to maintain stream integrity. SRT is an open-source protocol, widely adopted for its ability to maintain the quality and security of media streams in various broadcast and streaming applications. The application sets up a media pipeline and connects to an SRT source based on settings provided by the user, such as the host, port, and password. It then displays the video output on the user interface. This application also includes start and stop controls for the SRT viewer and a simple user interface for entering connection details.
Furthermore, the application features error management within the media pipeline and updates the recording time, demonstrating the SDK's potential for real-time media processing and display.

## Features

- Play video from an SRT source

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [SRTSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SRTSourceBlock/) - reads and decodes video/audio from an SRT source
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\USB3V-GigE Spinnaker\readme.es.md

# Media Blocks SDK .Net - USB3V-GigE Spinnaker (FLIR/Teledyne) Demo (WPF)

Spinnaker Source Demo is an application that uses the Media Blocks SDK .Net to preview or capture video from cameras that support the Spinnaker SDK and are connected via USB 3 or GigE.

## Features

- Play video from a Spinnaker SDK-supported source

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\USB3V-GigE Spinnaker\readme.md

# Media Blocks SDK .Net - USB3V-GigE Spinnaker (FLIR/Teledyne) Demo (WPF)

Spinnaker Source Demo is an application that uses the Media Blocks SDK .Net to preview or capture video from cameras that support the Spinnaker SDK and are connected using USB 3 or GigE.
## Features

- Play video from a Spinnaker SDK-supported source

## Used blocks

- `SpinnakerSourceBlock` - captures video from a Spinnaker SDK-supported source
- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Video Compositor Demo\readme.es.md

# Media Blocks SDK .Net - Video Compositor Demo (WPF)

This SDK sample demonstrates a comprehensive approach to video composition and streaming using the VisioForge Media Blocks SDK .Net in a WPF application. The code showcases the creation and management of a media pipeline capable of capturing video from various sources, such as cameras, screens, or files. It also illustrates how to configure output streams for different platforms, including MP4 files, YouTube, Facebook Live, and NDI. In addition, the sample includes real-time video mixing, previewing, and dynamic modification of source properties, demonstrating the API's flexibility in handling complex video processing tasks.

## Features

- Video mixing of several video sources
- Video preview

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\Video Compositor Demo\readme.md

# Media Blocks SDK .Net - Video Compositor Demo (WPF)

This SDK sample demonstrates a comprehensive approach to video composition and streaming using the VisioForge Media Blocks SDK .Net in a WPF application.
The code showcases the creation and management of a media pipeline capable of capturing video from various sources such as cameras, screens, or files. It also illustrates how to configure output streams for different platforms, including MP4 files, YouTube, Facebook Live, and NDI. Furthermore, the sample includes functionality for real-time video mixing, previewing, and the dynamic modification of source properties, demonstrating the API's flexibility in handling complex video processing tasks.

## Features

- Video mixing of several video sources
- Video preview

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- [MP4OutputBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/MP4SinkBlock/) - saves video to an MP4 file
- [YouTubeOutputBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/YouTubeSinkBlock/) - streams video to YouTube
- [FacebookLiveOutputBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/FacebookLiveSinkBlock/) - streams video to Facebook Live
- [SystemVideoSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemVideoSourceBlock/) - captures video from a device
- [SystemAudioSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemAudioSourceBlock/) - captures audio from a device
- [NDISinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/NDISinkBlock/) - sends video using NDI
- [VideoMixerBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoProcessing/VideoMixerBlock/) - mixes video streams
- [VirtualAudioSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/VirtualAudioSourceBlock/) - creates an audio source

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---
[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\VNC Source Demo\readme.es.md

# Media Blocks SDK .Net - VNC Source Demo (WPF)

The provided code sample demonstrates how to integrate VisioForge's MediaBlocks SDK to create a VNC (Virtual Network Computing) source viewer within a WPF application. The application initializes a media pipeline, establishes a VNC source connection using user-specified settings (such as host, port, and password), and displays the video output in the user interface. It includes start and stop functions for the VNC viewer, along with a basic UI for entering connection details. Additionally, the application incorporates error handling within the media pipeline and updates the recording time, showcasing the SDK's capabilities for real-time media processing and display.

## Features

- Play video from a VNC/RFB source

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\WPF\CSharp\VNC Source Demo\readme.md

# Media Blocks SDK .Net - VNC Source Demo (WPF)

The provided code sample demonstrates how to integrate VisioForge's MediaBlocks SDK for creating a VNC (Virtual Network Computing) source viewer within a WPF application. The application initializes a media pipeline, establishes a VNC source connection using user-specified settings (such as host, port, and password), and renders the video output to the UI. It features start and stop functionality for the VNC viewer, along with a basic UI for inputting connection details. Additionally, the application incorporates error handling within the media pipeline and updates the recording time, showcasing the SDK's capabilities for real-time media processing and display.

## Features

- Play video from a VNC/RFB source

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- `VNCSourceBlock` - reads and decodes video/audio from a VNC/RFB source

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\h264-data-player\readme.es.md

# Media Blocks SDK .Net - H264 data player code snippet (C#/WinForms)

This SDK sample shows how to play RAW H264 data from a file or network stream using Media Blocks SDK for .Net.

## Used media blocks

* `PushSourceBlock` - push source block that feeds the data to the decoder
* `H264DecoderBlock` - to decode the H264 data
* `H264ParseBlock` - to parse RAW H264 data
* `VideoRendererBlock` - to display the video stream

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\h264-data-player\readme.md

# Media Blocks SDK .Net - H264 data player code snippet (C#/WinForms)

This SDK sample shows how to play RAW H264 data from a file or network stream using Media Blocks SDK for .Net.

## Used media blocks

* `PushSourceBlock` - push source block that feeds the data to the decoder
* `H264DecoderBlock` - to decode the H264 data
* `H264ParseBlock` - to parse RAW H264 data
* `VideoRendererBlock` - to display the video stream

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\ip-camera-capture-mp4\readme.es.md

# Media Blocks SDK .Net - IP camera capture to MP4 code snippet (C#/WinForms)

The sample shows how to capture video from an IP camera and save it to an MP4 file.

## Used media blocks

* `RTSPSourceBlock` - to connect to the IP camera and receive video frames
* `MP4OutputBlock` - to save video frames to an MP4 file
* `TeeBlock` - to split the video stream into two streams
* `VideoRendererBlock` - to display the video stream
* `AudioRendererBlock` - to play the audio stream

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\ip-camera-capture-mp4\readme.md

# Media Blocks SDK .Net - IP camera capture to MP4 code snippet (C#/WinForms)

The sample shows how to capture video from an IP camera and save it to an MP4 file.
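The pipeline behind this snippet can be sketched as follows. This is an illustrative sketch, not the sample's verbatim code: settings classes, constructor shapes, and pad names (`VideoOutput`, `Outputs[n]`, `Input`) follow the VisioForge Media Blocks documentation but should be treated as assumptions, and `VideoView1` is a placeholder for the UI video view control.

```csharp
// Illustrative sketch: RTSP source -> tee -> (preview + MP4 file).
// Names follow the VisioForge docs; verify exact signatures against the
// RTSPSourceBlock and MP4SinkBlock references before use.
var pipeline = new MediaBlocksPipeline();

var rtspSettings = await RTSPSourceSettings.CreateAsync(
    new Uri("rtsp://192.168.1.64:554/stream1"), "login", "password", audioEnabled: true);
var source = new RTSPSourceBlock(rtspSettings);

// Split the video so it can be previewed and recorded simultaneously.
var videoTee = new TeeBlock(2, MediaBlockPadMediaType.Video);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // VideoView1: UI control
var audioRenderer = new AudioRendererBlock();
var mp4Output = new MP4OutputBlock(new MP4SinkSettings("output.mp4"));

pipeline.Connect(source.VideoOutput, videoTee.Input);
pipeline.Connect(videoTee.Outputs[0], videoRenderer.Input);
pipeline.Connect(videoTee.Outputs[1], mp4Output.Input);
pipeline.Connect(source.AudioOutput, audioRenderer.Input);

await pipeline.StartAsync();
```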
## Used media blocks

* `RTSPSourceBlock` - to connect to the IP camera and receive video frames
* `MP4OutputBlock` - to save video frames to an MP4 file
* `TeeBlock` - to split the video stream into two streams
* `VideoRendererBlock` - to display the video stream
* `AudioRendererBlock` - to play the audio stream

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\ip-camera-preview\readme.es.md

# Media Blocks SDK .Net - IP camera preview code snippet (C#/WinForms)

This SDK sample shows how to preview video from an IP camera.

## Used media blocks

* `RTSPSourceBlock` - to connect to the IP camera and receive video frames
* `VideoRendererBlock` - to display the video stream
* `AudioRendererBlock` - to play the audio stream

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\ip-camera-preview\readme.md

# Media Blocks SDK .Net - IP camera preview code snippet (C#/WinForms)

This SDK sample shows how to preview video from an IP camera.

## Used media blocks

* `RTSPSourceBlock` - to connect to the IP camera and receive video frames
* `VideoRendererBlock` - to display the video stream
* `AudioRendererBlock` - to play the audio stream

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\media-player\readme.es.md

# Media Blocks SDK .Net - Media player code snippet (C#/WinForms)

This SDK sample shows how to create a simple media player application using Media Blocks SDK for .Net.

## Used media blocks

* `UniversalSourceBlock` - to read the media file and decode it
* `VideoRendererBlock` - to display the video stream
* `AudioRendererBlock` - to play the audio stream

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\media-player\readme.md

# Media Blocks SDK .Net - Media player code snippet (C#/WinForms)

This SDK sample shows how to create a simple media player application using Media Blocks SDK for .Net.
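The player pattern shared by these snippets can be sketched as follows, based on the documented `MediaBlocksPipeline`/`UniversalSourceBlock` usage. `VideoView1` is a placeholder for the WinForms/WPF video view control, and member names may vary slightly between SDK releases.

```csharp
// Minimal media player sketch: universal source -> video/audio renderers.
// Follows the documented UniversalSourceBlock pattern; VideoView1 is a
// placeholder for the UI video view control.
var pipeline = new MediaBlocksPipeline();
pipeline.OnError += (s, e) => Console.WriteLine(e.Message);

var sourceSettings = await UniversalSourceSettings.CreateAsync(new Uri(filename));
var source = new UniversalSourceBlock(sourceSettings);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
var audioRenderer = new AudioRendererBlock();

pipeline.Connect(source.VideoOutput, videoRenderer.Input);
pipeline.Connect(source.AudioOutput, audioRenderer.Input);

await pipeline.StartAsync();

// Later, to stop playback:
// await pipeline.StopAsync();
```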
## Used media blocks

* `UniversalSourceBlock` - to read the media file and decode it
* `VideoRendererBlock` - to display the video stream
* `AudioRendererBlock` - to play the audio stream

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\memory-player\readme.es.md

# Media Blocks SDK .Net - H264 data player code snippet (C#/WinForms)

This SDK sample shows how to play RAW H264 data from a file or network stream using Media Blocks SDK for .Net.

## Used media blocks

* `PushSourceBlock` - push source block that feeds the data to the decoder
* `H264DecoderBlock` - to decode the H264 data
* `H264ParseBlock` - to parse RAW H264 data
* `VideoRendererBlock` - to display the video stream

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\memory-player\readme.md

# Media Blocks SDK .Net - H264 data player code snippet (C#/WinForms)

This SDK sample shows how to play RAW H264 data from a file or network stream using Media Blocks SDK for .Net.

## Used media blocks

* `PushSourceBlock` - push source block that feeds the data to the decoder
* `H264DecoderBlock` - to decode the H264 data
* `H264ParseBlock` - to parse RAW H264 data
* `VideoRendererBlock` - to display the video stream

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\read-file-info\readme.es.md

# Media Player SDK .Net - Read file info code snippet (C#/WinForms)

This SDK sample shows how to build a Windows Forms application using the VisioForge Media Player SDK .Net to inspect media files. The application lets users select a file and then offers options to check whether the file is playable, read detailed information about the video, audio, and subtitle streams, and extract metadata tags. It shows how to configure the SDK, open a file dialog, and read and display media information, including codec details, duration, resolution, aspect ratio, frame rate, bitrate, and more, as well as how to handle audio and subtitle specifics.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\read-file-info\readme.md

# Media Player SDK .Net - Read file info code snippet (C#/WinForms)

This SDK sample demonstrates how to build a Windows Forms application using the VisioForge Media Player SDK .Net to inspect media files. The application allows users to select a file, then provides options to check whether the file is playable, read detailed information about the video, audio, and subtitle streams, and extract metadata tags. It showcases how to configure the SDK, open a file dialog, and read and display media information, including codec details, duration, resolution, aspect ratio, frame rate, bitrate, and more, as well as how to handle audio and subtitle specifics.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\screen-capture-avi\readme.es.md

# Media Blocks SDK .Net - Screen capture to AVI code snippet (C#/WinForms)

This SDK sample shows how to capture the screen and save it to an AVI file.

## Used media blocks

* `ScreenSourceBlock` - to capture the screen
* `AVIOutputBlock` - to save video frames to an AVI file
* `TeeBlock` - to split the video stream into two streams
* `VideoRendererBlock` - to display the video stream in a window

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\screen-capture-avi\readme.md

# Media Blocks SDK .Net - Screen capture to AVI code snippet (C#/WinForms)

This SDK sample shows how to capture the screen and save it to an AVI file.
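The capture pipeline can be sketched as below. This is an illustrative sketch only: `ScreenCaptureD3D11SourceSettings` and the AVI sink constructor are assumptions modeled on the VisioForge Media Blocks documentation, and `VideoView1` is a placeholder for the UI video view; verify exact type and member names against the ScreenSourceBlock docs.

```csharp
// Illustrative sketch: screen source -> tee -> (preview + AVI file).
// ScreenCaptureD3D11SourceSettings and AVISinkSettings are assumed names;
// consult the ScreenSourceBlock and AVI sink references for exact signatures.
var pipeline = new MediaBlocksPipeline();

var screenSettings = new ScreenCaptureD3D11SourceSettings { FrameRate = new VideoFrameRate(30) };
var source = new ScreenSourceBlock(screenSettings);

// Split the video so it can be previewed and recorded at the same time.
var tee = new TeeBlock(2, MediaBlockPadMediaType.Video);
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // VideoView1: UI control
var aviOutput = new AVIOutputBlock(new AVISinkSettings("output.avi"));

pipeline.Connect(source.Output, tee.Input);
pipeline.Connect(tee.Outputs[0], videoRenderer.Input);
pipeline.Connect(tee.Outputs[1], aviOutput.Input);

await pipeline.StartAsync();
```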
## Used media blocks

* `ScreenSourceBlock` - to capture the screen
* `AVIOutputBlock` - to save video frames to an AVI file
* `TeeBlock` - to split the video stream into two streams
* `VideoRendererBlock` - to display the video stream in a window

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\screen-capture-mp4\readme.es.md

# Media Blocks SDK .Net - Screen capture to MP4 code snippet (C#/WinForms)

This SDK sample shows how to capture the screen and save it to an MP4 file.

## Used media blocks

* `ScreenSourceBlock` - to capture the screen
* `MP4OutputBlock` - to save video frames to an MP4 file
* `TeeBlock` - to split the video stream into two streams
* `VideoRendererBlock` - to display the video stream in a window

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\screen-capture-mp4\readme.md

# Media Blocks SDK .Net - Screen capture to MP4 code snippet (C#/WinForms)

This SDK sample shows how to capture the screen and save it to an MP4 file.

## Used media blocks

* `ScreenSourceBlock` - to capture the screen
* `MP4OutputBlock` - to save video frames to an MP4 file
* `TeeBlock` - to split the video stream into two streams
* `VideoRendererBlock` - to display the video stream in a window

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\speaker-capture\readme.es.md

# Media Blocks SDK .Net - Speaker capture code snippet (C#/WinForms)

This SDK sample shows how to capture the speaker audio stream and save it to an M4A file.

## Used media blocks

* `SystemAudioSourceBlock` - to capture the speaker audio stream using the `LoopbackAudioCaptureDeviceSourceSettings` class
* `M4AOutputBlock` - to save audio frames to an M4A file

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\speaker-capture\readme.md

# Media Blocks SDK .Net - Speaker capture code snippet (C#/WinForms)

This SDK sample shows how to capture the speaker audio stream and save it to an M4A file.
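The loopback capture can be sketched as follows. `LoopbackAudioCaptureDeviceSourceSettings` is the class named by this sample; the `M4AOutputBlock` constructor shape is an assumption, so check the sink documentation for the exact signature.

```csharp
// Illustrative sketch: speaker (loopback) capture -> M4A file.
// LoopbackAudioCaptureDeviceSourceSettings is named by the sample text;
// the M4AOutputBlock constructor below is an assumption.
var pipeline = new MediaBlocksPipeline();

var loopbackSettings = new LoopbackAudioCaptureDeviceSourceSettings();
var audioSource = new SystemAudioSourceBlock(loopbackSettings);

var m4aOutput = new M4AOutputBlock("output.m4a");

pipeline.Connect(audioSource.Output, m4aOutput.Input);
await pipeline.StartAsync();
```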
## Used media blocks

* `SystemAudioSourceBlock` - to capture the speaker audio stream using the `LoopbackAudioCaptureDeviceSourceSettings` class
* `M4AOutputBlock` - to save audio frames to an M4A file

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\video-capture-image-overlay\readme.es.md

# Video Capture SDK .Net - Video capture with image overlay code snippet (C#/WinForms)

This code snippet shows how to capture video from a webcam and overlay an image on top of the video.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\video-capture-image-overlay\readme.md

# Video Capture SDK .Net - Video capture with image overlay code snippet (C#/WinForms)

This code snippet demonstrates how to capture video from a webcam and overlay an image on top of the video.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\video-capture-text-overlay\readme.es.md

# Media Blocks SDK .Net - Video capture with text overlay code snippet (C#/WinForms)

This SDK sample shows how to capture video from a webcam, overlay text on it, and display it in a window.

## Used media blocks

* `SystemVideoSourceBlock` - to capture the video stream from a webcam
* `SystemAudioSourceBlock` - to capture the audio stream from a microphone
* `TextOverlayBlock` - to overlay text on the video stream
* `VideoRendererBlock` - to display the video stream in a window
* `AudioRendererBlock` - to play the audio stream
* `TeeBlock` - to split the video stream into two streams
* `MP4OutputBlock` - to save video frames to an MP4 file

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\video-capture-text-overlay\readme.md

# Media Blocks SDK .Net - Video capture with text overlay code snippet (C#/WinForms)

This SDK sample shows how to capture video from a webcam, overlay text on it, and display it in a window.

## Used media blocks

* `SystemVideoSourceBlock` - to capture the video stream from a webcam
* `SystemAudioSourceBlock` - to capture the audio stream from a microphone
* `TextOverlayBlock` - to overlay text on the video stream
* `VideoRendererBlock` - to display the video stream in a window
* `AudioRendererBlock` - to play the audio stream
* `TeeBlock` - to split the video stream into two streams
* `MP4OutputBlock` - to save video frames to an MP4 file

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\video-capture-webcam-avi\readme.es.md

# Media Blocks SDK .Net - Video capture to AVI code snippet (C#/WinForms)

This SDK sample shows how to capture video from a webcam and save it to an AVI file.

## Used media blocks

* `SystemVideoSourceBlock` - to capture the video stream from a webcam
* `SystemAudioSourceBlock` - to capture the audio stream from a microphone
* `VideoRendererBlock` - to display the video stream in a window
* `AudioRendererBlock` - to play the audio stream
* `AVIOutputBlock` - to save video frames to an AVI file
* `TeeBlock` - to split the video stream into two streams

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\video-capture-webcam-avi\readme.md

# Media Blocks SDK .Net - Video capture to AVI code snippet (C#/WinForms)

This SDK sample shows how to capture video from a webcam and save it to an AVI file.

## Used media blocks

* `SystemVideoSourceBlock` - to capture the video stream from a webcam
* `SystemAudioSourceBlock` - to capture the audio stream from a microphone
* `VideoRendererBlock` - to display the video stream in a window
* `AudioRendererBlock` - to play the audio stream
* `AVIOutputBlock` - to save video frames to an AVI file
* `TeeBlock` - to split the video stream into two streams

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\video-capture-webcam-mp4\readme.es.md

# Media Blocks SDK .Net - Video capture to MP4 code snippet (C#/WinForms)

This SDK sample shows how to capture video from a webcam and save it to an MP4 file.

## Used media blocks

* `SystemVideoSourceBlock` - to capture the video stream from a webcam
* `SystemAudioSourceBlock` - to capture the audio stream from a microphone
* `VideoRendererBlock` - to display the video stream in a window
* `AudioRendererBlock` - to play the audio stream
* `MP4OutputBlock` - to save video frames to an MP4 file
* `TeeBlock` - to split the video stream into two streams

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\video-capture-webcam-mp4\readme.md

# Media Blocks SDK .Net - Video capture to MP4 code snippet (C#/WinForms)

This SDK sample shows how to capture video from a webcam and save it to an MP4 file.

## Used media blocks

* `SystemVideoSourceBlock` - to capture the video stream from a webcam
* `SystemAudioSourceBlock` - to capture the audio stream from a microphone
* `VideoRendererBlock` - to display the video stream in a window
* `AudioRendererBlock` - to play the audio stream
* `MP4OutputBlock` - to save video frames to an MP4 file
* `TeeBlock` - to split the video stream into two streams

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\video-preview-webcam-frame-capture\readme.es.md

# Media Blocks SDK .Net - Video preview from a webcam with a frame capture code snippet (C#/WinForms)

This SDK sample shows how to capture video from a webcam and display it in a window. The sample also shows how to capture video frames to a JPEG file.

## Used media blocks

* `SystemVideoSourceBlock` - to capture the video stream from a webcam
* `SystemAudioSourceBlock` - to capture the audio stream from a microphone
* `VideoRendererBlock` - to display the video stream in a window
* `AudioRendererBlock` - to play the audio stream
* `VideoSampleGrabberBlock` - to capture video frames

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\video-preview-webcam-frame-capture\readme.md

# Media Blocks SDK .Net - Video preview from a webcam with a frame capture code snippet (C#/WinForms)

This SDK sample shows how to capture video from a webcam and display it in a window. The sample also shows how to capture video frames to a JPEG file.

## Used media blocks

* `SystemVideoSourceBlock` - to capture the video stream from a webcam
* `SystemAudioSourceBlock` - to capture the audio stream from a microphone
* `VideoRendererBlock` - to display the video stream in a window
* `AudioRendererBlock` - to play the audio stream
* `VideoSampleGrabberBlock` - to capture video frames

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\webcam-preview\readme.es.md

# Media Blocks SDK .Net - Video preview from a webcam code snippet (C#/WinForms)

This SDK sample shows how to capture video from a webcam and display it in a window.
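The device-selection side of these webcam snippets can be sketched as below. `DeviceEnumerator` usage follows the documented pattern, but the settings-class members are assumptions and `VideoView1` is a placeholder for the UI video view; verify names against the SystemVideoSourceBlock documentation.

```csharp
// Illustrative sketch: enumerate the first webcam and preview it.
// DeviceEnumerator usage follows the documented pattern; settings members
// are assumptions, and VideoView1 is a UI placeholder.
var pipeline = new MediaBlocksPipeline();

var device = (await DeviceEnumerator.Shared.VideoSourcesAsync())[0];
var videoSettings = new VideoCaptureDeviceSourceSettings(device)
{
    Format = device.VideoFormats[0].ToFormat() // pick the first reported format
};
var videoSource = new SystemVideoSourceBlock(videoSettings);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(videoSource.Output, videoRenderer.Input);

// Audio preview mirrors this with SystemAudioSourceBlock + AudioRendererBlock.
await pipeline.StartAsync();
```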
## Bloques multimedia utilizados * `SystemVideoSourceBlock` - para capturar el flujo de video de una webcam * `SystemAudioSourceBlock` - para capturar el flujo de audio de un micrófono * `VideoRendererBlock` - para mostrar el flujo de vídeo en una ventana * `AudioRendererBlock` - para reproducir el flujo de audio ## Versiones de .Net compatibles * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Media Blocks SDK\_CodeSnippets\webcam-preview\readme.md # Media Blocks SDK .Net - Video preview from a webcam code snippet (C#/WinForms) This SDK sample shows how to capture video from a webcam and display it in a window. ## Used media blocks * `SystemVideoSourceBlock` - to capture the video stream from a webcam * `SystemAudioSourceBlock` - to capture the audio stream from a microphone * `VideoRendererBlock` - to display the video stream in a window * `AudioRendererBlock` - to play the audio stream ## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\Audio Player\readme.es.md # Media Player SDK .Net - Audio Player Demo (C#/WinForms) El ejemplo de código proporcionado muestra una aplicación de reproducción de audio construida utilizando el VisioForge Media Player SDK .Net en un entorno Windows Forms. La aplicación muestra funciones esenciales como la selección de archivos de audio, la reproducción, la pausa, la reanudación y la detención de la reproducción. Cuenta con una interfaz de usuario con botones para el control, un control deslizante de línea de tiempo para la búsqueda, y el ajuste de volumen y balance. 
El código destaca el uso de operaciones asíncronas para el control de la reproducción multimedia, el manejo de eventos para errores y eventos de parada, y demuestra cómo interactuar con las propiedades del SDK para fuente multimedia, dispositivos de salida y opciones de depuración. Este ejemplo es una demostración práctica de la creación de un reproductor de audio personalizado con funciones avanzadas como la gestión de listas de reproducción, la personalización de la salida de audio y el soporte de depuración para desarrolladores. [Visite la página del producto](https://www.visioforge.com/media-player-sdk-net) ## Versiones de .Net compatibles * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\Audio Player\readme.md # Media Player SDK .Net - Audio Player Demo (C#/WinForms) The provided code sample showcases an audio player application built using the VisioForge Media Player SDK .Net in a Windows Forms environment. The application demonstrates essential functionalities such as selecting audio files, playing, pausing, resuming, and stopping playback. It features a user interface with buttons for control, a timeline slider for seeking, and volume and balance adjustment. The code highlights the use of asynchronous operations for media playback control, event handling for errors and stopping events, and demonstrates how to interact with the SDK's properties for media source, output devices, and debugging options. This example is a practical demonstration of building a custom audio player with advanced features like playlist management, audio output customization, and debug support for developers. 
[Visit the product page.](https://www.visioforge.com/media-player-sdk-net) ## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\DVD Player\readme.es.md # Media Player SDK .Net - DVD Player Demo (C#/WinForms) Este ejemplo del SDK de C# muestra la integración de la funcionalidad de reproducción de DVD en una aplicación Windows Forms. Aprovechando el SDK de VisioForge, permite a los usuarios cargar y controlar contenidos de DVD, incluyendo funciones como la selección de capítulos, secuencias de audio y subtítulos. La muestra proporciona controles intuitivos para la manipulación de la reproducción, como reproducir, pausar, detener y navegar. Presenta una integración perfecta con VisioForge Media Player SDK .Net, ofreciendo una solución robusta para la reproducción de DVD dentro de aplicaciones personalizadas. ## Versiones de .Net compatibles * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/media-player-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\DVD Player\readme.md # Media Player SDK .Net - DVD Player Demo (C#/WinForms) This C# SDK sample demonstrates the integration of DVD playback functionality into a Windows Forms application. Leveraging the VisioForge SDK, it enables users to load and control DVD content, including features like selecting chapters, audio streams, and subtitles. The sample provides intuitive controls for playback manipulation, such as play, pause, stop, and navigation. It showcases seamless integration with the VisioForge Media Player SDK .Net, offering a robust solution for DVD playback within custom applications. 
## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/media-player-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\Encrypted Memory Playback Demo\readme.es.md # Media Player SDK .Net - Encrypted Memory Playback Demo (C#/WinForms) Explore la reproducción de memoria cifrada con este ejemplo del SDK. Utilizando la funcionalidad principal de VisioForge, este ejemplo demuestra cómo descifrar y reproducir flujos de medios cifrados sin problemas. Entre sus características se incluyen el barrido dinámico de la línea de tiempo, el control de la reproducción de audio y vídeo, y la gestión de errores para una experiencia de usuario robusta. ## Versiones de .Net compatibles * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/media-player-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\Encrypted Memory Playback Demo\readme.md # Media Player SDK .Net - Encrypted Memory Playback Demo (C#/WinForms) Explore encrypted memory playback with this SDK sample. Utilizing VisioForge's core functionality, this sample demonstrates how to decrypt and play encrypted media streams seamlessly. Features include dynamic timeline scrubbing, audio and video playback control, and error handling for a robust user experience. ## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/media-player-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\Karaoke Demo\readme.es.md # Media Player SDK .Net - Karaoke demo (C#/WinForms) Descubra la reproducción multimedia y el renderizado de archivos CDG con la muestra Karaoke Demo SDK. 
This C# code, using the VisioForge Media Player SDK .Net, lets you control media playback and visualize CDG content easily. Explore how to play, adjust audio, and view CDG files in a simple app.

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\Karaoke Demo\readme.md

# Media Player SDK .Net - Karaoke demo (C#/WinForms)

Discover multimedia playback and CDG file rendering with the Karaoke Demo SDK sample. This C# code, using the VisioForge Media Player SDK .Net, lets you control media playback and visualize CDG content easily. Explore how to play, adjust audio, and view CDG files in a simple app.

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\madVR Demo\readme.es.md

# Media Player SDK .Net - madVR Player Demo (C#/WinForms)

Explore media playback capabilities with the madVR Player SDK sample. This code snippet demonstrates integrating the SDK into a Windows Forms application to enable seamless video playback with features like file selection, audio control, and error handling. Experience high-quality video rendering using the madVR video renderer.
## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\madVR Demo\readme.md

# Media Player SDK .Net - madVR Player Demo (C#/WinForms)

Explore media playback capabilities with the madVR Player SDK sample. This code snippet demonstrates integrating the SDK into a Windows Forms application to enable seamless video playback with features like file selection, audio control, and error handling. Experience high-quality video rendering using the madVR video renderer.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\Main Demo\readme.es.md

# Media Player SDK .Net - Main Demo (C#/WinForms)

This SDK sample demonstrates how to create a media player application using various components and libraries. The application is capable of playing audio and video files using VisioForge's multimedia SDK for media playback and standard .NET libraries for UI and file handling. Features include audio effects like amplification, equalization, dynamic amplification, 3D sound, true bass, and pitch shift. Video capabilities include adjusting brightness and saturation, adding grayscale or invert effects, and handling on-screen display (OSD) for text and images. The application also supports audio channel mapping, zooming, and file information reading.
## Basic features

* audio and video file playback
* network source playback
* apply video and audio effects
* apply OSD
* detect motion
* recognize barcodes
* many other features are available

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\Main Demo\readme.md

# Media Player SDK .Net - Main Demo (C#/WinForms)

This SDK sample demonstrates how to create a media player application using various components and libraries. The application is capable of playing audio and video files using VisioForge's multimedia SDK for media playback and standard .NET libraries for UI and file handling. Features include audio effects like amplification, equalization, dynamic amplification, 3D sound, true bass, and pitch shift. Video capabilities include adjusting brightness and saturation, adding grayscale or invert effects, and handling on-screen display (OSD) for text and images. The application also supports audio channel mapping, zooming, and file information reading.

## Basic features

* audio and video file playback
* network source playback
* apply video and audio effects
* apply OSD
* detect motion
* recognize barcodes
* many other features are available

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\Memory Stream\readme.es.md

# Media Player SDK .Net - Memory Playback Demo (C#/WinForms)

This SDK sample demonstrates how to build a media player application using the VisioForge Media Player SDK .Net in a Windows Forms application.
It covers creating a media player engine, playing media files from both file streams and memory streams, and handling user interactions such as play, pause, stop, and timeline navigation. Additionally, it includes features for adjusting playback speed, volume, and balance, and showcases how to handle errors and display debug information. The application allows users to select video files, view them within the application, and control playback in a user-friendly interface.

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\Memory Stream\readme.md

# Media Player SDK .Net - Memory Playback Demo (C#/WinForms)

This SDK sample demonstrates how to build a media player application using the VisioForge Media Player SDK .Net in a Windows Forms application. It covers creating a media player engine, playing media files from both file streams and memory streams, and handling user interactions such as play, pause, stop, and timeline navigation. Additionally, it includes features for adjusting playback speed, volume, and balance, and showcases how to handle errors and display debug information. The application allows users to select video files, view them within the application, and control playback in a user-friendly interface.
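The memory-stream path described above boils down to reading the whole file into a `MemoryStream` so that playback never touches the file afterwards. A minimal sketch of that loading half follows; the hand-off to the player is shown only as comments, because the exact VisioForge member names are an assumption here, not verified SDK API.

```csharp
using System;
using System.IO;

// Load a media file fully into memory so the player can read and seek
// without holding the file open. Returned stream is read-only.
static MemoryStream LoadIntoMemory(string path)
{
    byte[] bytes = File.ReadAllBytes(path);
    // writable: false — the player only needs to read and seek.
    return new MemoryStream(bytes, writable: false);
}

// Hand-off sketch (hypothetical member names, not verified SDK API):
// var ms = LoadIntoMemory("video.mp4");
// player.Source_MemoryStream = ms;   // assumed property
// await player.PlayAsync();          // assumed method
```

Loading the entire file trades memory for simplicity; for very large files the demo's file-stream path avoids that cost.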
## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\Multiple Video Streams\readme.es.md

# Media Player SDK .Net - Multiple Video Streams Demo (C#/WinForms)

This SDK sample demonstrates how to create a Windows Forms application using the VisioForge Media Player SDK .Net for handling multiple video streams. The application allows users to select a file, play it, and manage playback with options to pause, resume, stop, and navigate through the video timeline. It supports displaying up to four video streams simultaneously on different panels within the interface. The code includes methods for initializing and disposing of the media player, error handling, and updating the UI based on playback events.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\Multiple Video Streams\readme.md

# Media Player SDK .Net - Multiple Video Streams Demo (C#/WinForms)

This SDK sample demonstrates how to create a Windows Forms application using the VisioForge Media Player SDK .Net for handling multiple video streams. The application allows users to select a file, play it, and manage playback with options to pause, resume, stop, and navigate through the video timeline. It supports displaying up to four video streams simultaneously on different panels within the interface. The code includes methods for initializing and disposing of the media player, error handling, and updating the UI based on playback events.
## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\Seamless Playback\readme.es.md

# Media Player SDK .Net - Seamless Playback Demo (C#/WinForms)

The provided code sample is a Windows Forms application utilizing the VisioForge Media Player SDK .NET. This demo showcases how to create a media player capable of playing videos, with the ability to add files to a playlist, control playback (play, pause, stop, resume), adjust playback speed, and manage seamless playback between multiple files. It demonstrates the initialization and management of two media player engines to enable smooth transitions between videos in a playlist. The application includes functionality for error handling, updating the UI based on playback events, and dynamically adjusting the video timeline and playback speed based on user interaction.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\Seamless Playback\readme.md

# Media Player SDK .Net - Seamless Playback Demo (C#/WinForms)

The provided code sample is a Windows Forms application utilizing the VisioForge Media Player SDK .NET. This demo showcases how to create a media player capable of playing videos, with the ability to add files to a playlist, control playback (play, pause, stop, resume), adjust playback speed, and manage seamless playback between multiple files.
It demonstrates the initialization and management of two media player engines to enable smooth transitions between videos in a playlist. The application includes functionality for error handling, updating the UI based on playback events, and dynamically adjusting the video timeline and playback speed based on user interaction.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\Simple Video Player\readme.es.md

# Media Player SDK .Net - Simple Video Player Demo (C#/WinForms)

This SDK sample describes a simple video player application developed using the VisioForge Media Player SDK .Net. It demonstrates how to create, control, and dispose of a media player within a Windows Forms application. Users can select a video file to play, adjust playback speed, volume, and balance, and navigate through the video by moving to the next or previous frame.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\Simple Video Player\readme.md

# Media Player SDK .Net - Simple Video Player Demo (C#/WinForms)

This SDK sample describes a simple video player application developed using the VisioForge Media Player SDK .Net. It demonstrates how to create, control, and dispose of a media player within a Windows Forms application. Users can select a video file to play, adjust playback speed, volume, and balance, and navigate through the video by moving to the next or previous frame.
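The play/pause/stop/resume control flow that recurs across these demos can be modeled as a small state machine, independent of the SDK. The helper below is purely illustrative (`MediaPlayerCore` performs the real playback); each function returns `false` for an invalid transition, which is how a UI can decide when to enable or disable its buttons.

```csharp
using System;

// Tiny playback state machine behind Play/Pause/Resume/Stop buttons.
// Illustrative only — not part of the VisioForge API.
string state = "Stopped";

bool Play()   { if (state == "Playing") return false; state = "Playing"; return true; }
bool Pause()  { if (state != "Playing") return false; state = "Paused";  return true; }
bool Resume() { if (state != "Paused")  return false; state = "Playing"; return true; }
bool Stop()   { if (state == "Stopped") return false; state = "Stopped"; return true; }

Console.WriteLine($"initial: {state}");  // prints "initial: Stopped"
```

Keeping transition rules in one place avoids the classic WinForms bug of a Pause handler firing while the player is already stopped.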
## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\Two Windows\readme.es.md

# Media Player SDK .Net - Two Windows Demo (C#/WinForms)

This SDK sample demonstrates how to create a multimedia application using the VisioForge Media Player SDK .NET. The application features two windows: the main window (`Form1`) for controlling media playback (play, pause, stop, resume, adjust volume/balance, and navigate through the timeline) and a secondary window (`Form2`) for displaying video. It showcases the use of the `MediaPlayerCore` class to handle media playback, including setting up a playlist, looping playback, adjusting audio settings, and managing multi-screen setups. The code also illustrates handling user interactions, such as file selection and adjusting playback settings through the UI. The application logs errors and supports debugging modes for troubleshooting.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\Two Windows\readme.md

# Media Player SDK .Net - Two Windows Demo (C#/WinForms)

This SDK sample demonstrates how to create a multimedia application using the VisioForge Media Player SDK .NET. The application features two windows: the main window (`Form1`) for controlling media playback (play, pause, stop, resume, adjust volume/balance, and navigate through the timeline) and a secondary window (`Form2`) for displaying video.
It showcases the use of the `MediaPlayerCore` class to handle media playback, including setting up a playlist, looping playback, adjusting audio settings, and managing multi-screen setups. The code also illustrates handling user interactions, such as file selection and adjusting playback settings through the UI. The application logs errors and supports debugging modes for troubleshooting.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\Video Mixing Demo\readme.es.md

# Media Player SDK .Net - Video Mixing Demo (C#/WinForms)

This SDK sample demonstrates how to create a video mixing application using VisioForge's Media Player SDK .Net in a Windows Forms application. It showcases how to manage a playlist, including adding files and adjusting their properties like position and transparency through Picture-in-Picture (PIP) functionality. The code covers initializing and disposing of the media player, handling playback controls like play, pause, stop, and resume, as well as updating the UI with video playback information. It emphasizes handling multimedia files effectively, providing a user-friendly interface for video playback and manipulation.
## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\Video Mixing Demo\readme.md

# Media Player SDK .Net - Video Mixing Demo (C#/WinForms)

This SDK sample demonstrates how to create a video mixing application using VisioForge's Media Player SDK .Net in a Windows Forms application. It showcases how to manage a playlist, including adding files and adjusting their properties like position and transparency through Picture-in-Picture (PIP) functionality. The code covers initializing and disposing of the media player, handling playback controls like play, pause, stop, and resume, as well as updating the UI with video playback information. It emphasizes handling multimedia files effectively, providing a user-friendly interface for video playback and manipulation.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\VR 360 Demo\readme.es.md

# Media Player SDK .Net - VR 360 Demo (C#/WinForms)

The sample showcases how to create an immersive 360-degree video player using the VisioForge Media Player SDK .Net in a Windows Forms application. This example demonstrates initializing the media player, managing playback controls like play, pause, stop, and resume, and adjusting video properties for VR content, including yaw, pitch, roll, and field of view (FOV). It also includes functionalities for selecting video files, adjusting volume and balance, navigating through the video timeline, and handling errors and playback completion events.
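Because the VR view is driven by yaw, pitch, roll, and FOV values coming from UI controls, it helps to normalize them before handing them to the player. The helpers below and the chosen limits are illustrative assumptions (typical ranges for equirectangular viewers), not part of the VisioForge API.

```csharp
using System;

// Keep UI-driven VR view parameters in sane ranges.
// Helpers and limits are illustrative, not SDK API.
static double WrapYaw(double degrees) =>
    ((degrees + 180.0) % 360.0 + 360.0) % 360.0 - 180.0;  // wrap into [-180, 180)

static double ClampPitch(double degrees) =>
    Math.Clamp(degrees, -90.0, 90.0);  // straight down .. straight up

static double ClampFov(double degrees) =>
    Math.Clamp(degrees, 30.0, 120.0);  // assumed usable FOV range
```

Wrapping yaw (rather than clamping it) lets the user spin the view continuously, while pitch and FOV are hard-limited to avoid flipping the camera or degenerate projections.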
## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\VR 360 Demo\readme.md

# Media Player SDK .Net - VR 360 Demo (C#/WinForms)

The sample showcases how to create an immersive 360-degree video player using the VisioForge Media Player SDK .Net in a Windows Forms application. This example demonstrates initializing the media player, managing playback controls like play, pause, stop, and resume, and adjusting video properties for VR content, including yaw, pitch, roll, and field of view (FOV). It also includes functionalities for selecting video files, adjusting volume and balance, navigating through the video timeline, and handling errors and playback completion events.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\YouTube Player Demo\readme.es.md

# Media Player SDK .Net - YouTube Player Demo (C#/WinForms)

This SDK sample demonstrates how to create a simple YouTube video player using C#. It leverages the VisioForge Media Player SDK .Net and the YoutubeExplode library to fetch, select, and play video and audio streams from YouTube. Users can select their preferred video and audio formats, start and stop playback, and navigate through the video timeline. The sample includes error handling to improve reliability and user experience.
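The fetch-and-select step can be sketched with YoutubeExplode. The calls below match the API of the 6.x releases (verify against the version the demo references); the final hand-off to the player uses assumed member names, not verified VisioForge API, and running this requires network access.

```csharp
using System;
using System.Threading.Tasks;
using YoutubeExplode;
using YoutubeExplode.Videos.Streams;

// Resolve a playable stream URL for a video chosen in the UI.
string videoUrl = "https://www.youtube.com/watch?v=...";  // placeholder

var youtube = new YoutubeClient();

// Fetch the manifest listing all available streams for the video.
var manifest = await youtube.Videos.Streams.GetManifestAsync(videoUrl);

// Pick the highest-quality muxed stream (audio + video in one URL).
var streamInfo = manifest.GetMuxedStreams().GetWithHighestVideoQuality();

// The demo then points the player at the resolved URL:
// player.Filename = streamInfo.Url;   // assumed member name
// await player.PlayAsync();           // assumed member name
Console.WriteLine(streamInfo.Url);
```

Separate audio and video streams can be resolved the same way via `GetAudioOnlyStreams()` and `GetVideoOnlyStreams()`, which is how the demo offers independent format selection.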
## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\CSharp\YouTube Player Demo\readme.md

# Media Player SDK .Net - YouTube Player Demo (C#/WinForms)

This SDK sample demonstrates how to create a simple YouTube video player using C#. It leverages the VisioForge Media Player SDK .Net and the YoutubeExplode library to fetch, select, and play video and audio streams from YouTube. Users can select their preferred video and audio formats, start and stop playback, and navigate through the video timeline. The sample includes error handling to improve reliability and user experience.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\VB .Net\Audio Player\readme.es.md

# Media Player SDK .Net - Audio Player Demo (VB.Net/WinForms)

This SDK sample demonstrates how to create a simple media player application using the VisioForge Media Player SDK .Net in a Windows Forms application written in VB.NET. It covers basic functionalities such as selecting a media file, playing, pausing, resuming, and stopping media playback. Additionally, it includes features for adjusting the volume and balance of the audio output, navigating through the media timeline, and displaying playback errors. The code is structured to asynchronously handle media operations, ensuring a responsive user interface during playback.
## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\VB .Net\Audio Player\readme.md

# Media Player SDK .Net - Audio Player Demo (VB.Net/WinForms)

This SDK sample demonstrates how to create a simple media player application using the VisioForge Media Player SDK .Net in a Windows Forms application written in VB.NET. It covers basic functionalities such as selecting a media file, playing, pausing, resuming, and stopping media playback. Additionally, it includes features for adjusting the volume and balance of the audio output, navigating through the media timeline, and displaying playback errors. The code is structured to asynchronously handle media operations, ensuring a responsive user interface during playback.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\VB .Net\DVD Player\readme.es.md

# Media Player SDK .Net - DVD Player Demo (VB.Net/WinForms)

This sample code demonstrates how to use the VisioForge Media Player SDK .Net to create a DVD player within a Windows Forms application. The application is capable of playing DVDs, adjusting volume and balance, navigating through DVD titles, chapters, and subtitles, and controlling playback speed. Users can select media files through a standard file dialog and manipulate playback with GUI controls like buttons and sliders for volume, balance, and timeline navigation.
Additionally, it showcases the handling of media playback events such as play, pause, stop, and errors, providing a comprehensive example of integrating media playback functionality into a custom application interface.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\VB .Net\DVD Player\readme.md

# Media Player SDK .Net - DVD Player Demo (VB.Net/WinForms)

This sample code demonstrates how to use the VisioForge Media Player SDK .Net to create a DVD player within a Windows Forms application. The application is capable of playing DVDs, adjusting volume and balance, navigating through DVD titles, chapters, and subtitles, and controlling playback speed. Users can select media files through a standard file dialog and manipulate playback with GUI controls like buttons and sliders for volume, balance, and timeline navigation. Additionally, it showcases the handling of media playback events such as play, pause, stop, and errors, providing a comprehensive example of integrating media playback functionality into a custom application interface.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\VB .Net\Main Demo\readme.es.md

# Media Player SDK .Net - Main Demo (VB.Net/WinForms)

This software development kit (SDK) sample demonstrates how to utilize the VisioForge Media Player SDK .Net for enhanced media playback and processing in VB.NET applications. It showcases advanced features such as audio effects (e.g., equalization, pitch shift), video effects (e.g., zoom, pan), and output settings (e.g., screenshots, OSD). The code also highlights event handling for media playback controls and demonstrates how to apply custom video and audio processing effects, adjust playback settings, and interact with the media player's advanced capabilities.

## Features

* audio and video file playback
* network source playback
* apply video and audio effects
* apply OSD
* detect motion
* recognize barcodes
* many other features are available

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\VB .Net\Main Demo\readme.md

# Media Player SDK .Net - Main Demo (VB.Net/WinForms)

This software development kit (SDK) sample demonstrates how to utilize the VisioForge Media Player SDK .Net for enhanced media playback and processing in VB.NET applications. It showcases advanced features such as audio effects (e.g., equalization, pitch shift), video effects (e.g., zoom, pan), and output settings (e.g., screenshots, OSD). The code also highlights event handling for media playback controls and demonstrates how to apply custom video and audio processing effects, adjust playback settings, and interact with the media player's advanced capabilities.
## Features

* audio and video file playback
* network source playback
* apply video and audio effects
* apply OSD
* detect motion
* recognize barcodes
* many other features are available

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\VB .Net\Memory Stream\readme.es.md

# Media Player SDK .Net - Memory Playback Demo (VB.Net/WinForms)

This SDK sample demonstrates how to create a media player application using the VisioForge Media Player SDK in Visual Basic .NET. It showcases how to play audio and video files from both file streams and memory streams. Users can control playback through UI elements to play, pause, stop, resume, adjust the volume, and seek through a timeline. Additionally, it supports adjusting playback speed and balance. The code also highlights how to dynamically load media, determine the availability of audio and video streams, and select appropriate output devices and renderers based on the media content.

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\VB .Net\Memory Stream\readme.md

# Media Player SDK .Net - Memory Playback Demo (VB.Net/WinForms)

This SDK sample demonstrates how to create a media player application using the VisioForge Media Player SDK in Visual Basic .NET. It showcases how to play audio and video files from both file streams and memory streams. Users can control playback through UI elements to play, pause, stop, resume, adjust the volume, and seek through a timeline.
Additionally, it supports adjusting playback speed and balance. The code also highlights how to dynamically load media, determine the availability of audio and video streams, and select appropriate output devices and renderers based on the media content.

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\VB .Net\Two Windows\readme.es.md

# Media Player SDK .Net - Two Windows Demo (VB.Net/WinForms)

This sample demonstrates how to integrate a media player into a VB .NET application using VisioForge's Media Player SDK .Net. It showcases functionality such as playing, pausing, resuming, and stopping media files, adjusting playback speed, volume, and balance, and handling media file selection through an open file dialog.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\VB .Net\Two Windows\readme.md

# Media Player SDK .Net - Two Windows Demo (VB.Net/WinForms)

This sample demonstrates how to integrate a media player into a VB .NET application using VisioForge's Media Player SDK .Net. It showcases functionality such as playing, pausing, resuming, and stopping media files, adjusting playback speed, volume, and balance, and handling media file selection through an open file dialog.
## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\VB .Net\Video Player\readme.es.md

# Media Player SDK .Net - Simple Video Player Demo (VB.Net/WinForms)

This SDK sample demonstrates how to integrate and utilize the VisioForge Media Player SDK in a VB.NET application to create a comprehensive media player. The code covers initializing the media player engine, loading and playing media files, and providing playback controls to play, pause, stop, and resume, adjust the volume and balance, and navigate through the media timeline. It showcases how to handle media file selection, implement asynchronous operations for media control, and manage playback settings, including loop mode and audio output. Additionally, the example handles user interactions for media navigation and displays media playback information dynamically.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinForms\VB .Net\Video Player\readme.md

# Media Player SDK .Net - Simple Video Player Demo (VB.Net/WinForms)

This SDK sample demonstrates how to integrate and utilize the VisioForge Media Player SDK in a VB.NET application to create a comprehensive media player. The code covers initializing the media player engine, loading and playing media files, and providing playback controls to play, pause, stop, and resume, adjust the volume and balance, and navigate through the media timeline.
It showcases how to handle media file selection, implement asynchronous operations for media control, and manage playback settings, including loop mode and audio output. Additionally, the example handles user interactions for media navigation and displays media playback information dynamically.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinUI\CSharp\Simple Media Player WinUI\readme.es.md

# Media Player SDK .Net - Simple Video Player Demo (C#/WinUI)

This WinUI 3 desktop application demonstrates the integration of the VisioForge Media Player SDK .NET. It features a user-friendly interface for playing, pausing, resuming, and stopping video playback. Users can open video files with the FileOpenPicker and control playback through a slider that adjusts the video position. The application also includes a custom video background setting and periodically updates the UI to reflect the current playback position and duration. It demonstrates handling media files, implementing a DispatcherTimer for UI updates, and managing app window properties, such as resizing and setting icons, using WinUI and Windows App SDK APIs.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WinUI\CSharp\Simple Media Player WinUI\readme.md

# Media Player SDK .Net - Simple Video Player Demo (C#/WinUI)

This WinUI 3 desktop application showcases the integration of the VisioForge Media Player SDK .NET.
It features a user-friendly interface for playing, pausing, resuming, and stopping video playback. Users can open video files using the FileOpenPicker, and control playback through a slider that adjusts the video position. The application also includes a custom video background setting and periodically updates the UI to reflect the current playback position and duration. It demonstrates handling media files, implementing a DispatcherTimer for UI updates, and managing app window properties, such as resizing and setting icons, using WinUI and Windows App SDK APIs.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WPF\CSharp\madVR Demo\readme.es.md

# Media Player SDK .Net - madVR Demo (C#/WPF)

The provided SDK sample demonstrates the integration and use of the VisioForge Media Player SDK within a WPF application. It shows how to implement basic media playback functionality, including play, stop, and file selection, using the `MediaPlayerCore` class. The sample also illustrates the use of the madVR video renderer for enhanced video playback.

## Features

* Audio and video file playback
* Network source playback
* madVR video renderer usage

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WPF\CSharp\madVR Demo\readme.md

# Media Player SDK .Net - madVR Demo (C#/WPF)

The provided SDK sample demonstrates the integration and usage of the VisioForge Media Player SDK within a WPF application.
It showcases how to implement basic media playback functionality, including play, stop, and file selection, using the `MediaPlayerCore` class. The sample also illustrates the use of the madVR video renderer for enhanced video playback.

## Features

* Audio and video file playback
* Network source playback
* madVR video renderer usage

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WPF\CSharp\Main Demo\readme.es.md

# Media Player SDK .Net - Main Demo (C#/WPF)

This SDK sample shows how to integrate and use the VisioForge Media Player SDK .Net in a WPF application for advanced media playback capabilities. It demonstrates loading and playing media files, applying various audio and video effects, handling events such as errors and media information, and implementing custom media processing features. The sample includes the use of audio effects such as an equalizer and dynamic amplification, video effects, deinterlacing, and motion detection. This comprehensive example serves as a practical guide for developers to leverage VisioForge's powerful media processing functionality in their WPF applications.
## Features

* Audio and video file playback
* Network source playback
* Apply video and audio effects
* Apply OSD
* Detect motion
* Recognize barcodes
* Many other features are available

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WPF\CSharp\Main Demo\readme.md

# Media Player SDK .Net - Main Demo (C#/WPF)

This SDK sample demonstrates how to integrate and utilize the VisioForge Media Player SDK .Net in a WPF application for advanced media playback capabilities. It showcases loading and playing media files, applying various audio and video effects, handling events such as errors and media information, and implementing custom media processing features. The example includes the use of audio effects like EQ and dynamic amplification, video effects, deinterlacing, and motion detection. This comprehensive sample serves as a practical guide for developers to leverage VisioForge's powerful media processing functionalities in their WPF applications.
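The effect and detection features this demo covers could be wired up roughly as below. Treat this as a loose sketch only: every member and type name here (`Audio_Effects_Add`, `MotionDetection`, the event signature) is a placeholder assumption, since the exact VisioForge API is not shown in this readme.

```csharp
// Rough, hypothetical outline of enabling effects and motion detection.
// All member and type names are placeholders, not documented VisioForge API.
player.Audio_Effects_Enabled = true;
player.Audio_Effects_Add(new AudioEqualizerEffect());  // EQ (placeholder type)
player.Audio_Effects_Add(new DynamicAmplifyEffect());  // dynamic amplification (placeholder)

player.Video_Effects_Enabled = true;
player.Video_Effects_Add(new DeinterlaceEffect());     // deinterlacing (placeholder)

player.MotionDetection = new MotionDetectionSettings { Enabled = true }; // placeholder
player.OnMotionDetected += (sender, args) =>
{
    // react to the reported motion level (placeholder event)
    Console.WriteLine($"Motion level: {args.Level}");
};
```

See the Main Demo source code for the real effect types and event names.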
## Features

* Audio and video file playback
* Network source playback
* Apply video and audio effects
* Apply OSD
* Detect motion
* Recognize barcodes
* Many other features are available

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WPF\CSharp\Nvidia Maxine Player\readme.es.md

# Media Player SDK .Net - Nvidia Maxine Player Demo (C#/WPF)

The provided code is an SDK implementation sample for a multimedia player application using VisioForge's Media Player SDK, specifically showcasing integration with Nvidia Maxine SDK video effects. The application, built with C# and WPF, demonstrates functionality such as playing, pausing, resuming, and stopping media files. It lets users apply advanced video effects powered by Nvidia Maxine, including denoising, artifact reduction, upscaling, and super-resolution, configurable through a user interface. The code includes event handling for playback errors and stop events, UI interactions for file selection, and dynamic video effect adjustments based on user input.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WPF\CSharp\Nvidia Maxine Player\readme.md

# Media Player SDK .Net - Nvidia Maxine Player Demo (C#/WPF)

The provided code snippet is a sample of an SDK implementation for a multimedia player application using VisioForge's Media Player SDK, specifically showcasing integration with Nvidia Maxine SDK video effects.
The application, built with C# and WPF, demonstrates functionalities such as playing, pausing, resuming, and stopping media files. It allows users to apply advanced video effects powered by Nvidia Maxine, including denoise, artifact reduction, upscale, and super-resolution, configurable through a user interface. The code includes event handling for media playback errors and stop events, UI interactions for file selection, and dynamic video effect adjustments based on user input.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WPF\CSharp\Skinned Player\readme.es.md

# Media Player SDK .Net - Skinned Player Demo (C#/WPF)

This SDK sample shows how to create a skinned media player application using the VisioForge Media Player SDK. It demonstrates the implementation of custom skins, media playback controls, and full-screen toggle functionality in a WPF application.

## Features

* Audio and video file playback
* Network source playback
* Skinned interface

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\WPF\CSharp\Skinned Player\readme.md

# Media Player SDK .Net - Skinned Player Demo (C#/WPF)

This SDK sample demonstrates how to create a skinned media player application using the VisioForge Media Player SDK. It showcases the implementation of custom skins, media playback controls, and full-screen toggle functionality in a WPF application.
## Features

* Audio and video file playback
* Network source playback
* Skinned interface

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\_CodeSnippets\memory-playback\readme.es.md

# Media Player SDK .Net - Memory Playback code snippet (C#/WinForms)

This SDK sample demonstrates the integration of VisioForge's media playback capabilities into a Windows Forms application. It shows how to open a video file, read the file into a byte array, and then play it directly from memory using VisioForge's MediaPlayerCore. The sample includes handling both audio and video streams, checking their availability, and setting appropriate playback settings. It leverages VisioForge's memory stream playback capabilities, demonstrating how to create a ManagedIStream from a MemoryStream containing the video file's data and then start playback within the MediaPlayerCore instance.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\_CodeSnippets\memory-playback\readme.md

# Media Player SDK .Net - Memory Playback code snippet (C#/WinForms)

This SDK sample demonstrates the integration of VisioForge's media playback capabilities into a Windows Forms application. It showcases how to open a video file, read the file into a byte array, and then play it directly from memory using VisioForge's MediaPlayerCore. The sample includes handling both audio and video streams, checking for their availability, and setting appropriate playback settings.
It leverages VisioForge's capabilities for memory stream playback, demonstrating how to create a ManagedIStream from a MemoryStream containing the video file's data and then initiate playback within the MediaPlayerCore instance.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\_CodeSnippets\read-file-info\readme.es.md

# Media Player SDK .Net - Read file info code snippet (C#/WinForms)

This SDK sample shows how to create a Windows Forms application using the VisioForge Media Player SDK .Net to inspect media files. The application lets users select a file and then offers options to check whether the file is playable, read detailed information about the video, audio, and subtitle streams, and extract metadata tags. It shows how to configure the SDK, open a file dialog, and read and display media information, including codec details, duration, resolution, aspect ratio, frame rate, bitrate, and more, as well as audio- and subtitle-specific handling.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK\_CodeSnippets\read-file-info\readme.md

# Media Player SDK .Net - Read file info code snippet (C#/WinForms)

This SDK sample demonstrates how to build a Windows Forms application using the VisioForge Media Player SDK .Net to inspect media files. The application allows users to select a file, then provides options to check if the file is playable, read detailed information about the video, audio, and subtitle streams, and extract metadata tags.
It showcases how to configure the SDK, open a file dialog, and read and display media information, including codec details, duration, resolution, aspect ratio, frame rate, bitrate, and more, as well as handling audio and subtitle specifics.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\Android\MediaPlayer\readme.es.md

# Media Player SDK .Net - Android Simple Player Demo

This SDK sample demonstrates the integration of VisioForge's MediaPlayerCoreX with `Xamarin.Android` to create a versatile media player application. It shows handling user interactions to control media playback, including picking media files with Xamarin.Essentials' `FilePicker`, updating the playback position with a seek bar, and displaying the current playback time. Additionally, it highlights best practices for managing media player resources, such as properly releasing SDK resources when the activity is destroyed. This sample serves as a complete guide for developers who want to integrate advanced media playback features into their Android applications using Xamarin and VisioForge technologies.
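The file-picking step described above uses Xamarin.Essentials' `FilePicker`, which is standard Xamarin API; the player calls around it in this sketch are hypothetical placeholders.

```csharp
// FilePicker.PickAsync is Xamarin.Essentials API;
// the OpenAsync/PlayAsync calls on the player are hypothetical.
using System;
using System.Threading.Tasks;
using Xamarin.Essentials;

async Task PickAndPlayAsync()
{
    FileResult result = await FilePicker.PickAsync(new PickOptions
    {
        PickerTitle = "Select a media file"
    });

    if (result == null)
        return; // user cancelled the picker

    await player.OpenAsync(new Uri(result.FullPath)); // hypothetical method
    await player.PlayAsync();                         // hypothetical method
}
```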
## Features

- Play media files
- Play network streams
- Seeking

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Player SDK .Net product page](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\Android\MediaPlayer\readme.md

# Media Player SDK .Net - Android Simple Player Demo

This SDK sample demonstrates the integration of VisioForge's MediaPlayerCoreX with `Xamarin.Android` to create a versatile media player application. It showcases handling user interactions to control media playback, including picking media files using Xamarin.Essentials' `FilePicker`, updating playback position with a seek bar, and displaying current playback time. Additionally, it highlights best practices for managing media player resources, such as properly releasing SDK resources upon activity destruction. This example serves as a comprehensive guide for developers looking to integrate advanced media playback features into their Android applications using Xamarin and VisioForge technologies.

## Features

- Play media files
- Play network streams
- Seeking

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Player SDK .Net product page](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\Avalonia\Simple Media Player\readme.es.md

# Media Player SDK .Net - Simple Video Player Avalonia Demo (AvaloniaUI)

This Avalonia-based application showcases the integration of VisioForge's Media Player SDK .Net for video playback and control within a .NET Core application. It demonstrates initializing and disposing of multimedia resources, handling video playback with controls for start, pause, resume, and stop, and adjusting playback settings such as volume and speed.
The application also includes features for selecting video files, displaying video information, and illustrating comprehensive media management capabilities. Through its GUI, users can interact with video playback, making it a practical example of how to leverage VisioForge's SDK within an Avalonia application for multimedia projects.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\Avalonia\Simple Media Player\readme.md

# Media Player SDK .Net - Simple Video Player Avalonia Demo (AvaloniaUI)

This Avalonia-based application showcases the integration of VisioForge's Media Player SDK .Net for video playback and control within a .NET Core application. It demonstrates initializing and disposing of multimedia resources, handling video playback with controls for start, pause, resume, and stop, and adjusting playback settings like volume and speed. The application also includes features for selecting video files, displaying video information, and illustrating comprehensive media management capabilities. Through its GUI, users can interact with the video playback, making it a practical example of leveraging VisioForge's SDK within an Avalonia application for multimedia projects.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\macOS\SimpleMediaPlayer\readme.es.md

# Media Blocks SDK .Net - macOS Simple Player Demo

The SimpleMediaPlayerMBMac SDK sample demonstrates the integration of VisioForge's Media Blocks SDK for creating a media player application on macOS.
It shows how to initialize and manage a media playback pipeline, including video and audio rendering, using `MediaBlocksPipeline`, `VideoRendererBlock`, and `AudioRendererBlock`. The application supports loading and playing various media formats, updating the playback position with a slider, and displaying video within a custom OpenGL view. Essential UI interactions for starting, stopping, and opening media files are also implemented, demonstrating asynchronous task handling and UI updates on macOS.

## Features

- Play media files
- Play network streams
- Seeking

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\macOS\SimpleMediaPlayer\readme.md

# Media Blocks SDK .Net - macOS Simple Player Demo

The SimpleMediaPlayerMBMac SDK sample demonstrates the integration of VisioForge's Media Blocks SDK for creating a media player application on macOS. It showcases how to initialize and manage a media playback pipeline, including video and audio rendering, using `MediaBlocksPipeline`, `VideoRendererBlock`, and `AudioRendererBlock`. The application supports loading and playing various media formats, updating a playback position with a slider, and displaying video within a custom OpenGL view. Essential UI interactions for starting, stopping, and opening media files are also implemented, demonstrating asynchronous task handling and UI updates on macOS.
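The pipeline wiring described above can be sketched like this. The block class names (`MediaBlocksPipeline`, `UniversalSourceBlock`, `VideoRendererBlock`, `AudioRendererBlock`) come from this demo; the constructor arguments, pad names, and connection calls are assumptions about the API shape, not verified signatures.

```csharp
// Hypothetical sketch of the playback pipeline described above.
// Block class names come from this demo; method shapes are assumptions.
var pipeline = new MediaBlocksPipeline();

var source = new UniversalSourceBlock(fileSettings);             // decodes the media file (assumed ctor)
var videoRenderer = new VideoRendererBlock(pipeline, videoView); // draws into the custom view (assumed ctor)
var audioRenderer = new AudioRendererBlock();                    // plays the audio stream

// connect the source pads to the renderers (pad names are assumptions)
pipeline.Connect(source.VideoOutput, videoRenderer.Input);
pipeline.Connect(source.AudioOutput, audioRenderer.Input);

await pipeline.StartAsync();
// ... seek, update the position slider, etc. ...
await pipeline.StopAsync();
```

The source-to-renderer topology is the part this demo actually documents; check the Media Blocks reference for the exact construction and connection calls.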
## Features

- Play media files
- Play network streams
- Seeking

## Used blocks

- [UniversalSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/UniversalSourceBlock/) - decodes media files
- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\MAUI\SimplePlayer\readme.es.md

# Media Blocks SDK .Net - MAUI Simple Player Demo

The provided code outlines the implementation of a simple media player application using VisioForge's Media Player SDK .Net for a MAUI (Multi-platform App UI) project. It demonstrates initializing the media player, handling playback controls such as play, pause, and stop, and adjusting the playback speed. The code also includes loading media from a file picker, displaying media duration, and updating the UI in response to playback events. Additionally, it shows error handling and resource cleanup when the application closes. This sample is aimed at developers who want to integrate media playback features into their cross-platform MAUI applications, leveraging VisioForge's comprehensive media processing capabilities.
## Features

* Play media files
* Play network streams
* Seeking

## Supported .Net versions

* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\MAUI\SimplePlayer\readme.md

# Media Blocks SDK .Net - MAUI Simple Player Demo

The provided code outlines the implementation of a simple media player application using VisioForge's Media Player SDK .Net for a MAUI (Multi-platform App UI) project. It demonstrates initializing the media player, handling playback controls like play, pause, stop, and adjusting playback speed. The code also includes loading media from a file picker, displaying media duration, and updating the UI in response to playback events. Additionally, it showcases handling errors and cleaning up resources upon application closure. This example is tailored for developers looking to integrate multimedia playback functionalities into their cross-platform MAUI applications, leveraging VisioForge's comprehensive media processing capabilities.

## Features

* Play media files
* Play network streams
* Seeking

## Supported frameworks

* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\MAUI\SkinnedPlayer\readme.es.md

# Media Blocks SDK .Net - MAUI Skinned Player Demo

The provided code is for a media player application built using MAUI (Multi-platform App UI) with VisioForge's media framework. This application features a skinned UI, supporting dynamic skin loading for customization. It initializes with default settings for video playback, including handling different source paths for Android and other platforms.
The code demonstrates loading skins from embedded resources, setting up media playback, handling playback errors, and cleaning up resources properly. It showcases the integration of VisioForge's media components for video rendering and playback in a cross-platform application, using `SkiaSharp` for drawing operations and handling platform-specific media sources.

## Features

- Play media files
- Play network streams
- Seeking

## Supported .Net versions

- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\MAUI\SkinnedPlayer\readme.md

# Media Blocks SDK .Net - MAUI Skinned Player Demo

The provided code is for a media player application built using MAUI (Multi-platform App UI) with VisioForge's media framework. This application features a skinned UI, supporting dynamic skin loading for customization. It initializes with default settings for video playback, including handling different source paths for Android and other platforms. The code demonstrates loading skins from embedded resources, setting up media playback, handling playback errors, and ensuring proper resource cleanup. It showcases the integration of VisioForge's media components for video rendering and playback in a cross-platform application, utilizing `SkiaSharp` for drawing operations and handling platform-specific media sources.
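Loading a skin from an embedded resource, as this demo does, boils down to reading a manifest resource stream and decoding it for SkiaSharp drawing. The resource name below is illustrative, not taken from the demo.

```csharp
// Load a skin bitmap from an embedded resource for SkiaSharp drawing.
// The resource name "MyApp.Skins.default.png" is illustrative.
using System.IO;
using System.Reflection;
using SkiaSharp;

static SKBitmap LoadSkin(string resourceName)
{
    Assembly asm = Assembly.GetExecutingAssembly();
    using Stream stream = asm.GetManifestResourceStream(resourceName)
        ?? throw new FileNotFoundException(resourceName);
    return SKBitmap.Decode(stream); // SkiaSharp decodes PNG/JPEG streams
}

// var skin = LoadSkin("MyApp.Skins.default.png");
// canvas.DrawBitmap(skin, 0, 0);
```

Embedded resources keep the skin assets inside the assembly, which is why the same loading code works unchanged on Android and the other MAUI targets.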
## Features

- Play media files
- Play network streams
- Seeking

## Supported frameworks

- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\WinForms\Karaoke Demo\readme.es.md

# Media Player SDK .Net - Karaoke demo (C#/WinForms)

This sample, built using the VisioForge Media Player SDK .Net framework, provides a complete solution for creating karaoke applications in .NET. It showcases how to integrate audio and video playback features, handle file selection, and manage playback controls such as play, pause, stop, and volume adjustment in a Windows Forms application. It also demonstrates event handling for errors and playback stopping, along with using VisioForge's `MediaPlayerCoreX` class for audio output device selection and timeline management for karaoke synchronization.

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\WinForms\Karaoke Demo\readme.md

# Media Player SDK .Net - Karaoke demo (C#/WinForms)

The sample, built using the VisioForge Media Player SDK .Net framework, provides a comprehensive solution for creating karaoke applications in .NET. This SDK sample showcases how to integrate audio and video playback functionalities, handle file selection, and manage playback controls like play, pause, stop, and volume adjustment in a Windows Forms application. It also demonstrates event handling for errors and stopping playback, alongside utilizing VisioForge's `MediaPlayerCoreX` class for audio output device selection and timeline management for karaoke synchronization.
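Audio output device selection with `MediaPlayerCoreX` might look roughly like the sketch below. The class name comes from this demo; the enumeration call and property names are assumptions, not verified API.

```csharp
// Hypothetical: enumerate audio output devices and let the user pick one.
// The enumeration call and property names are assumptions, not verified API.
var player = new MediaPlayerCoreX(videoView); // class name taken from this demo

var devices = await player.Audio_OutputDevicesAsync(); // assumed helper
foreach (var device in devices)
{
    cbAudioOutput.Items.Add(device.Name); // fill a combo box for the user
}

player.Audio_OutputDevice = devices[0]; // assumed property
```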
## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\WinForms\Main Demo\readme.es.md

# Media Player SDK .Net - Main Demo (C#/WinForms)

The provided code demonstrates a comprehensive sample of a multimedia application built with the VisioForge SDK, showcasing a variety of features including audio and video effects, motion detection, and barcode reading, among others. The sample configures audio and video effects such as amplification, echo, equalizer, resize, deinterlace, color balance, and overlays. It also demonstrates handling media sources, implementing motion detection and barcode detection, and providing UI elements for user interaction. The application is designed to deliver a rich multimedia experience, allowing users to play, pause, stop, and adjust media content dynamically.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\WinForms\Main Demo\readme.md

# Media Player SDK .Net - Main Demo (C#/WinForms)

The provided code demonstrates a comprehensive sample of a multimedia application built using the VisioForge SDK, showcasing a variety of features, including audio and video effects, motion detection, barcode reading, and more. The sample configures audio and video effects such as amplification, echo, equalizer, resize, deinterlace, color balance, and overlays. It also demonstrates handling media sources, implementing motion detection, barcode detection, and providing UI elements for user interaction.
The application is designed to provide a rich multimedia experience, allowing users to play, pause, stop, and adjust media content dynamically.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\WinForms\Video Mixer Player\readme.es.md

# Media Player SDK .Net - Video Mixer Player Demo (WinForms, cross-platform SDK engine)

This SDK sample demonstrates the creation of a video mixing and playback application using the VisioForge Live Video Compositor engine within a Windows Forms application. The code initializes the SDK, manages video and audio sources, and controls playback through a timeline. It allows adding video files as sources dynamically through a file dialog, adjusting their positions on the timeline, and displaying the mixed output in a video renderer. Users can monitor and manipulate the playback position and duration of the sources in real time. The application also handles SDK initialization and disposal to ensure proper resource management.

## Features

- Play media files
- Mixing video and audio streams
- Seeking

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Player SDK .Net product page](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\WinForms\Video Mixer Player\readme.md

# Media Player SDK .Net - Video Mixer Player Demo (WinForms, cross-platform SDK engine)

This SDK sample demonstrates the creation of a video mixing and playback application using the VisioForge Live Video Compositor engine within a Windows Forms application.
The code initializes the SDK, manages video and audio sources, and controls playback through a timeline. It allows adding video files as sources dynamically through a file dialog, adjusting their positions on the timeline, and displaying the mixed output in a video renderer. Users can monitor and manipulate the playback position and duration of the sources in real time. The application also handles SDK initialization and disposal to ensure proper resource management.

## Features

- Play media files
- Mixing video and audio streams
- Seeking

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Player SDK .Net product page](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\WinUI\Simple Media Player WinUI\readme.es.md

# Media Player SDK .Net - Simple Video Player Demo (C#/WinUI)

The Simple Media Player for WinUI 3 Desktop is a lightweight application built using the VisioForge Media Player SDK .Net cross-platform libraries, designed to demonstrate basic media playback functionalities within a Windows UI environment. It features a user-friendly interface that allows users to open, play, pause, resume, and stop video files. The application also provides a seek bar to navigate through the video timeline and a volume control to adjust the audio level.
## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\WinUI\Simple Media Player WinUI\readme.md

# Media Player SDK .Net - Simple Video Player Demo (C#/WinUI)

The Simple Media Player for WinUI 3 Desktop is a lightweight application built using the VisioForge Media Player SDK .Net cross-platform libraries, designed to demonstrate basic media playback functionalities within a Windows UI environment. It features a user-friendly interface that allows users to open, play, pause, resume, and stop video files. The application also provides a seek bar to navigate through the video timeline and a volume control to adjust the audio level.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\WPF\Decklink Output Player Demo\readme.es.md

# Media Player SDK .Net - Decklink Output Player Demo (cross-platform WPF)

This SDK sample demonstrates the integration and utilization of VisioForge Media Player SDK .Net in a WPF application, specifically focusing on playback functionalities with advanced audio and video output configurations, including Decklink support. The code showcases the initialization of the media player, setup of audio and video output devices (with a special emphasis on Decklink video and audio sinks), and controls for media playback such as start, stop, pause, and resume. It also includes error handling and debugging support, demonstrating a comprehensive approach to building a media playback interface in .NET applications.
## Features

- Play media files
- Stream to the Decklink device
- Seeking

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Player SDK .Net product page](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\WPF\Decklink Output Player Demo\readme.md

# Media Player SDK .Net - Decklink Output Player Demo (cross-platform WPF)

This SDK sample demonstrates the integration and utilization of VisioForge Media Player SDK .Net in a WPF application, specifically focusing on playback functionalities with advanced audio and video output configurations, including Decklink support. The code showcases the initialization of the media player, setup of audio and video output devices (with a special emphasis on Decklink video and audio sinks), and controls for media playback such as start, stop, pause, and resume. It also includes error handling and debugging support, demonstrating a comprehensive approach to building a media playback interface in .NET applications.

## Features

- Play media files
- Stream to the Decklink device
- Seeking

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Player SDK .Net product page](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\WPF\Simple Player Demo\readme.es.md

# Media Player SDK .Net - Simple Player Demo (cross-platform engine, WPF)

This sample demonstrates a simple media player application using the VisioForge Media Player SDK .NET. It uses the `MediaPlayerCoreX` class to play audio and video files, including basic controls for play, pause, stop, and volume adjustment.
The application also features a timeline slider for seeking, dynamic audio output device selection, and subtitle support. Error handling and logging are incorporated to ensure a smooth user experience. This example provides a practical starting point for developers looking to integrate media playback functionalities into their WPF applications using VisioForge's comprehensive media processing SDK.

## Features

- Play media files
- Play network streams
- Seeking

## Supported .Net versions

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Player SDK .Net product page](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Media Player SDK X\WPF\Simple Player Demo\readme.md

# Media Player SDK .Net - Simple Player Demo (cross-platform engine, WPF)

This sample demonstrates a simple media player application using the VisioForge Media Player SDK .NET. It uses the `MediaPlayerCoreX` class to play audio and video files, including basic controls for play, pause, stop, and volume adjustment. The application also features a timeline slider for seeking, dynamic audio output device selection, and subtitle support. Error handling and logging are incorporated to ensure a smooth user experience. This example provides a practical starting point for developers looking to integrate media playback functionalities into their WPF applications using VisioForge's comprehensive media processing SDK.
## Features

- Play media files
- Play network streams
- Seeking

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Player SDK .Net product page](https://www.visioforge.com/media-player-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\Console\TV Tuner Demo\readme.es.md

# Video Capture SDK .Net - TV Tuner Demo CLI (C#/WPF)

This SDK sample demonstrates how to implement a TV Tuner demo application using the VisioForge Video Capture SDK .Net. The program showcases how to enumerate video and audio capture devices, select a TV tuner, and configure it for different modes such as video preview, capture to AVI, or capture to MP4. It includes handling channel tuning and allows for user interaction to select devices and capture modes. The code also illustrates setting up video and audio capture properties, tuning TV channels, and starting or stopping the capture process based on user input.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\Console\TV Tuner Demo\readme.md

# Video Capture SDK .Net - TV Tuner Demo CLI (C#/WPF)

This SDK sample demonstrates how to implement a TV Tuner demo application using the VisioForge Video Capture SDK .Net. The program showcases how to enumerate video and audio capture devices, select a TV tuner, and configure it for different modes such as video preview, capture to AVI, or capture to MP4. It includes handling channel tuning and allows for user interaction to select devices and capture modes.
The code also illustrates setting up video and audio capture properties, tuning TV channels, and starting or stopping the capture process based on user input.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\Console\Video Capture Demo\readme.es.md

# Video Capture SDK .Net - Video Capture Demo CLI (C#/Console)

This sample code demonstrates the use of the VisioForge Video Capture SDK .Net to create a console application that captures video and audio from selected devices. Users can choose from the available video capture devices, select video and audio formats, and specify the frame rate. The program offers the option to capture the media in either AVI or MP4 format, based on the user's selection. Additionally, it handles errors gracefully, displaying them in the console without interrupting the capture process. This example is a straightforward demonstration of integrating video and audio capture functionalities into applications using VisioForge's SDK.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\Console\Video Capture Demo\readme.md

# Video Capture SDK .Net - Video Capture Demo CLI (C#/Console)

This sample code demonstrates the use of the VisioForge Video Capture SDK .Net to create a console application that captures video and audio from selected devices. Users can choose from available video capture devices, select video and audio formats, and specify the frame rate. The program offers the option to capture the media in either AVI or MP4 format, based on the user's selection.
Additionally, it handles errors gracefully, displaying them in the console without interrupting the capture process. This example is a straightforward demonstration of integrating video and audio capture functionalities into applications using VisioForge's SDK.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\Service\IP Capture\readme.es.md

# Video Capture SDK .Net - IP capture service Demo

This SDK sample for VisioForge's video capture service demonstrates how to create a Windows service for capturing video from IP cameras. The `Service1` class initializes the video capture process. It uses the `VideoCaptureCore` class to set the capture mode, source, and output format, specifically targeting MP4 output. Additionally, it includes methods for starting and stopping the capture process, along with error handling that logs messages to the Windows Event Log. This sample provides a practical example of integrating VisioForge's video capture capabilities into a service-based application.

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\Service\IP Capture\readme.md

# Video Capture SDK .Net - IP capture service Demo

This SDK sample for VisioForge's video capture service demonstrates how to create a Windows service for capturing video from IP cameras. The `Service1` class initializes the video capture process. It uses the `VideoCaptureCore` class to set the capture mode, source, and output format, specifically targeting MP4 output. Additionally, it includes methods for starting and stopping the capture process, along with error handling that logs messages to the Windows Event Log.
This sample provides a practical example of integrating VisioForge's video capture capabilities into a service-based application.

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\Service\Screen Capture\Helper\readme.es.md

# Video Capture SDK .Net - Screen capture service demo

The demo shows how to create a Windows service for capturing videos from the screen.

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\Service\Screen Capture\Helper\readme.md

# Video Capture SDK .Net - Screen capture service demo

The demo shows how to create a Windows service for capturing videos from the screen.

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\Service\Screen Capture\Service\readme.es.md

# Video Capture SDK .Net - Screen capture service demo

The sample service is a Windows service for screen capturing built using the VisioForge Video Capture SDK .Net. This service provides functionality to start and stop video capture sessions programmatically. It creates child processes in a separate thread for efficient operation, avoiding blocking the main thread. Additionally, the service includes capabilities to log events to the Windows Event Log for monitoring and debugging purposes, ensuring a robust and reliable screen capture solution within .NET applications.
---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\Service\Screen Capture\Service\readme.md

# Video Capture SDK .Net - Screen capture service demo

The service sample is a Windows service for screen capturing designed using the VisioForge Video Capture SDK .Net. This service provides functionality to start and stop video capture sessions programmatically. It creates child processes in a separate thread for efficient operation, avoiding blocking the main thread. Additionally, the service includes capabilities to log events to the Windows Event Log for monitoring and debugging purposes, ensuring a robust and reliable screen capture solution within .NET applications.

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Audio Capture\readme.es.md

# VisioForge Video Capture SDK .Net

## Audio Capture Demo (C#/WinForms)

The VisioForge Video Capture SDK .Net provides a comprehensive solution for integrating audio capture capabilities into .NET applications. It allows for the selection and configuration of audio input and output devices, supports multiple audio formats (including MP3, WAV, WMA, FLAC, and more), and offers a variety of audio effects such as amplification, equalization, true bass, and 3D sound. The SDK features an event-driven model for handling errors and capture stop events, and it includes dialogs for configuring specific audio format settings. This SDK is ideal for developers looking to add advanced audio recording and processing functionality to their Windows Forms applications.
## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Audio Capture\readme.md

# VisioForge Video Capture SDK .Net

## Audio Capture Demo (C#/WinForms)

The VisioForge Video Capture SDK .Net provides a comprehensive solution for integrating audio capture capabilities into .NET applications. It allows for the selection and configuration of audio input and output devices, supports multiple audio formats (including MP3, WAV, WMA, FLAC, and more), and offers a variety of audio effects such as amplification, equalization, true bass, and 3D sound. The SDK features an event-driven model for handling errors and capture stop events, and it includes dialogs for configuring specific audio format settings. This SDK is ideal for developers looking to add advanced audio recording and processing functionality to their Windows Forms applications.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Camera Light Demo\readme.es.md

# VisioForge Video Capture SDK .Net

## Camera Light Demo

This sample demonstrates how to create a simple Windows Forms application using the VisioForge Video Capture SDK .Net to manage camera torch (flashlight) functionality. The application initializes the video capture device and lists available camera devices capable of torch control. Users can turn the camera's torch on or off with the click of a button.
---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Camera Light Demo\readme.md

# VisioForge Video Capture SDK .Net

## Camera Light Demo

This sample demonstrates how to create a simple Windows Forms application using the VisioForge Video Capture SDK .Net to manage camera torch (flashlight) functionality. The application initializes the video capture device and lists available camera devices capable of torch control. Users can turn the camera's torch on or off with the click of a button.

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Computer Vision\readme.es.md

# VisioForge Video Capture SDK .Net

## Computer Vision Demo

This SDK sample demonstrates a comprehensive computer vision application developed with VisioForge Video Capture SDK .Net technology. It showcases the integration of video capture and media playback functionalities alongside advanced computer vision features such as face, pedestrian, and car detection. The application allows for real-time analysis and object detection within video streams. Users can select video sources, apply object detection filters, and view the processed video within a Windows Forms interface. The code sample includes the detailed implementation for initializing, configuring, and managing the lifecycle of video capture devices, media players, and various detection algorithms.
---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Computer Vision\readme.md

# VisioForge Video Capture SDK .Net

## Computer Vision Demo

This SDK sample demonstrates a comprehensive Computer Vision application developed with VisioForge Video Capture SDK .Net technology. It showcases the integration of video capture and media playback functionalities alongside advanced computer vision features such as face, pedestrian, and car detection. The application allows for real-time analysis and object detection within video streams. Users can select video sources, apply object detection filters, and view the processed video output within a Windows Forms interface. The code sample includes the detailed implementation for initializing, configuring, and managing the lifecycle of video capture devices, media players, and various detection algorithms.

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Decklink Demo\readme.es.md

# VisioForge Video Capture SDK .Net

## Decklink Demo (C#/WinForms)

The provided code is a comprehensive sample for a video capture application using the VisioForge Video Capture SDK .Net. It demonstrates initializing and configuring a video capture session with support for Decklink cards, setting up various video and audio effects, and capturing to different formats like MP4, AVI, WMV, and more. The code includes functionality for adjusting video properties (e.g., brightness, saturation, contrast), adding logos, and taking screenshots during the capture session.
Additionally, it showcases how to handle audio amplification and the selection of audio output devices, providing a full suite of features necessary for creating advanced video capture applications.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Decklink Demo\readme.md

# VisioForge Video Capture SDK .Net

## Decklink Demo (C#/WinForms)

The provided code is a comprehensive sample for a video capture application using the VisioForge Video Capture SDK .Net. It demonstrates initializing and configuring a video capture session with support for Decklink cards, setting up various video and audio effects, and capturing to different formats like MP4, AVI, WMV, and more. The code includes functionality for adjusting video properties (e.g., brightness, saturation, contrast), adding logos, and taking screenshots during the capture session. Additionally, it showcases how to handle audio amplification and selection of audio output devices, providing a full suite of features necessary for creating advanced video capturing applications.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\DV Capture\readme.es.md

# VisioForge Video Capture SDK .Net

## DV Capture Demo (C#/WinForms)

The sample showcases the initialization and management of video capture operations, including device selection, format configuration, effects application, and output settings adjustment.
The code encapsulates various features such as real-time video preview and recording, screenshot capturing, and direct control over DV camcorder playback. Additionally, it provides interfaces for configuring video effects, adjusting audio settings, and handling errors gracefully, offering a robust foundation for building sophisticated video capture applications.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\DV Capture\readme.md

# VisioForge Video Capture SDK .Net

## DV Capture Demo (C#/WinForms)

The sample showcases the initialization and management of video capture operations, including device selection, format configuration, effects application, and output settings adjustment. The code encapsulates various features such as real-time video preview and recording, screenshot capturing, and direct control over DV camcorder playback. Additionally, it provides interfaces for configuring video effects, adjusting audio settings, and handling errors gracefully, offering a robust foundation for building sophisticated video capture applications.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\IP Capture\readme.es.md

# VisioForge Video Capture SDK .Net

## IP Capture Demo (C#/WinForms)

The sample demonstrates a comprehensive example of integrating IP camera functionalities into a .NET application using the VisioForge Video Capture SDK .Net.
This demo showcases how to capture video from IP cameras, including support for ONVIF cameras, various output formats (MP4, AVI, WMV, GIF, etc.), hardware acceleration, and video effects. Users can interact with the camera's PTZ controls (if available), configure video and audio settings through dialogs, and manage network source disconnect events. Additionally, it features recording to a file with selected codecs and formats, taking screenshots, and adjusting video properties like brightness, contrast, and saturation on the fly.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\IP Capture\readme.md

# VisioForge Video Capture SDK .Net

## IP Capture Demo (C#/WinForms)

The sample demonstrates a comprehensive example of integrating IP camera functionalities into a .NET application using the VisioForge Video Capture SDK .Net. This demo showcases how to capture video from IP cameras, including support for ONVIF cameras, various output formats (MP4, AVI, WMV, GIF, etc.), hardware acceleration, and video effects. Users can interact with the camera's PTZ controls (if available), configure video and audio settings through dialogues, and manage network source disconnect events. Additionally, it features recording to a file with selected codecs and formats, taking screenshots, and adjusting video properties like brightness, contrast, and saturation on the fly.
## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\madVR demo\readme.es.md

# VisioForge Video Capture SDK .Net

## madVR Demo (C#/WinForms)

The provided code is a comprehensive SDK sample for a video capture application using the VisioForge Video Capture SDK .Net, tailored for integration with madVR for enhanced video rendering. It demonstrates the initialization and disposal of the video capture engine, device selection (both audio and video), format configuration, and real-time video capture with audio management. This sample is designed for developers looking to incorporate advanced video capture and rendering features into their .NET applications, showcasing a practical implementation of the SDK's capabilities in managing devices, formats, and capture parameters.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\madVR demo\readme.md

# VisioForge Video Capture SDK .Net

## madVR Demo (C#/WinForms)

The provided code is a comprehensive SDK sample for a video capture application using the VisioForge Video Capture SDK .Net, tailored for integration with madVR for enhanced video rendering. It demonstrates the initialization and disposal of the video capture engine, device selection (both audio and video), format configuration, and real-time video capture with audio management.
This sample is designed for developers looking to incorporate advanced video capture and rendering features into their .NET applications, showcasing a practical implementation of the SDK's capabilities in managing devices, formats, and capturing parameters.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Main Demo\readme.es.md

# VisioForge Video Capture SDK .Net

## Main Demo (C#/WinForms)

The demo shows all the primary functionality of Video Capture SDK .Net.

You can:

* preview or capture video from webcams, IP cameras, screens, Decklink devices, and some other sources
* apply video and audio effects
* perform network streaming
* save video and audio to MP4, WMV, WebM, AVI, AAC, MP3, and many other output formats
* apply OSD
* use Picture-in-Picture
* detect motion
* recognize barcodes
* many other features are available

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Main Demo\readme.md

# VisioForge Video Capture SDK .Net

## Main Demo (C#/WinForms)

The demo shows all the primary functionality of Video Capture SDK .Net.
You can:

* preview or capture video from webcams, IP cameras, screens, Decklink devices, and some other sources
* apply video and audio effects
* perform network streaming
* save video and audio to MP4, WMV, WebM, AVI, AAC, MP3, and many other output formats
* apply OSD
* use Picture-in-Picture
* detect motion
* recognize barcodes
* many other features are available

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Multiple IP cams\readme.es.md

# VisioForge Video Capture SDK .Net

## Multiple IP Cameras Demo (C#/WinForms)

The provided code sample is for a Windows Forms application that uses the VisioForge Video Capture SDK .Net to manage multiple IP camera streams simultaneously. The application showcases how to create, configure, and control video capture instances for two IP cameras, including starting and stopping the video streams, handling errors, and updating UI components with stream information such as recording time. The code leverages asynchronous programming patterns for initializing and controlling the video capture engines, demonstrating error handling and debug logging capabilities. Additionally, it includes functionality to dispose of resources properly upon closing the application, ensuring clean shutdowns and resource management.
## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Multiple IP cams\readme.md

# VisioForge Video Capture SDK .Net

## Multiple IP Cameras Demo (C#/WinForms)

The provided code sample is for a Windows Forms application using the VisioForge Video Capture SDK .Net to manage multiple IP camera streams simultaneously. The application showcases how to create, configure, and control video capture instances for two IP cameras, including starting and stopping the video streams, handling errors, and updating UI components with stream information such as recording time. The code leverages asynchronous programming patterns for initializing and controlling the video capture engines, demonstrating error handling and debug logging capabilities. Additionally, it includes functionality to dispose of resources properly upon closing the application, ensuring clean shutdowns and resource management.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Multiple video streams\readme.es.md

# VisioForge Video Capture SDK .Net

## Multiple Video Streams Demo (C#/WinForms)

This SDK sample demonstrates how to implement a multiple video stream capture application using the VisioForge Video Capture SDK .Net. It showcases the setup and use of multiple video capture devices, the configuration of video formats and frame rates, and the handling of picture-in-picture (PIP) sources.
Additionally, the code includes error handling and logging, demonstrates how to start and stop video capture, and dynamically updates UI elements based on the capture status. The application leverages asynchronous programming to create and manage the video capture engine, ensuring a responsive user interface.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Multiple video streams\readme.md

# VisioForge Video Capture SDK .Net

## Multiple Video Streams Demo (C#/WinForms)

This SDK sample demonstrates how to implement a multiple video stream capture application using the VisioForge Video Capture SDK .Net. It showcases the setup and use of multiple video capture devices, the configuration of video formats and frame rates, and the handling of picture-in-picture (PIP) sources. Additionally, the code includes error handling and logging, demonstrates how to start and stop video capture, and dynamically updates UI elements based on the capture status. The application leverages asynchronous programming to create and manage the video capture engine, ensuring a responsive user interface.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Multiple web cams\readme.es.md

# VisioForge Video Capture SDK .Net

## Multiple Web Cameras Demo (C#/WinForms)

This SDK sample shows how to create a Windows Forms application for managing multiple video capture devices using the VisioForge Video Capture SDK .Net.
It includes functionality for initializing and disposing of video capture engines, configuring video devices, starting and stopping a video preview, and handling errors. The application supports selecting different video devices and configurations, such as video format and frame rate, for each video capture instance.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Multiple web cams\readme.md

# VisioForge Video Capture SDK .Net

## Multiple Web Cameras Demo (C#/WinForms)

This SDK sample demonstrates how to create a Windows Forms application for managing multiple video capture devices using the VisioForge Video Capture SDK .Net. It includes functionality for initializing and disposing of video capture engines, configuring video devices, starting and stopping a video preview, and handling errors. The application supports selecting different video devices and configurations, such as video format and frame rate, for each video capture instance.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\NDI Source\readme.es.md

# VisioForge Video Capture SDK .Net

## NDI Source Demo (C#/WinForms)

The provided code snippet is an example of integrating VisioForge's Video Capture SDK .Net into a Windows Forms application for capturing, recording, and streaming video. The application initializes the video capture engine asynchronously, supports error handling, and allows users to select NDI sources for video input.
It features a GUI for configuring capture settings, such as output format and file location, and includes real-time video preview and recording functionality. Additionally, the application demonstrates how to update UI elements with the recording time and handle asynchronous start and stop of video capture, showcasing the SDK's ability to handle video input and output operations efficiently in a .NET environment.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\NDI Source\readme.md

# VisioForge Video Capture SDK .Net

## NDI Source Demo (C#/WinForms)

The provided code snippet is an example of integrating VisioForge's Video Capture SDK .Net into a Windows Forms application for capturing, recording, and streaming video. The application initializes the video capture engine asynchronously, supports error handling, and allows users to select NDI sources for video input. It features a GUI for configuring capture settings, such as output format and file location, and includes real-time video preview and recording functionalities. Additionally, the application demonstrates how to update UI elements with recording time and handle asynchronous start and stop of video capture, showcasing the SDK's capabilities in handling video input and output operations efficiently within a .NET environment.
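The asynchronous initialization and start/stop pattern described above can be sketched roughly as follows. This is a minimal illustration, not the demo's actual source: the type and member names (`VideoCaptureCore`, `OnError`, `StartAsync`, and so on) follow the SDK's commonly documented patterns, but treat the exact signatures as assumptions and consult the shipped demo project.

```csharp
// Sketch: async engine creation, error logging, and preview lifecycle.
// All identifiers below are assumptions modeled on the SDK's usual API shape.
private VideoCaptureCore _capture;

private async Task InitEngineAsync()
{
    // The engine is created asynchronously and bound to a video view control.
    _capture = await VideoCaptureCore.CreateAsync(videoView1 as IVideoView);
    _capture.Debug_Mode = true;                    // enable debug logging
    _capture.OnError += (s, e) => Log(e.Message);  // surface SDK errors in the UI
}

private async Task StartPreviewAsync()
{
    await _capture.StartAsync();   // non-blocking start keeps the UI responsive
}

private async Task StopAndDisposeAsync()
{
    await _capture.StopAsync();    // stop before disposing on form close
    _capture.Dispose();
}
```

The same create/start/stop/dispose sequence recurs across the demos in this set; only the source and output configuration between `CreateAsync` and `StartAsync` changes.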
## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\NDI Streamer\readme.es.md

# VisioForge Video Capture SDK .Net

## NDI Streamer Demo (C#/WinForms)

This sample shows how to create a Windows Forms application for NDI video and audio streaming using the VisioForge .NET SDK. It demonstrates the initialization and configuration of video and audio capture devices, including selecting devices and formats and configuring device-specific settings through a user-friendly interface. Additionally, the application implements NDI (Network Device Interface) streaming, allowing high-quality video and audio broadcasting over a network. Users can start, pause, resume, and stop the stream at the press of a button, as well as monitor the recording time. The code includes error handling and logging, ensuring a smooth user experience.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\NDI Streamer\readme.md

# VisioForge Video Capture SDK .Net

## NDI Streamer Demo (C#/WinForms)

This sample demonstrates how to create a Windows Forms application for NDI streaming of video and audio using the VisioForge .NET SDK. It showcases the initialization and configuration of video and audio capture devices, including selecting devices and formats and configuring device-specific settings through a user-friendly interface. Additionally, the application implements NDI (Network Device Interface) streaming, allowing for high-quality video and audio broadcasting over a network.
Users can start, pause, resume, and stop the stream with simple button clicks while also monitoring the recording time. The code includes error handling and logging, ensuring a smooth user experience.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Push Source Demo\readme.es.md

# VisioForge Video Capture SDK .Net

## Push Source Demo (C#/WinForms)

This SDK sample demonstrates a comprehensive video capture application using the VisioForge Video Capture SDK .Net. The application, built in C# within a Windows Forms environment, showcases functionality such as initializing and disposing of the video capture engine, configuring output settings for different formats (MP4, AVI, WMV, GIF, etc.), and pushing video frames in real time. It provides a user interface for selecting output formats, configuring encoder settings through dialog boxes, and displaying the recording time. Additionally, it includes error handling and logging features to assist in debugging. This sample is ideal for developers who want to integrate advanced video capture and processing features into their .NET applications.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Push Source Demo\readme.md

# VisioForge Video Capture SDK .Net

## Push Source Demo (C#/WinForms)

This SDK sample demonstrates a comprehensive video capture application using the VisioForge Video Capture SDK .Net.
The application, built in C# within a Windows Forms environment, showcases functionality such as initializing and disposing of the video capture engine, configuring output settings for different formats (MP4, AVI, WMV, GIF, etc.), and pushing video frames in real time. It provides a user interface for selecting output formats, configuring encoder settings through dialog boxes, and displaying the recording time. Additionally, it includes error handling and logging capabilities to assist in debugging. This sample is ideal for developers looking to integrate advanced video capture and processing features into their .NET applications.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Screen Capture\readme.es.md

# VisioForge Video Capture SDK .Net

## Screen Capture Demo (C#/WinForms)

The sample app is a comprehensive example demonstrating the capabilities of the VisioForge Video Capture SDK .Net for screen recording, audio capture, and streaming. It showcases the integration of various video and audio settings dialogs, such as MP4, AVI, WMV, and GIF settings, and hardware encoder configurations. The demo includes functionality for capturing on-screen activity, including specific windows or the entire screen, with options for including mouse cursor highlights and selecting audio input devices and formats. Additionally, it features a network streaming setup with RTMP and FFMPEG, demonstrating the SDK's versatility in handling multimedia content creation, manipulation, and distribution tasks.
## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Screen Capture\readme.md

# VisioForge Video Capture SDK .Net

## Screen Capture Demo (C#/WinForms)

The sample app is a comprehensive example demonstrating the capabilities of the VisioForge Video Capture SDK .Net for screen recording, audio capture, and streaming. It showcases the integration of various video and audio settings dialogs, such as MP4, AVI, WMV, and GIF settings, and hardware encoder configurations. The demo includes functionality for capturing screen activity, including specific windows or the entire screen, with options for including mouse cursor highlights and selecting audio input devices and formats. Additionally, it features a network streaming setup with RTMP and FFMPEG, demonstrating the SDK's versatility in handling multimedia content creation, manipulation, and distribution tasks.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Separate Capture Demo\readme.es.md

# VisioForge Video Capture SDK .Net

## Separate Capture Demo (C#/WinForms)

This SDK sample demonstrates the implementation of a comprehensive video capture application using the VisioForge Video Capture SDK .Net. The application, encapsulated within a Windows Forms interface, allows users to configure and control video recording sessions.
Features include selecting video and audio input devices, configuring output formats (such as AVI, WMV, MP4, etc.), handling hardware encoder settings, and supporting dynamic capture adjustments such as start, stop, pause, and resume. Additionally, the application offers advanced settings dialogs for precise control over video and audio parameters, along with real-time updates of the recording duration.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Separate Capture Demo\readme.md

# VisioForge Video Capture SDK .Net

## Separate Capture Demo (C#/WinForms)

This SDK sample demonstrates the implementation of a comprehensive video capture application using the VisioForge Video Capture SDK .Net. The application, encapsulated within a Windows Forms interface, allows users to configure and control video recording sessions. Features include selecting video and audio input devices, configuring output formats (such as AVI, WMV, MP4, etc.), handling hardware encoder settings, and supporting dynamic capture adjustments like start, stop, pause, and resume. Additionally, the application offers advanced settings dialogs for precise control over video and audio parameters, alongside real-time updates on recording duration.
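The start/stop/pause/resume lifecycle mentioned above typically maps onto a handful of async calls. The sketch below is illustrative only: the member names (`Mode`, `Output_Format`, `Output_Filename`, `PauseAsync`, `ResumeAsync`) mirror the SDK's commonly documented patterns, but the exact API should be taken from the demo source rather than from this snippet.

```csharp
// Sketch of the recording lifecycle the demo wires to its toolbar buttons.
// Identifiers are assumptions modeled on the SDK's usual API shape.
private async Task RecordAsync(VideoCaptureCore capture)
{
    capture.Mode = VideoCaptureMode.VideoCapture;   // record (not just preview)
    capture.Output_Format = new MP4Output();        // hypothetical default MP4 settings
    capture.Output_Filename = "output.mp4";

    await capture.StartAsync();    // begin recording
    await capture.PauseAsync();    // pause without closing the output file
    await capture.ResumeAsync();   // continue appending to the same file
    await capture.StopAsync();     // finalize and close the output file
}
```

Pause and resume are distinct from stop/start: stopping finalizes the container, while pausing merely suspends frame delivery, which is why the demo exposes them as separate controls.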
## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Simple VideoCapture\readme.es.md

# VisioForge Video Capture SDK .Net

## Simple Video Capture Demo (C#/WinForms)

The sample is a comprehensive example illustrating how to use the VisioForge Video Capture SDK .Net to capture video and audio from various sources, apply audio and video effects, and save the output in different formats. This C# application demonstrates the integration of multiple features, including device selection and configuration, real-time video preview, application of audio and video effects, and customization of encoding settings for output formats such as MP4, AVI, WMV, GIF, MOV, and others. It showcases advanced features such as hardware acceleration support and audio effects such as amplification, equalization, and true bass, along with video effects and text/image overlays, providing a solid foundation for building robust video capture and processing applications.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Simple VideoCapture\readme.md

# VisioForge Video Capture SDK .Net

## Simple Video Capture Demo (C#/WinForms)

The sample is a comprehensive example illustrating how to utilize the VisioForge Video Capture SDK .Net for capturing video and audio from various sources, applying audio and video effects, and saving the output in different formats.
This C# application demonstrates the integration of multiple functionalities, including device selection and configuration, real-time video preview, audio and video effects application, and encoding settings customization for output formats like MP4, AVI, WMV, GIF, MOV, and others. It showcases advanced features like hardware acceleration support and audio effects like amplification, equalization, and true bass, along with video effects and text/image overlays, providing a solid foundation for building robust video capture and processing applications.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Timeshift Demo\readme.es.md

# VisioForge Video Capture SDK .Net

## Timeshift Demo (C#/WinForms)

The sample application demonstrates the integration and use of the VisioForge Video Capture and Media Player SDKs for real-time video capture, playback, and timeshifting. It includes asynchronous initialization of the video capture and media player engines, event handling for errors and file creation during time-shifting, and UI elements for selecting video/audio input devices and configuring output settings. The application showcases features such as selecting capture devices, configuring video/audio formats, and capturing to a timeshift buffer for delayed playback, highlighting the SDK's capabilities in handling complex multimedia tasks.
## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Timeshift Demo\readme.md

# VisioForge Video Capture SDK .Net

## Timeshift Demo (C#/WinForms)

The sample application demonstrates the integration and usage of the VisioForge Video Capture and Media Player SDKs for real-time video capture, playback, and timeshifting. It includes asynchronous initialization of video capture and media player engines, event handling for errors and file creation during time-shifting, and UI elements for selecting video/audio input devices and configuring output settings. The application showcases features like selecting capture devices, configuring video/audio formats, and capturing to a timeshift buffer for delayed playback, highlighting the SDK's capabilities in handling complex multimedia tasks.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\UDP Streamer\readme.es.md

# VisioForge Video Capture SDK .Net

## UDP Streamer Demo (C#/WinForms)

This SDK sample demonstrates the integration of VisioForge's Video Capture SDK .Net in a Windows Forms application to create a UDP streamer. The code showcases initializing the video capture environment, selecting audio and video input devices, configuring input formats, and managing network streaming settings for UDP broadcasting using FFMPEG. It includes features for starting, stopping, pausing, and resuming video capture, as well as adjusting device-specific settings and displaying the recording time.
This example is a comprehensive guide for developers looking to implement real-time video capture and streaming solutions in their .NET applications.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\UDP Streamer\readme.md

# VisioForge Video Capture SDK .Net

## UDP Streamer Demo (C#/WinForms)

This SDK sample demonstrates the integration of VisioForge's Video Capture SDK .Net in a Windows Forms application to create a UDP streamer. The code showcases initializing the video capture environment, selecting audio and video input devices, configuring input formats, and managing network streaming settings for UDP broadcasting using FFMPEG. It includes features for starting, stopping, pausing, and resuming video capture, as well as adjusting device-specific settings and displaying the recording time. This example is a comprehensive guide for developers looking to implement real-time video capture and streaming solutions in their .NET applications.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Video From Images Demo\readme.es.md

# VisioForge Video Capture SDK .Net

## Video From Images Demo (C#/WinForms)

The sample shows how to create videos from a collection of images using the VisioForge Video Capture SDK .NET. This example demonstrates initializing the video capture engine, loading images from a specified folder, and configuring video output settings, including resolution and frame duration.
It features event handling for errors and video frame bitmap processing, allowing the dynamic composition of video frames from still images. The GUI provides options for selecting input folders and output file paths, making it easy to create custom video compilations.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Video From Images Demo\readme.md

# VisioForge Video Capture SDK .Net

## Video From Images Demo (C#/WinForms)

The sample showcases how to create videos from a collection of images using the VisioForge Video Capture SDK .NET. This example demonstrates initializing the video capture engine, loading images from a specified folder, and configuring video output settings, including resolution and frame duration. It features event handling for errors and video frame bitmap processing, allowing for the dynamic composition of video frames from still images. The GUI provides options for selecting input folders and output file paths, making it user-friendly for creating custom video compilations.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Virtual Camera Streamer\readme.es.md

# VisioForge Video Capture SDK .Net

## Virtual Camera Streamer Demo (C#/WinForms)

The sample offers a comprehensive solution for integrating video/audio streaming to Virtual Camera SDK virtual devices.
It provides a feature-rich interface for managing video and audio input devices, including the selection of input formats, frame rates, and device-specific settings.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Virtual Camera Streamer\readme.md

# VisioForge Video Capture SDK .Net

## Virtual Camera Streamer Demo (C#/WinForms)

The sample offers a comprehensive solution for integrating video/audio streaming to Virtual Camera SDK virtual devices. It provides a feature-rich interface for managing video and audio input devices, including the selection of input formats, frame rates, and device-specific settings.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Window Capture\readme.es.md

# VisioForge Video Capture SDK .Net

## Window Capture Demo (C#/WinForms)

The provided code snippet can be used to capture and record screen/window content in various video formats, including AVI, WMV, MP4, MPEGTS, MOV, and GIF, using the VisioForge Video Capture SDK .Net. The window can be set by title or by handle (HWND). It showcases the implementation of a WinForms application that leverages the SDK to initialize and configure video capture settings, handle output format selection, and manage the video capture process, including start and stop functionality.
The code also demonstrates the integration of custom dialogs for configuring output settings specific to different formats and managing screen capture source settings, highlighting the SDK's flexibility in capture and recording tasks.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\CSharp\Window Capture\readme.md

# VisioForge Video Capture SDK .Net

## Window Capture Demo (C#/WinForms)

The provided code snippet can be used to capture and record screen/window content in various video formats, including AVI, WMV, MP4, MPEGTS, MOV, and GIF, using the VisioForge Video Capture SDK .Net. The window can be set by title or handle (HWND). It showcases the implementation of the WinForms application that leverages the SDK to initialize and configure video capture settings, handle output format selection, and manage the video capture process, including start and stop functionality. The code also demonstrates the integration of custom dialogs for configuring specific output settings for different formats and handling screen capture source settings, emphasizing the SDK's flexibility in capturing and recording tasks.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\VB.Net\Audio Capture\readme.es.md

# VisioForge Video Capture SDK .Net

## Audio Capture Demo (VB.Net/WinForms)

This sample shows how to use the VisioForge Video Capture SDK .Net to create an advanced audio recording application in VB.NET.
The application allows selecting audio input devices and formats and configuring audio effects such as amplification, equalization, true bass, pitch shift, and 3D sound. It supports several output formats, including MP3, WMA, OGG, FLAC, and M4A, with customizable settings for each format. Additionally, the application offers real-time audio processing features and displays the recording time. The code includes event handlers for capturing audio frames, handling errors, and managing audio device settings, demonstrating the SDK's versatility for audio-related projects.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\VB.Net\Audio Capture\readme.md

# VisioForge Video Capture SDK .Net

## Audio Capture Demo (VB.Net/WinForms)

This sample demonstrates how to use the VisioForge Video Capture SDK .Net to build an advanced audio recording application in VB.NET. The application allows you to select audio input devices and formats and configure audio effects like amplification, equalization, true bass, pitch shift, and 3D sound. It supports multiple output formats, including MP3, WMA, OGG, FLAC, and M4A, with customizable settings for each format. Additionally, the application offers real-time audio processing capabilities and displays the recording time. The code includes event handlers for capturing audio frames, handling errors, and managing audio device settings, showcasing the SDK's versatility for audio-related projects.
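An audio-only capture session of the kind described above can be sketched as follows. C# is shown for consistency with the other snippets in this set (the VB.NET demo drives the same API), and the identifiers (`AudioCapture` mode, `Audio_RecordAudio`, `MP3Output`) are assumptions modeled on the SDK's documented patterns, not verified signatures.

```csharp
// Sketch: audio-only recording to MP3.
// Identifiers are assumptions based on the SDK's usual API shape -
// consult the Audio Capture demo source for the real member names.
private async Task RecordAudioAsync(VideoCaptureCore capture, string deviceName)
{
    capture.Mode = VideoCaptureMode.AudioCapture;    // no video pipeline
    capture.Audio_RecordAudio = true;
    capture.Audio_CaptureDevice = new AudioCaptureSource(deviceName);

    capture.Output_Format = new MP3Output();         // hypothetical MP3 encoder settings
    capture.Output_Filename = "recording.mp3";

    await capture.StartAsync();                      // capture runs until StopAsync
}
```

The effect chain (amplification, equalizer, true bass, and so on) would typically be registered on the engine before `StartAsync`, which is why the demo exposes its effect controls only while the engine is idle or updates them through dedicated update calls.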
## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\VB.Net\DV Capture\readme.es.md

# VisioForge Video Capture SDK .Net

## DV Capture Demo (VB.Net/WinForms)

The provided code showcases an advanced video capture application developed using the VisioForge Video Capture SDK .Net. This application features a comprehensive user interface for configuring various video capture settings, including device selection, format adjustments, and applying video effects. It supports capturing from video devices, recording audio, and saving the output in several formats, such as AVI, WMV, MP4, MPEG-TS, MOV, and GIF. The code also demonstrates handling capture device settings dialogs, applying video effects in real time, and implementing a recording timer. Users can customize output settings through dialogs for specific formats, including hardware-accelerated encoding options for MP4 and MPEG-TS. Additionally, the application offers functionality for taking screenshots during the capture process and includes a log for capturing events and errors.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\VB.Net\DV Capture\readme.md

# VisioForge Video Capture SDK .Net

## DV Capture Demo (VB.Net/WinForms)

The provided code showcases an advanced video capture application developed using the VisioForge Video Capture SDK .Net.
This application features a comprehensive user interface for configuring various video capture settings, including device selection, format adjustments, and applying video effects. It supports capturing from video devices, recording audio, and saving the output in multiple formats such as AVI, WMV, MP4, MPEG-TS, MOV, and GIF. The code also demonstrates the handling of capture device settings dialogs, real-time video effects application, and the implementation of a recording timer. Users can customize output settings through dialogs for specific formats, including hardware-accelerated encoding options for MP4 and MPEG-TS. Additionally, the application offers functionality for taking screenshots during the capture process and includes a log for capturing events and errors.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\VB.Net\IP Capture\readme.es.md

# VisioForge Video Capture SDK .Net

## IP Capture Demo (VB.Net/WinForms)

This SDK sample shows how to implement advanced video capture features using the VisioForge Video Capture SDK .Net in a Windows Forms application. It showcases the creation and management of various output formats (e.g., MP4, AVI, WMV, GIF), hardware encoding settings dialogs, and video effects (e.g., lightness, saturation, contrast). Additionally, it includes handling IP cameras with ONVIF support, capturing screenshots, and adding text or image logos to the video. The sample provides a comprehensive UI for configuring video capture settings, starting/stopping capture, and dynamically adjusting video effects.
It also demonstrates the integration of network source management, including ONVIF camera connection and handling network source disconnections.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\VB.Net\IP Capture\readme.md

# VisioForge Video Capture SDK .Net

## IP Capture Demo (VB.Net/WinForms)

This SDK sample demonstrates how to implement advanced video capture features using the VisioForge Video Capture SDK .Net in a Windows Forms application. It showcases the creation and management of various output formats (e.g., MP4, AVI, WMV, GIF), hardware encoding settings dialogs, and video effects (e.g., lightness, saturation, contrast). Additionally, it includes handling IP cameras with ONVIF support, capturing screenshots, and adding text or image logos to the video. The sample provides a comprehensive UI for configuring video capture settings, starting/stopping capture, and dynamically adjusting video effects. It also demonstrates the integration of network source management, including ONVIF camera connection and handling network source disconnections.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\VB.Net\Main Demo\readme.es.md

# VisioForge Video Capture SDK .Net

## Main Demo (VB.Net/WinForms)

The demo shows most of the functionality of Video Capture SDK .Net, using VB.Net.
You can:

* preview or capture video from webcams, IP cameras, screens, Decklink devices, and some other sources
* apply video and audio effects
* perform network streaming
* save video and audio to various output formats
* apply OSD
* use Picture-in-Picture
* detect motion
* recognize barcodes
* many other features are available

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\VB.Net\Main Demo\readme.md

# VisioForge Video Capture SDK .Net

## Main Demo (VB.Net/WinForms)

The demo shows most of the functionality of Video Capture SDK .Net, using VB.Net. You can:

* preview or capture video from webcams, IP cameras, screens, Decklink devices, and some other sources
* apply video and audio effects
* perform network streaming
* save video and audio to various output formats
* apply OSD
* use Picture-in-Picture
* detect motion
* recognize barcodes
* many other features are available

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\VB.Net\Screen Capture\readme.es.md

# VisioForge Video Capture SDK .Net

## Screen Capture Demo (VB.Net/WinForms)

This sample demonstrates the use of the VisioForge Video Capture SDK .Net to create a comprehensive video capture and processing application in VB.NET. It showcases the setup of various video and audio capture settings, including screen capture, audio input configuration, video effects, and output format selection (AVI, WMV, MP4, GIF, etc.).
The application provides functionality for starting, pausing, resuming, and stopping video capture, as well as configuring hardware encoding settings for different formats. Additionally, it features screen capture from specific windows or fullscreen, audio capture settings customization, real-time video effects application, image and text overlay, and screenshot saving capabilities.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\VB.Net\Screen Capture\readme.md

# VisioForge Video Capture SDK .Net

## Screen Capture Demo (VB.Net/WinForms)

This sample demonstrates the use of the VisioForge Video Capture SDK .Net to create a comprehensive video capture and processing application in VB.NET. It showcases the setup of various video and audio capture settings, including screen capture, audio input configuration, video effects, and output format selection (AVI, WMV, MP4, GIF, etc.). The application provides functionality for starting, pausing, resuming, and stopping video capture, as well as configuring hardware encoding settings for different formats. Additionally, it features screen capture from specific windows or fullscreen, audio capture settings customization, real-time video effects application, image and text overlay, and screenshot saving capabilities.
## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\VB.Net\Simple Video Capture\readme.es.md

# VisioForge Video Capture SDK .Net

## Simple Video Capture Demo (VB.Net/WinForms)

This sample demonstrates the integration and use of the VisioForge Video Capture SDK .Net in a VB.NET application to capture, process, and save video and audio streams. It includes setting up audio and video devices, configuring various audio effects (such as amplification, equalization, and bass boost), and selecting output formats (AVI, WMV, MP4, etc.). The sample also shows how to apply video effects, handle video capture settings, and manage file output through dialogs. Advanced features such as hardware-accelerated encoding, audio visualization, and frame capture are also covered, providing a comprehensive example of the SDK's capabilities for building complex video capture and processing applications.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinForms\VB.Net\Simple Video Capture\readme.md

# VisioForge Video Capture SDK .Net

## Simple Video Capture Demo (VB.Net/WinForms)

This sample demonstrates the integration and use of the VisioForge Video Capture SDK .Net in a VB.NET application to capture, process, and save video and audio streams. It includes setting up audio and video devices, configuring various audio effects (like amplification, equalization, and bass boost), and selecting output formats (AVI, WMV, MP4, etc.).
The sample also showcases how to apply video effects, handle video capture settings, and manage file output through dialogs. Advanced features like hardware-accelerated encoding, audio visualization, and frame capture are also covered, providing a comprehensive example of the SDK's capabilities for building complex video capture and processing applications.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinUI\CSharp\Simple Video Capture Demo WinUI\readme.es.md

# VisioForge Video Capture SDK .Net

## Simple Video Capture Demo (C#/WinUI)

The provided code snippet is for a simple video capture application using the VisioForge Video Capture SDK .Net and WinUI 3 for desktop development. This application showcases the integration of video capture functionality, including device selection, video effects (e.g., lightness, saturation, contrast), audio settings adjustment, and output format customization. Users can select video and audio devices, configure their properties, apply various video effects, and choose the desired output format for their recordings, such as AVI, WMV, MP4, GIF, TS, or MOV. The sample also demonstrates handling user interactions for starting, pausing, resuming, and stopping video capture, along with saving captured videos to a specified location.
## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WinUI\CSharp\Simple Video Capture Demo WinUI\readme.md

# VisioForge Video Capture SDK .Net

## Simple Video Capture Demo (C#/WinUI)

The provided code snippet is for a simple video capture application using the VisioForge Video Capture SDK .Net and WinUI 3 for desktop development. This application showcases the integration of video capture functionality, including device selection, video effects (e.g., lightness, saturation, contrast), audio settings adjustment, and output format customization. Users can select video and audio devices, configure their properties, apply various video effects, and choose the desired output format for their recordings, such as AVI, WMV, MP4, GIF, TS, or MOV. The sample also demonstrates handling user interactions for starting, pausing, resuming, and stopping video capture, along with saving captured videos to a specified location.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\Audio_Capture\readme.es.md

# VisioForge Video Capture SDK .Net

## Audio Capture Demo (C#/WPF)

The provided code snippet outlines a comprehensive audio capture application built using the VisioForge Video Capture SDK .Net. This application showcases various capabilities, including the initialization and configuration of audio capture devices, the application of audio effects (such as amplification, equalization, 3D sound, and true bass), and the selection of audio output formats (ACM, MP3, WMA, OGG Vorbis, FLAC, Speex, and M4A).
It features a graphical user interface for user interactions such as choosing audio devices, adjusting audio settings, and selecting output formats. Additionally, the application supports audio preview and recording modes, and it offers dialogs for configuring codec-specific settings. The code emphasizes real-time audio processing and recording, showcasing the VisioForge SDK's flexibility in handling different audio sources and formats.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\Audio_Capture\readme.md

# VisioForge Video Capture SDK .Net

## Audio Capture Demo (C#/WPF)

The provided code snippet outlines a comprehensive audio capture application built using the VisioForge Video Capture SDK .Net. This application showcases various capabilities, including the initialization and configuration of audio capture devices, audio effects application (such as amplification, equalization, 3D sound, and true bass), and the selection of audio output formats (ACM, MP3, WMA, OGG Vorbis, FLAC, Speex, and M4A). It features a graphical user interface for user interactions, such as choosing audio devices, adjusting audio settings, and selecting output formats. Additionally, the application supports audio preview and recording modes, and it offers dialogs for configuring codec-specific settings. The code emphasizes real-time audio processing and recording, showcasing the VisioForge SDK's flexibility in handling different audio sources and formats.
## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\DV_Capture\readme.es.md

# VisioForge Video Capture SDK .Net

## DV Capture Demo (C#/WPF)

The provided code snippet is for a Windows application using the VisioForge Video Capture SDK .Net, specifically tailored for capturing and processing video streams. This application showcases various features such as video capture from DV camcorders, video effect manipulation, and outputs such as MP4, AVI, WMV, GIF, etc. It includes direct control over video playback and recording features such as play, pause, stop, rewind, and fast forward. Additionally, the application demonstrates handling of audio and video inputs/outputs, including volume and balance adjustments, selection of video input devices and formats, and real-time video effects such as grayscale, saturation, flip, and deinterlace filters.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\DV_Capture\readme.md

# VisioForge Video Capture SDK .Net

## DV Capture Demo (C#/WPF)

The provided code snippet is for a Windows application using VisioForge Video Capture SDK .Net, specifically tailored for capturing and processing video streams. This application showcases various features such as video capture from DV camcorders, video effect manipulation, and outputs like MP4, AVI, WMV, GIF, and more. It includes direct control over video playback and recording features such as play, pause, stop, rewind, and fast forward.
Additionally, the application demonstrates handling of audio and video inputs/outputs, including volume and balance adjustments, selection of video input devices and formats, and real-time video effects such as grayscale, saturation, flip, and deinterlace filters.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\IP MJPEG Restreamer\readme.es.md

# VisioForge Video Capture SDK .Net

## IP MJPEG Restreamer Demo (C#/WPF)

The sample provides a comprehensive solution for integrating IP camera streaming and ONVIF support into .NET applications. It features a customizable window interface for viewing live video feeds, supports various IP camera source types (RTSP, RTMP, HTTP, UDP, HLS) and decoding engines (including VLC and FFMPEG), and offers network streaming capabilities in MJPEG format. Additionally, the SDK enables ONVIF camera control, allowing for camera discovery, profile selection, and connection management. With features like video capture, error logging, and network source monitoring, developers can easily implement advanced video streaming and control functionalities in their applications.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\IP MJPEG Restreamer\readme.md

# VisioForge Video Capture SDK .Net

## IP MJPEG Restreamer Demo (C#/WPF)

The sample provides a comprehensive solution for integrating IP camera streaming and ONVIF support into .NET applications.
It features a customizable window interface for viewing live video feeds, supports various IP camera source types (RTSP, RTMP, HTTP, UDP, HLS) and decoding engines (including VLC and FFMPEG), and offers network streaming capabilities in MJPEG format. Additionally, the SDK enables ONVIF camera control, allowing for camera discovery, profile selection, and connection management. With features like video capture, error logging, and network source monitoring, developers can easily implement advanced video streaming and control functionalities in their applications.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\IP_Capture\readme.es.md

# VisioForge Video Capture SDK .Net

## IP Capture Demo (C#/WPF)

The sample provides a comprehensive solution for integrating IP camera streaming and ONVIF support into .NET applications. It features a customizable window interface for viewing live video feeds, supports various IP camera source types (RTSP, RTMP, HTTP, UDP, HLS) and decoding engines (including VLC and FFMPEG), and offers network streaming capabilities in MJPEG format. Additionally, the SDK enables ONVIF camera control, allowing for camera discovery, profile selection, and connection management. With features like video capture, error logging, and network source monitoring, developers can easily implement advanced video streaming and control functionalities in their applications.
## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\IP_Capture\readme.md

# VisioForge Video Capture SDK .Net

## IP Capture Demo (C#/WPF)

The provided code showcases an advanced .NET application for IP camera video capturing and processing, utilizing the VisioForge Video Capture SDK .Net. It features a comprehensive GUI for configuring video capture settings, including output format selection (e.g., MP4, AVI, GIF), hardware encoder configurations, and video effect adjustments. The application supports the ONVIF protocol for IP cameras, enabling functionalities like PTZ (Pan, Tilt, Zoom) controls. It also offers features for taking screenshots, adjusting recording settings, and applying video effects such as grayscale, contrast, and saturation. The code demonstrates asynchronous task management for initializing the video capture engine and handling events like errors and network source disconnections, ensuring a responsive user experience.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\Main_Demo\readme.es.md

# VisioForge Video Capture SDK .Net

## Main Demo (C#/WPF)

The demo shows the most important features of Video Capture SDK .Net.
You can:

* preview or capture video from webcams, IP cameras, screens, Decklink devices, and some other sources
* apply video and audio effects
* perform network streaming
* save video and audio to various output formats
* apply OSD
* use Picture-in-Picture
* detect motion
* recognize barcodes
* many other features are available

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\Main_Demo\readme.md

# VisioForge Video Capture SDK .Net

## Main Demo (C#/WPF)

The demo shows the most important features of Video Capture SDK .Net. You can:

* preview or capture video from webcams, IP cameras, screens, Decklink devices, and some other sources
* apply video and audio effects
* perform network streaming
* save video and audio to various output formats
* apply OSD
* use Picture-in-Picture
* detect motion
* recognize barcodes
* many other features are available

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\Multiple IP Cameras Demo\readme.es.md

# VisioForge Video Capture SDK .Net

## Multiple IP Cameras Demo (C#/WPF)

The "Multiple IP Cameras Demo WPF" sample demonstrates how to integrate multiple IP camera feeds into a single WPF application using the VisioForge Video Capture SDK .Net. It features asynchronous initialization of a video capture engine, allowing for the capture and preview of video streams from up to four IP cameras simultaneously.
Users can select different source engines (e.g., VLC, FFMPEG, RTSP Low Latency) for each camera feed, adjust ONVIF support, and toggle between preview and recording modes. The application includes error handling to log issues during capture and provides a clean-up mechanism to dispose of resources upon closing, showcasing best practices for resource management in .NET applications.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\Multiple IP Cameras Demo\readme.md

# VisioForge Video Capture SDK .Net

## Multiple IP Cameras Demo (C#/WPF)

The "Multiple IP Cameras Demo WPF" sample demonstrates how to integrate multiple IP camera feeds into a single WPF application using the VisioForge Video Capture SDK .Net. It features asynchronous initialization of a video capture engine, allowing for the capture and preview of video streams from up to four IP cameras simultaneously. Users can select different source engines (e.g., VLC, FFMPEG, RTSP Low Latency) for each camera feed, adjust ONVIF support, and toggle preview or recording modes. The application includes error handling to log issues during capture and provides a clean-up mechanism to dispose of resources upon closing, showcasing best practices for resource management in .NET applications.
## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\MultipleWebCameras\readme.es.md

# VisioForge Video Capture SDK .Net

## Multiple Web Cameras Demo (C#/WPF)

The sample demonstrates how to integrate and control multiple web cameras within a single application using the VisioForge Video Capture SDK .Net. This example showcases the initialization and disposal of two video capture instances, `VideoCapture1` and `VideoCapture2`, within a WPF application. It includes asynchronous start and stop methods for each camera, error handling through event subscription, and UI interaction to select camera devices and display video previews. The code provides a practical example of managing multiple video capture sources, adjusting video capture settings for optimal performance, and handling possible errors during video capture operations.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\MultipleWebCameras\readme.md

# VisioForge Video Capture SDK .Net

## Multiple Web Cameras Demo (C#/WPF)

The sample demonstrates how to integrate and control multiple web cameras within a single application using the VisioForge Video Capture SDK .Net. This example showcases the initialization and disposal of two video capture instances, `VideoCapture1` and `VideoCapture2`, within a WPF application. It includes asynchronous start and stop methods for each camera, error handling through event subscription, and UI interaction to select camera devices and display video previews.
The code provides a practical example of managing multiple video capture sources, adjusting video capture settings for optimal performance, and handling possible errors during video capture operations.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\Nvidia Maxine Demo\readme.es.md

# VisioForge Video Capture SDK .Net

## Nvidia Maxine Demo (C#/WPF)

This sample demonstrates the integration and utilization of the Nvidia Maxine video effects within a .NET application using Video Capture SDK .Net, focusing on enhancing video capture capabilities. The sample, structured around a WPF application, showcases the setup and management of video capture devices, audio input and output configurations, and the application of various Nvidia Maxine video effects such as denoising, artifact reduction, upscaling, and super resolution. Additionally, it handles asynchronous initialization of the video capture engine, event-driven error logging, and UI updates based on video processing results. The sample is designed to provide developers with a comprehensive understanding of implementing advanced video processing features in their applications.
## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\Nvidia Maxine Demo\readme.md

# VisioForge Video Capture SDK .Net

## Nvidia Maxine Demo (C#/WPF)

This sample demonstrates the integration and utilization of the Nvidia Maxine video effects within a .NET application using Video Capture SDK .Net, focusing on enhancing video capture capabilities. The sample, structured around a WPF application, showcases the setup and management of video capture devices, audio input and output configurations, and the application of various Nvidia Maxine video effects such as denoising, artifact reduction, upscaling, and super resolution. Additionally, it handles asynchronous initialization of the video capture engine, event-driven error logging, and UI updates based on video processing results. The sample is designed to provide developers with a comprehensive understanding of implementing advanced video processing features in their applications.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\Screen_Capture\readme.es.md

# VisioForge Video Capture SDK .Net

## Screen Capture Demo (C#/WPF)

The provided code snippet showcases a comprehensive implementation of a screen capture utility using the VisioForge Video Capture SDK .Net. It demonstrates how to initialize and configure the video capture environment, including setting up various output formats (e.g., AVI, WMV, MP4, GIF), capturing audio alongside video, and applying video effects (e.g., grayscale, saturation adjustment).
The application allows capturing the entire screen, specific windows, or regions, with options to include the mouse cursor and highlight it. It also features dialogs for configuring output settings for different formats, saving screenshots, and managing video effects such as logo or text overlays. This example is designed for developers looking to integrate advanced screen capture capabilities into their applications, offering a rich set of features to control and customize the video capture process.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\Screen_Capture\readme.md

# VisioForge Video Capture SDK .Net

## Screen Capture Demo (C#/WPF)

The provided code snippet showcases a comprehensive implementation of a screen capture utility using the VisioForge Video Capture SDK .Net. It demonstrates how to initialize and configure the video capture environment, including setting up various output formats (e.g., AVI, WMV, MP4, GIF), capturing audio alongside video, and applying video effects (e.g., grayscale, saturation adjustment). The app allows capturing the entire screen, specific windows, or regions, with options to include the mouse cursor and highlight it. It also features dialogs for configuring output settings for different formats, saving screenshots, and managing video effects like logos or text overlays. This example is designed for developers looking to integrate advanced screen capture capabilities into their applications, offering a rich set of features to control and customize the video capture process.
## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\Simple Video Capture\readme.es.md

# VisioForge Video Capture SDK .Net

## Simple Video Capture Demo (C#/WPF)

The sample demonstrates a comprehensive approach to video recording and processing within a WPF application. It utilizes the VisioForge Video Capture SDK .Net and processing library to handle various aspects of video capture, including device selection, format settings, and the application of video effects. The sample showcases features such as adjusting audio and video settings, capturing screenshots, and applying filters like grayscale, contrast adjustment, and text or image overlay. Additionally, it provides support for different output formats such as AVI, WMV, MP4, and GIF, among others. The code is structured to offer a modular approach to integrating video capture capabilities, emphasizing ease of customization and extension for developers.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\Simple Video Capture\readme.md

# VisioForge Video Capture SDK .Net

## Simple Video Capture Demo (C#/WPF)

The sample demonstrates a comprehensive approach to video recording and processing within a WPF application. It utilizes the VisioForge Video Capture SDK .Net and processing library to handle various aspects of video capture, including device selection, format settings, and applying video effects.
The sample showcases functionality such as adjusting audio and video settings, capturing screenshots, and applying filters like grayscale, contrast adjustment, and text or image overlay. Additionally, it provides support for different output formats such as AVI, WMV, MP4, and GIF, among others. The code is structured to offer a modular approach to integrating video capture capabilities, emphasizing ease of customization and extension for developers.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\Simple Video Capture MVVM\readme.es.md

# VisioForge Video Capture SDK .Net

## Simple Video Capture Demo (C#/WPF MVVM)

El código proporcionado esboza una muestra para una aplicación sencilla de captura de vídeo utilizando el VisioForge Video Capture SDK .Net. Demuestra la configuración y el uso de dispositivos de captura de vídeo y audio, la configuración de las propiedades del dispositivo, las opciones de grabación y la aplicación de efectos de vídeo como ajustes de escala de grises, inversión, volteo, contraste, luminosidad y saturación. Además, incluye funciones para añadir y editar logotipos de imagen y texto, gestionar formatos y archivos de salida y manejar estados de previsualización y grabación con actualizaciones en tiempo real de la duración de la grabación. La muestra está estructurada dentro de una clase `MainWindowViewModel`, aprovechando el patrón MVVM de Prism para propiedades y comandos vinculables, mostrando un enfoque integral para construir una interfaz de captura de vídeo en una aplicación .NET.

## Versiones de .Net compatibles

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\Simple Video Capture MVVM\readme.md

# VisioForge Video Capture SDK .Net

## Simple Video Capture Demo (C#/WPF MVVM)

The provided code outlines a sample for a simple video capture application using the VisioForge Video Capture SDK .Net. It demonstrates the setup and use of video and audio capture devices, configuring device properties, recording options, and applying video effects such as grayscale, invert, flip, contrast, lightness, and saturation adjustments. Additionally, it includes functionality for adding and editing image and text logos, managing output formats and files, and handling preview and recording states with real-time updates on recording duration. The sample is structured within a `MainWindowViewModel` class, leveraging Prism's MVVM pattern for bindable properties and commands, showcasing a comprehensive approach to building a video capture interface in a .NET application.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\Skinned Capture\readme.es.md

# VisioForge Video Capture SDK .Net

## Skinned Video Capture Demo (C#/WPF)

El ejemplo del SDK muestra cómo implementar una aplicación de captura de vídeo con skins utilizando VisioForge Video Capture SDK .Net en un entorno WPF. La aplicación muestra la integración de skins personalizados para la interfaz de usuario, el manejo de comandos básicos de ventana (minimizar, maximizar, restaurar, cerrar), y las funcionalidades de captura de vídeo, incluyendo el cambio entre cámaras, alternando el modo de pantalla completa, y la aplicación de efectos de vídeo. El código también hace hincapié en el manejo de eventos para errores de captura de vídeo y la implementación de un patrón `Dispose` para la gestión de recursos. Este ejemplo es una guía completa para los desarrolladores que deseen incorporar funciones de captura de vídeo con una interfaz de usuario personalizable en sus aplicaciones .NET.

## Versiones de .Net compatibles

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\WPF\CSharp\Skinned Capture\readme.md

# VisioForge Video Capture SDK .Net

## Skinned Video Capture Demo (C#/WPF)

The SDK sample demonstrates how to implement a skinned video capture application using VisioForge Video Capture SDK .Net in a WPF environment. The application showcases the integration of custom skins for the user interface, handling of basic window commands (minimize, maximize, restore, close), and video capture functionalities, including switching between cameras, toggling fullscreen mode, and applying video effects. The code also emphasizes event handling for video capture errors and the implementation of a `Dispose` pattern for resource management. This example is a comprehensive guide for developers looking to incorporate video capture capabilities with a customizable UI in their .NET applications.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\face-detection\readme.es.md

# VisioForge Video Capture SDK .Net - Face detection code snippet (C#/WinForms)

El fragmento de código proporcionado es un ejemplo de integración de la funcionalidad de detección facial en una aplicación Windows Forms utilizando el SDK .Net de VisioForge Video Capture. Demuestra la configuración de un entorno de captura de vídeo en el que se selecciona una fuente de vídeo, se configuran los formatos de entrada de vídeo y las frecuencias de cuadro, y se aplican los ajustes de detección de caras. La aplicación permite la detección de rostros en tiempo real dentro del flujo de vídeo, mostrando las coordenadas del rostro detectado en la interfaz. Además, el código gestiona eventos como errores y resultados de la detección de caras y proporciona controles para iniciar, detener, pausar y reanudar la captura de vídeo, junto con opciones de depuración para los desarrolladores.

## Versiones de .Net compatibles

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\face-detection\readme.md

# VisioForge Video Capture SDK .Net - Face detection code snippet (C#/WinForms)

The provided code snippet is an example of integrating face detection functionality into a Windows Forms application using the VisioForge Video Capture SDK .Net. It demonstrates the setup of a video capture environment where a video source is selected, video input formats and frame rates are configured, and face detection settings are applied.
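The face detection setup described above might look like the following. This is a heavily hedged sketch: the face-tracking member names (`Face_Tracking`, `FaceTrackingSettings`, `OnFaceDetected` and its event-args shape) are assumptions based on the SDK's naming conventions and may differ in your SDK version.

```csharp
using System;
using System.Threading.Tasks;
// Namespace layout matches recent VisioForge.Core packages; may vary by version.
using VisioForge.Core.Types;
using VisioForge.Core.Types.VideoCapture;
using VisioForge.Core.VideoCapture;

public static class FaceDetectionSketch
{
    public static async Task StartPreviewWithFaceDetectionAsync(VideoCaptureCore core)
    {
        // Device name is illustrative; enumerate real devices in production code.
        core.Video_Source = new VideoCaptureSource("USB Camera");
        core.Mode = VideoCaptureMode.VideoPreview;

        // Enable face tracking (assumed API).
        core.Face_Tracking = new FaceTrackingSettings();

        // Report detected face rectangles (assumed event signature).
        core.OnFaceDetected += (sender, args) =>
        {
            foreach (var face in args.FacePositions)
            {
                Console.WriteLine($"Face at {face.Left}, {face.Top}");
            }
        };

        await core.StartAsync();
    }
}
```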
The application allows for real-time face detection within the video stream, displaying detected face coordinates in the interface. Additionally, the code handles events such as errors and face detection results, and provides controls for starting, stopping, pausing, and resuming video capture, along with debugging options for developers.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\ip-camera-capture-mp4\readme.es.md

# Video Capture SDK .Net - IP camera capture to MP4 code snippet (C#/WinForms)

Este ejemplo del SDK muestra cómo capturar secuencias de vídeo de cámaras IP y guardarlas como archivos MP4 utilizando el SDK de captura de vídeo VisioForge .Net. El código de ejemplo, escrito en C#, configura una sencilla aplicación Windows Forms para conectarse a una cámara IP utilizando su URL, configurar la captura de vídeo sin audio y grabar el flujo de vídeo en la carpeta "Mis vídeos" del usuario. Incluye funcionalidad para iniciar y detener la captura de vídeo de forma asíncrona, mostrando el uso de la API VisioForge para tareas de captura de vídeo, incluyendo la configuración de la fuente de vídeo a una cámara IP, especificando el formato de salida, y el control de la sesión de captura con métodos de inicio y parada.

## Versiones de .Net compatibles

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\ip-camera-capture-mp4\readme.md

# Video Capture SDK .Net - IP camera capture to MP4 code snippet (C#/WinForms)

This SDK sample demonstrates how to capture video streams from IP cameras and save them as MP4 files using the VisioForge Video Capture SDK .Net.
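The IP-camera-to-MP4 flow might be sketched like this, assuming the `VideoCaptureCore` API. The RTSP URL is a made-up example, and the `IPCameraSourceSettings` initialization follows the SDK's documented pattern but property names may differ between versions (older builds take the URL as a string rather than a `Uri`).

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
// Namespace layout matches recent VisioForge.Core packages; may vary by version.
using VisioForge.Core.Types;
using VisioForge.Core.Types.Output;
using VisioForge.Core.Types.VideoCapture;
using VisioForge.Core.VideoCapture;

public static class IpCameraCaptureSketch
{
    public static async Task RecordIpCameraAsync(VideoCaptureCore core)
    {
        // Hypothetical camera URL; substitute your camera's RTSP/HTTP endpoint.
        core.IP_Camera_Source = new IPCameraSourceSettings
        {
            URL = new Uri("rtsp://192.168.0.10:554/stream1")
        };

        // Video only, as in the snippet: no audio recording or playback.
        core.Audio_RecordAudio = false;
        core.Audio_PlayAudio = false;

        core.Mode = VideoCaptureMode.IPCapture;
        core.Output_Format = new MP4Output();
        core.Output_Filename = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.MyVideos), "ip_camera.mp4");

        await core.StartAsync();
        // ... later:
        await core.StopAsync();
    }
}
```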
The example code, written in C#, sets up a simple Windows Forms application to connect to an IP camera using its URL, configure video capture without audio, and record the video stream to the user's "My Videos" folder. It includes functionality to start and stop video capture asynchronously, showcasing the use of the VisioForge API for video capture tasks, including setting the video source to an IP camera, specifying the output format, and controlling the capture session with start and stop methods.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\ip-camera-preview\readme.es.md

# Video Capture SDK .Net - IP camera preview code snippet (C#/WinForms)

Este ejemplo del SDK muestra cómo crear una sencilla aplicación Windows Forms para previsualizar imágenes de cámaras IP utilizando el SDK VisioForge Video Capture .Net. La aplicación inicializa un componente de captura de vídeo al cargar el formulario y lo configura para conectarse a una cámara IP utilizando una URL especificada. Soporta iniciar y detener la vista previa de la cámara de forma asíncrona con botones. El ejemplo muestra el uso de la API de VisioForge para manejar fuentes de cámaras IP, incluyendo la configuración de la URL de la cámara y la selección del motor de decodificación para un rendimiento óptimo.

## Versiones de .Net compatibles

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\ip-camera-preview\readme.md

# Video Capture SDK .Net - IP camera preview code snippet (C#/WinForms)

This SDK sample demonstrates how to create a simple Windows Forms application for previewing IP camera feeds using the VisioForge Video Capture SDK .Net. The application initializes a video capture component on form load and configures it to connect to an IP camera using a specified URL. It supports starting and stopping the camera preview asynchronously with buttons. The sample showcases the use of the VisioForge API for handling IP camera sources, including setting up the camera's URL and selecting the decoding engine for optimal performance.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\screen-capture\readme.es.md

# Video Capture SDK .Net - Screen capture code snippet (C#/WinForms)

Este ejemplo del SDK muestra cómo crear una aplicación de captura de pantalla utilizando el VisioForge Video Capture SDK .Net. La muestra incluye la inicialización del motor de captura de vídeo, la configuración de los ajustes de captura de pantalla (como la selección del índice de visualización y la configuración de la velocidad de fotogramas), la gestión de la captura de audio y la gestión del formato de salida. Los usuarios pueden iniciar, pausar, reanudar y detener el proceso de captura. Además, muestra el manejo de errores y opciones de depuración, proporcionando un ejemplo completo para los desarrolladores interesados en la construcción de funciones de grabación de pantalla en sus aplicaciones.

## Versiones de .Net compatibles

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\screen-capture\readme.md

# Video Capture SDK .Net - Screen capture code snippet (C#/WinForms)

This SDK sample demonstrates how to create a screen capture application using the VisioForge Video Capture SDK .Net. The sample includes initializing the video capture engine, configuring screen capture settings (such as selecting the display index and setting the frame rate), handling audio capture, and managing the output format. Users can start, pause, resume, and stop the capture process. Additionally, it showcases error handling and debugging options, providing a comprehensive example for developers interested in building screen recording features into their applications.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\screen-capture-avi\readme.es.md

# Video Capture SDK .Net - Screen capture to AVI code snippet (C#/WinForms)

Este ejemplo de SDK muestra cómo implementar una sencilla aplicación de grabación de pantalla utilizando el SDK de Captura de Vídeo .Net de VisioForge en una aplicación Windows Forms. El código inicializa un objeto `VideoCaptureCore` para capturar toda la pantalla y guardar la grabación como un archivo AVI en la carpeta "Mis vídeos" del usuario. Configura la captura de vídeo utilizando el códec MJPEG para vídeo y PCM para audio en el formato de salida AVI. La aplicación proporciona controles básicos para iniciar y detener el proceso de captura de pantalla de forma asíncrona, mostrando la facilidad de integración de la funcionalidad de grabación de pantalla en aplicaciones .NET con el SDK de VisioForge.

## Versiones de .Net compatibles

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\screen-capture-avi\readme.md

# Video Capture SDK .Net - Screen capture to AVI code snippet (C#/WinForms)

This SDK sample demonstrates how to implement a simple screen recording application using VisioForge's Video Capture SDK .Net in a Windows Forms application. The code initializes a `VideoCaptureCore` object to capture the entire screen and save the recording as an AVI file in the user's "My Videos" folder. It configures the video capture using the MJPEG codec for video and PCM for audio in the AVI output format. The application provides basic controls to start and stop the screen capture process asynchronously, showcasing the ease of integrating screen recording functionality into .NET applications with VisioForge's SDK.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\screen-capture-mp4\readme.es.md

# Video Capture SDK .Net - Screen capture to MP4 code snippet (C#/WinForms)

Este ejemplo de SDK muestra cómo utilizar VisioForge Video Capture SDK .Net para crear una sencilla aplicación de grabación de pantalla en C#. La aplicación presenta dos funcionalidades principales: capturar la pantalla con audio y capturar la pantalla sin audio, ambas con salida del resultado como un archivo MP4. Inicializa el componente de captura de vídeo, configura los ajustes de captura de pantalla para la grabación a pantalla completa, selecciona un dispositivo de audio para la grabación (si es necesario), y controla el inicio y la parada del proceso de grabación de forma asíncrona. El ejemplo muestra la facilidad de configuración de la captura de pantalla, la captura de audio y las configuraciones del archivo de salida utilizando el SDK de VisioForge.

## Versiones de .Net compatibles

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\screen-capture-mp4\readme.md

# Video Capture SDK .Net - Screen capture to MP4 code snippet (C#/WinForms)

This SDK sample demonstrates how to use the VisioForge Video Capture SDK .Net to create a simple screen recording application in C#. The application features two main functionalities: capturing the screen with audio and capturing the screen without audio, both outputting the result as an MP4 file. It initializes the video capture component, configures screen capture settings for full-screen recording, selects an audio device for recording (if needed), and controls the start and stop of the recording process asynchronously. The example showcases the ease of setting up screen capture, audio capture, and output file configurations using the VisioForge SDK.
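The with/without-audio variants described above can be folded into one sketch. This assumes the `VideoCaptureCore` API; the audio device name is illustrative, and property names may differ between SDK versions.

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
// Namespace layout matches recent VisioForge.Core packages; may vary by version.
using VisioForge.Core.Types;
using VisioForge.Core.Types.Output;
using VisioForge.Core.Types.VideoCapture;
using VisioForge.Core.VideoCapture;

public static class ScreenToMp4Sketch
{
    public static async Task StartScreenCaptureAsync(VideoCaptureCore core, bool withAudio)
    {
        // Full-screen capture, as in the snippet.
        core.Screen_Capture_Source = new ScreenCaptureSourceSettings { FullScreen = true };
        core.Mode = VideoCaptureMode.ScreenCapture;

        // Toggle audio recording; select a device only when recording audio.
        core.Audio_RecordAudio = withAudio;
        if (withAudio)
        {
            // Device name is illustrative; enumerate real devices in production code.
            core.Audio_Source = new AudioCaptureSource("Microphone (USB Audio)");
        }

        core.Output_Format = new MP4Output();
        core.Output_Filename = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.MyVideos), "screen.mp4");

        await core.StartAsync();
        // Stop later with: await core.StopAsync();
    }
}
```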
## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\screen-capture-wmv\readme.es.md

# Video Capture SDK .Net - Screen capture to WMV code snippet (C#/WinForms)

Este ejemplo de SDK muestra cómo crear una sencilla aplicación Windows Forms para la captura de pantalla con el SDK de captura de vídeo de VisioForge. Los usuarios pueden iniciar y detener el proceso de captura de pantalla mediante botones, y el resultado se guarda en formato WMV en la carpeta "Mis vídeos" del usuario. El ejemplo destaca la facilidad de integrar la funcionalidad de captura de pantalla en aplicaciones .NET utilizando las completas librerías de VisioForge para la grabación y procesamiento de vídeo.

## Versiones de .Net compatibles

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\screen-capture-wmv\readme.md

# Video Capture SDK .Net - Screen capture to WMV code snippet (C#/WinForms)

This SDK sample demonstrates how to create a simple Windows Forms application for screen capturing with VisioForge's Video Capture SDK. Users can start and stop the screen capture process through buttons, with the output being saved in the WMV format to the user's "My Videos" folder. The example highlights the ease of integrating screen capture functionality into .NET applications using VisioForge's comprehensive libraries for video recording and processing.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\speaker-capture\readme.es.md

# Video Capture SDK .Net - Speaker capture code snippet (C#/WinForms)

Este ejemplo del SDK muestra cómo implementar la funcionalidad de captura de audio en una aplicación .NET Windows Forms utilizando el SDK VisioForge Video Capture .Net. El ejemplo muestra cómo configurar la captura de audio desde un dispositivo especificado, configurar el formato de salida y la ruta del archivo, y gestionar el proceso de captura con controles de inicio, parada, pausa y reanudación. Incluye controladores de eventos para registrar errores y actualizar la interfaz de usuario con la marca de tiempo de captura actual, lo que ilustra un enfoque práctico para integrar funciones de captura de audio en aplicaciones .NET.

## Versiones de .Net compatibles

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\speaker-capture\readme.md

# Video Capture SDK .Net - Speaker capture code snippet (C#/WinForms)

This SDK sample demonstrates how to implement audio capture functionality in a .NET Windows Forms application using the VisioForge Video Capture SDK .Net. The example showcases how to set up audio capture from a specified device, configure output format and file path, and manage the capture process with start, stop, pause, and resume controls. It includes event handlers to log errors and update the UI with the current capture timestamp, illustrating a practical approach to integrating audio capture capabilities into .NET applications.
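The audio-only capture with error logging described above might be sketched as follows, assuming the `VideoCaptureCore` API. The loopback device name, the `MP3Output` format, and the pause/resume method names follow the SDK's usual conventions but are assumptions that may differ in your SDK version.

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
// Namespace layout matches recent VisioForge.Core packages; may vary by version.
using VisioForge.Core.Types;
using VisioForge.Core.Types.Output;
using VisioForge.Core.Types.VideoCapture;
using VisioForge.Core.VideoCapture;

public static class AudioCaptureSketch
{
    public static async Task RecordSpeakersAsync(VideoCaptureCore core)
    {
        // Audio-only capture mode.
        core.Mode = VideoCaptureMode.AudioCapture;
        core.Audio_RecordAudio = true;

        // Loopback device name is illustrative; enumerate real devices first.
        core.Audio_Source = new AudioCaptureSource("Stereo Mix (Realtek)");

        core.Output_Format = new MP3Output();
        core.Output_Filename = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.MyMusic), "speakers.mp3");

        // Log errors, as the snippet does.
        core.OnError += (sender, args) => Console.WriteLine(args.Message);

        await core.StartAsync();
        // Pause/resume are part of the control surface described above
        // (assumed method names):
        await core.PauseAsync();
        await core.ResumeAsync();
        await core.StopAsync();
    }
}
```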
## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\video-capture-text-overlay\readme.es.md

# Video Capture SDK .Net - Video capture with text overlay code snippet (C#/WinForms)

Este ejemplo del SDK muestra cómo crear una sencilla aplicación de captura de vídeo utilizando el SDK de captura de vídeo VisioForge en C#. Muestra la inicialización de dispositivos de captura de vídeo y audio, la configuración del modo de captura y el formato de salida, y la adición de un efecto de superposición de texto a la secuencia de vídeo. El ejemplo incluye métodos para iniciar y detener la captura de vídeo de forma asíncrona. La función de superposición de texto permite mostrar texto personalizado ("¡Hola Mundo!") en el vídeo capturado, con la posición y los ajustes de color especificados. Este ejemplo está diseñado para formar parte de una aplicación Windows Forms, ilustrando la integración de las funcionalidades de captura y procesamiento de vídeo dentro de un entorno GUI.

## Versiones de .Net compatibles

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\video-capture-text-overlay\readme.md

# Video Capture SDK .Net - Video capture with text overlay code snippet (C#/WinForms)

This SDK sample demonstrates how to create a simple video capture application using the VisioForge Video Capture SDK in C#. It showcases initializing video and audio capture devices, setting the capture mode and output format, and adding a text overlay effect to the video stream. The sample includes methods for starting and stopping the video capture asynchronously.
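The overlay setup can be sketched as follows, assuming the `VideoCaptureCore` effects API. The `VideoEffectTextLogo` constructor shape and the position properties follow the SDK's usual pattern but may differ between versions.

```csharp
// Namespace layout matches recent VisioForge.Core packages; may vary by version.
using VisioForge.Core.Types.VideoEffects;
using VisioForge.Core.VideoCapture;

public static class TextOverlaySketch
{
    public static void AddTextOverlay(VideoCaptureCore core)
    {
        // Enable the effects pipeline before adding effects.
        core.Video_Effects_Enabled = true;

        // Text logo effect; constructor argument (enabled) is assumed.
        var text = new VideoEffectTextLogo(true)
        {
            Text = "Hello World!",
            Left = 20,
            Top = 20
        };

        // Register the effect with the engine; call before starting capture.
        core.Video_Effects_Add(text);
    }
}
```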
The text overlay feature allows displaying custom text ("Hello World!") on the captured video, with specified position and color settings. This example is designed to be part of a Windows Forms application, illustrating the integration of video capture and processing functionalities within a GUI environment. ## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\video-capture-webcam-avi\readme.es.md # Video Capture SDK .Net - Video capture to AVI code snippet (C#/WinForms) Este ejemplo de SDK muestra la implementación de una aplicación básica de captura de vídeo con webcam utilizando VisioForge Video Capture SDK .Net en una aplicación Windows Forms. El código inicializa el objeto `VideoCaptureCore`, configura las fuentes de vídeo y audio predeterminadas y especifica el formato y la ubicación del archivo de salida. Los usuarios pueden iniciar y detener la captura de vídeo con sólo pulsar un botón, capturar vídeo de la webcam y guardarlo como un archivo AVI con compresión de vídeo MJPEG y audio PCM. Este ejemplo muestra cómo utilizar la librería VisioForge para capturar vídeo desde dispositivos hardware, configurar los ajustes de captura y gestionar el proceso de captura de forma asíncrona. ## Versiones de .Net compatibles * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\video-capture-webcam-avi\readme.md # Video Capture SDK .Net - Video capture to AVI code snippet (C#/WinForms) This SDK sample demonstrates the implementation of a basic webcam video capture application using VisioForge Video Capture SDK .Net in a Windows Forms application. 
The code initializes the `VideoCaptureCore` object, sets up the default video and audio sources, and specifies the output file format and location. Users can start and stop video capture with the click of a button, capturing video from the webcam and saving it as an AVI file with MJPEG video compression and PCM audio. This example showcases how to use the VisioForge library to capture video from hardware devices, configure capture settings, and manage the capture process asynchronously. ## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\video-capture-webcam-mp4\readme.es.md # Video Capture SDK .Net - Video capture to MP4 code snippet (C#/WinForms) Este ejemplo del SDK muestra cómo implementar una aplicación básica de captura de vídeo utilizando el SDK VisioForge Video Capture SDK .Net en C#. La aplicación captura vídeo y audio de los dispositivos predeterminados y guarda la grabación como un archivo MP4. El ejemplo incluye la inicialización del núcleo de captura de vídeo, la configuración de las fuentes de vídeo y audio, la configuración del formato de salida y el inicio y la detención del proceso de captura con métodos asíncronos. Muestra la facilidad de integración de las capacidades de VisioForge en una aplicación Windows Forms para tareas de captura de vídeo. 
## Versiones de .Net compatibles * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\video-capture-webcam-mp4\readme.md # Video Capture SDK .Net - Video capture to MP4 code snippet (C#/WinForms) This SDK sample demonstrates how to implement a basic video capture application using the VisioForge Video Capture SDK .Net in C#. The application captures video and audio from the default devices and saves the recording as an MP4 file. The sample includes initializing the video capture core, setting the video and audio sources, configuring the output format, and starting and stopping the capture process with asynchronous methods. It showcases the ease of integrating VisioForge's capabilities into a Windows Forms application for video capture tasks. ## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\video-capture-webcam-wmv\readme.es.md # Video Capture SDK .Net - Video capture to WMV code snippet (C#/WinForms) Este ejemplo de SDK muestra cómo utilizar VisioForge Video Capture SDK .Net para crear una sencilla aplicación Windows Forms para capturar vídeo de una cámara web y guardarlo como un archivo WMV. El código inicializa los dispositivos de captura de vídeo y audio con su configuración predeterminada, establece la ruta del archivo de salida y especifica WMV como formato de salida. Presenta métodos de inicio y parada asíncronos para la captura de vídeo, mostrando cómo integrar la funcionalidad de captura de vídeo en aplicaciones .NET con facilidad. 
El ejemplo es ideal para desarrolladores que deseen implementar funciones de captura y procesamiento de vídeo de cámaras web en sus proyectos de software. ## Versiones de .Net compatibles * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\video-capture-webcam-wmv\readme.md # Video Capture SDK .Net - Video capture to WMV code snippet (C#/WinForms) This SDK sample demonstrates how to use the VisioForge Video Capture SDK .Net to create a simple Windows Forms application for capturing video from a webcam and saving it as a WMV file. The code initializes the video and audio capture devices with their default settings, sets the output file path, and specifies WMV as the output format. It features asynchronous start and stop methods for video capture, showcasing how to integrate video capture functionality into .NET applications with ease. The sample is ideal for developers looking to implement webcam video capture and processing features in their software projects. ## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\video-preview-webcam-frame-capture\readme.es.md # Video Capture SDK .Net - Video preview from a webcam with a frame capture code snippet (C#/WinForms) Este ejemplo del SDK muestra cómo crear una aplicación básica de Windows Forms que pueda previsualizar un vídeo desde una webcam, capturar fotogramas y guardarlos como imágenes JPEG. Proporciona funcionalidad para iniciar y detener la previsualización de vídeo desde un dispositivo webcam seleccionado y guardar el fotograma actual en la carpeta "Mis imágenes" del usuario. 
La muestra es un ejemplo sencillo de integración de la captura de vídeo con webcam y la extracción de fotogramas en aplicaciones .NET utilizando el kit de herramientas VisioForge. ## Versiones de .Net compatibles * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\video-preview-webcam-frame-capture\readme.md # Video Capture SDK .Net - Video preview from a webcam with a frame capture code snippet (C#/WinForms) This SDK sample demonstrates how to create a basic Windows Forms application that can preview a video from a webcam, capture frames, and save them as JPEG images. It provides functionality to start and stop video preview from a selected webcam device and save the current frame to the user's "My Pictures" folder. The sample is a straightforward example of integrating webcam video capture and frame extraction into .NET applications using the VisioForge toolkit. ## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\webcam-preview\readme.es.md # Video Capture SDK .Net - Video preview from a webcam code snippet (C#/WinForms) Este ejemplo muestra cómo construir una aplicación básica de previsualización de webcam utilizando el SDK VisioForge Video Capture SDK .Net en una aplicación Windows Forms. El código muestra cómo inicializar el motor de captura de vídeo, enumerar los dispositivos de captura de vídeo y audio disponibles y configurarlos para la vista previa. Incluye el manejo de eventos para iniciar, detener, pausar y reanudar el flujo de vídeo, así como ajustar dinámicamente la configuración de vídeo y audio en función de la selección del usuario. 
This example serves as a foundational guide for developers looking to integrate webcam functionality into their .NET applications.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK\_CodeSnippets\webcam-preview\readme.md

# Video Capture SDK .Net - Video preview from a webcam code snippet (C#/WinForms)

This sample demonstrates how to build a basic webcam preview application using the VisioForge Video Capture SDK .Net in a Windows Forms application. The code showcases how to initialize the video capture engine, enumerate available video and audio capture devices, and configure them for preview. It includes event handling for starting, stopping, pausing, and resuming the video stream, as well as dynamically adjusting video and audio settings based on user selection. This example serves as a foundational guide for developers looking to integrate webcam functionality into their .NET applications.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\Android\Simple Video Capture\readme.es.md

# VisioForge Video Capture SDK .Net

## Simple Video Capture Demo (Android)

This SDK sample demonstrates a simple Android application for video capturing using the VisioForge Video Capture SDK .Net. The `MainActivity` class, which serves as the main activity, initializes the video capture environment, handles camera and audio device enumeration, and supports starting and stopping video recording.
It includes user interface elements for recording control, such as start and stop buttons, and a switch camera button to toggle between the available video sources. The application also manages permissions for camera access, audio recording, and file storage, ensuring a seamless user setup before capturing videos. This example showcases the integration of VisioForge's powerful video capture capabilities into an Android application, enabling developers to implement custom video recording features with ease.

## Supported frameworks

* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\Android\Simple Video Capture\readme.md

# VisioForge Video Capture SDK .Net

## Simple Video Capture Demo (Android)

This SDK sample demonstrates a simple Android application for video capturing using the VisioForge Video Capture SDK .Net. The `MainActivity` class, which serves as the main activity, initializes the video capture environment, handles camera and audio device enumeration, and supports starting and stopping video recording. It includes user interface elements for recording control, such as start and stop buttons, and a switch camera button to toggle between available video sources. The application also manages permissions for camera access, audio recording, and file storage, ensuring a seamless user setup before capturing videos. This example showcases the integration of VisioForge's powerful video capture capabilities into an Android application, enabling developers to implement custom video recording features with ease.
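The permission handling described above can be sketched roughly as follows. This is illustrative only: it uses the .NET MAUI Essentials `Permissions` API, while the actual demo may issue the equivalent native Android runtime permission requests, and the helper name is hypothetical.

```csharp
using System.Threading.Tasks;
using Microsoft.Maui.ApplicationModel;

// Hypothetical helper: request the three permissions the demo relies on
// (camera, audio recording, file storage) and report whether recording
// may start. The real sample may use native Android APIs instead.
public static async Task<bool> EnsureCapturePermissionsAsync()
{
    var camera  = await Permissions.RequestAsync<Permissions.Camera>();
    var mic     = await Permissions.RequestAsync<Permissions.Microphone>();
    var storage = await Permissions.RequestAsync<Permissions.StorageWrite>();

    // Only start capturing when every permission has been granted.
    return camera  == PermissionStatus.Granted
        && mic     == PermissionStatus.Granted
        && storage == PermissionStatus.Granted;
}
```

Checking the result before wiring up the start button keeps the capture path from failing with a runtime security exception on first launch.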
## Supported frameworks

* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\Avalonia\Simple Video Capture\readme.es.md

# VisioForge Video Capture SDK .Net

## Simple Video Capture Demo Avalonia (C#/AvaloniaUI)

This SDK sample demonstrates the integration of the VisioForge Video Capture SDK .Net with an Avalonia-based GUI application for capturing video and audio streams. It shows the setup and configuration of video and audio input devices, the selection of input formats and frame rates, and the management of device events. The application also features real-time video effects, audio volume adjustment, recording controls (start, pause, resume, stop), and snapshot functionality. It leverages the VisioForge SDK's capabilities for video capture, processing, and rendering within a cross-platform Avalonia UI framework, providing a comprehensive example for developers looking to implement media capture and processing functionality in their .NET applications.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\Avalonia\Simple Video Capture\readme.md

# VisioForge Video Capture SDK .Net

## Simple Video Capture Demo Avalonia (C#/AvaloniaUI)

This SDK sample demonstrates the integration of the VisioForge Video Capture SDK .Net with an Avalonia-based GUI application for capturing video and audio streams. It showcases the setup and configuration of video and audio input devices, the selection of input formats and frame rates, and the management of device events.
The application also features real-time video effects, audio volume adjustment, recording controls (start, pause, resume, stop), and snapshot functionality. It leverages the VisioForge SDK's capabilities for video capture, processing, and rendering within a cross-platform Avalonia UI framework, providing a comprehensive example for developers looking to implement media capture and processing functionalities in their .NET applications.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\iOS\SimpleVideoCapture\readme.es.md

# Media Blocks SDK .Net - iOS Simple Video Capture Demo

This sample demonstrates the implementation of a simple video capture and processing application using the VisioForge Media Blocks SDK. It shows how to enumerate video sources, capture video and audio from a selected camera and microphone, apply video effects such as grayscale, render the video on-screen, and optionally encode and save the video to a file. The code also includes functionality for switching cameras, stopping the capture, and saving the captured video to the iOS photo library. Advanced features such as audio and video sample grabbers are used to process frames, and custom UI elements are added to control the capture process. The application leverages the MediaBlocks architecture of the VisioForge SDK for modular media processing, demonstrating a practical example of real-time video capture and manipulation on iOS devices.
## Features

- Preview camera video
- Capture video and audio to an MP4 file
- Add sample video effects
- Switch between cameras
- Add sample grabbers for audio and video

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\iOS\SimpleVideoCapture\readme.md

# Media Blocks SDK .Net - iOS Simple Video Capture Demo

This sample demonstrates the implementation of a simple video capture and processing application using the VisioForge Media Blocks SDK. It showcases how to enumerate video sources, capture video and audio from a selected camera and microphone, apply video effects like grayscale, render the video on-screen, and optionally encode and save the video to a file. The code also includes functionality for switching between cameras, stopping the capture, and saving the captured video to the iOS photo library. Advanced features such as audio and video sample grabbers are utilized to process frames, and custom UI elements are added to control the capture process. The application leverages the VisioForge SDK's MediaBlocks architecture for modular media processing, demonstrating a practical example of real-time video capture and manipulation on iOS devices.
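The capture topology this demo describes (camera and microphone sources split by a tee into a live preview path and an encode-to-MP4 path) can be sketched roughly as below. Treat this as a hypothetical outline: the block class names match the blocks the demo uses, but `MediaBlocksPipeline` and the `Connect`, `CreateNewInput`, and `StartAsync` signatures are assumptions here and should be checked against the Media Blocks SDK reference.

```csharp
// Hypothetical sketch of the demo's Media Blocks topology. Block class
// names come from the demo; pipeline and method names are assumed.
var pipeline = new MediaBlocksPipeline();

var videoSource = new SystemVideoSourceBlock(videoSettings);    // camera
var videoTee    = new TeeBlock(2);                              // preview + record
var videoView   = new VideoRendererBlock(pipeline, previewView);
var h264        = new H264EncoderBlock();

var audioSource = new SystemAudioSourceBlock(audioSettings);    // microphone
var audioTee    = new TeeBlock(2);
var audioOut    = new AudioRendererBlock();
var aac         = new AACEncoderBlock();

var mp4 = new MP4SinkBlock(new MP4SinkSettings("capture.mp4"));

// Video: source -> tee -> (renderer, H264 encoder -> MP4 sink).
pipeline.Connect(videoSource.Output, videoTee.Input);
pipeline.Connect(videoTee.Outputs[0], videoView.Input);
pipeline.Connect(videoTee.Outputs[1], h264.Input);
pipeline.Connect(h264.Output, mp4.CreateNewInput(MediaBlockPadMediaType.Video));

// Audio: source -> tee -> (renderer, AAC encoder -> MP4 sink).
pipeline.Connect(audioSource.Output, audioTee.Input);
pipeline.Connect(audioTee.Outputs[0], audioOut.Input);
pipeline.Connect(audioTee.Outputs[1], aac.Input);
pipeline.Connect(aac.Output, mp4.CreateNewInput(MediaBlockPadMediaType.Audio));

await pipeline.StartAsync();
```

In this sketch, stopping the capture would be the matching `await pipeline.StopAsync()`; the tee is what lets preview continue while the encoded branch writes the file.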
## Features

- Preview camera video
- Capture video and audio to an MP4 file
- Add sample video effects
- Switch between cameras
- Add sample grabbers for audio and video

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [SystemVideoSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemVideoSourceBlock/) - captures video from the webcam
- [SystemAudioSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemAudioSourceBlock/) - captures audio from the system
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- [TeeBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Special/TeeBlock/) - splits the media stream into two paths
- [H264EncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoEncoders/H264EncoderBlock/) - encodes the video stream using H264
- [AACEncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioEncoders/AACEncoderBlock/) - encodes the audio stream using AAC
- [MP4SinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/MP4SinkBlock/) - saves video to an MP4 file

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\macOS\ScreenCapture\readme.es.md

# Media Blocks SDK .Net - Screen Capture Demo (macOS)

This SDK sample illustrates how to create a screen capture and recording application using the Media Blocks SDK .Net within a WPF framework. The application demonstrates how to configure a media block pipeline to capture screen content and system audio, displaying them and encoding them into a file.
It highlights the integration of screen and audio source blocks, video and audio renderer blocks, and encoding blocks for H264 video and AAC audio, culminating in saving the final output as an MP4 file. In addition, the sample includes options for selecting audio input and output devices through device enumeration, incorporates error-handling mechanisms, and provides the ability to switch between preview and recording modes.

## Features

- Capture video from the screen to an MP4 file
- Video preview

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\macOS\ScreenCapture\readme.md

# Media Blocks SDK .Net - Screen Capture Demo (macOS)

This SDK sample illustrates how to create a screen capture and recording application using the VisioForge Media Blocks SDK .Net within a WPF framework. The application demonstrates how to configure a media block pipeline to capture screen content and system audio, displaying and encoding them into a file. It features the integration of screen and audio source blocks, video and audio renderer blocks, and encoding blocks for H264 video and AAC audio, culminating in the saving of the final output as an MP4 file. Furthermore, the sample includes options for selecting audio input and output devices through device enumeration, incorporates error-handling mechanisms, and provides the ability to switch between preview and recording modes.
## Features

- Capture video from the screen to an MP4 file
- Video preview

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [ScreenSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/ScreenSourceBlock/) - captures video from the screen
- [SystemAudioSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemAudioSourceBlock/) - captures audio from the system
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- [TeeBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Special/TeeBlock/) - splits the media stream into two paths
- [H264EncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoEncoders/H264EncoderBlock/) - encodes the video stream using H264
- [AACEncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioEncoders/AACEncoderBlock/) - encodes the audio stream using AAC
- [MP4SinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/MP4SinkBlock/) - saves video to an MP4 file

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\macOS\SimpleVideoCapture\readme.es.md

# Media Blocks SDK .Net - macOS Simple Video Capture Demo

The provided code is a sample for creating a simple video capture application using the VisioForge Media Blocks SDK on macOS. It demonstrates how to set up a media pipeline to capture video and audio from system devices, render them in real time, and manage device permissions and selections.
Key features include requesting camera access, enumerating video and audio sources, selecting formats and frame rates, and integrating with the macOS UI to display video. The code leverages asynchronous programming to handle device operations and updates the UI based on the current capture state. This sample serves as a foundation for developing more complex media applications on macOS using VisioForge's MediaBlocks SDK.

## Features

- Preview camera video
- Capture video and audio to an MP4 file

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\macOS\SimpleVideoCapture\readme.md

# Media Blocks SDK .Net - macOS Simple Video Capture Demo

The provided code is a sample for creating a simple video capture application using the VisioForge Media Blocks SDK on macOS. It demonstrates how to set up a media pipeline for capturing video and audio from system devices, render them in real time, and manage device permissions and selections. Key features include requesting camera access, enumerating video and audio sources, selecting formats and frame rates, and integrating with the macOS UI for displaying video. The code leverages asynchronous programming to handle device operations and updates the UI based on the current capture state. This sample serves as a foundation for developing more complex media applications on macOS using VisioForge's MediaBlocks SDK.
## Features

- Preview camera video
- Capture video and audio to an MP4 file

## Used blocks

- [VideoRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoRendering/) - renders video
- [SystemVideoSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemVideoSourceBlock/) - captures video from the webcam
- [SystemAudioSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemAudioSourceBlock/) - captures audio from the system
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- [TeeBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Special/TeeBlock/) - splits the media stream into two paths
- [H264EncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/VideoEncoders/H264EncoderBlock/) - encodes the video stream using H264
- [AACEncoderBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioEncoders/AACEncoderBlock/) - encodes the audio stream using AAC
- [MP4SinkBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sinks/MP4SinkBlock/) - saves video to an MP4 file

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\MAUI\QRReader\readme.es.md

# VisioForge Video Capture SDK .Net

## QR Reader Demo (MAUI)

This SDK sample shows how to implement a cross-platform QR code reader using the VisioForge Video Capture SDK .Net. It showcases the process of initializing the video capture environment, requesting camera permissions, and handling barcode detection events. The sample includes methods for managing camera devices, capturing video input, and responding dynamically to barcode detections by updating the user interface.
It also covers the lifecycle management of the video capture session, including proper resource disposal to ensure clean shutdowns. This example is designed to work across multiple platforms, including Android and macOS, by leveraging conditional compilation.

## Supported frameworks

* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\MAUI\QRReader\readme.md

# VisioForge Video Capture SDK .Net

## QR Reader Demo (MAUI)

This SDK sample demonstrates how to implement a cross-platform QR code reader using the VisioForge Video Capture SDK .Net. It showcases the process of initializing the video capture environment, requesting camera permissions, and handling barcode detection events. The sample includes methods for managing camera devices, capturing video input, and dynamically responding to barcode detections by updating the UI. Additionally, it covers the lifecycle management of the video capture session, including proper resource disposal to ensure clean shutdowns. This example is designed to work across multiple platforms, including Android and macOS, by leveraging conditional compilation.

## Supported frameworks

* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\MAUI\SimpleCapture\readme.es.md

# VisioForge Video Capture SDK .Net

## Simple Video Capture Demo (MAUI)

The provided code sample demonstrates the use of the VisioForge Video Capture SDK in a .NET MAUI application to capture video and audio from connected devices. It shows how to enumerate video and audio devices, request permissions to access the camera and microphone, and handle video capture and preview functionality.
The code includes event handling for starting and stopping the video preview and capture, adjusting the audio output volume, and dynamically switching between the available cameras and microphones. It also demonstrates clean-up procedures to properly dispose of resources and the SDK when the application closes.

## Supported frameworks

* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\MAUI\SimpleCapture\readme.md

# VisioForge Video Capture SDK .Net

## Simple Video Capture Demo (MAUI)

The provided code sample demonstrates the use of the VisioForge Video Capture SDK in a .NET MAUI application for capturing video and audio from connected devices. It showcases how to enumerate video and audio devices, request permissions for camera and microphone access, and handle video capture and preview functionality. The code includes event handling for starting and stopping the video preview and capture, adjusting audio output volume, and dynamically switching between available cameras and microphones. Additionally, it demonstrates clean-up procedures to properly dispose of resources and the SDK upon closing the application.

## Supported frameworks

* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\WinForms\RTSP MultiView Demo\readme.es.md

# Video Capture SDK .Net - RTSP MultiView Demo (WinForms)

This SDK sample demonstrates a comprehensive approach to integrating RTSP and MJPEG video streaming into a multi-view application using the VisioForge Video Capture SDK .Net. It shows the initialization, playback, and dynamic management of multiple video streams in a Windows Forms application.
The example includes handling video and audio sources, using hardware and software decoders for optimal performance, and capturing video streams to files. It also features a user interface for selecting video sources, configuring decoder settings, and logging media information. This sample is ideal for developers looking to implement advanced video streaming capabilities in their .NET applications using VisioForge's technology.

## Features

- Play multiple RTSP streams
- Capture original streams to disk
- Capture re-encoded streams to disk
- RAW video and audio frame access

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Video Capture SDK .Net product page](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\WinForms\RTSP MultiView Demo\readme.md

# Video Capture SDK .Net - RTSP MultiView Demo (WinForms)

This SDK sample demonstrates a comprehensive approach to integrating RTSP and MJPEG video streaming within a multi-view application using the VisioForge Video Capture SDK .Net. It showcases initialization, playback, and dynamic management of multiple video streams in a Windows Forms application. The example includes handling video and audio sources, utilizing hardware and software decoders for optimal performance, and capturing video streams to files. Additionally, it features a user interface for selecting video sources, configuring decoder settings, and logging media information. This sample is ideal for developers looking to implement advanced video streaming capabilities in their .NET applications using VisioForge's technology.
## Features

- Play multiple RTSP streams
- Capture original streams to disk
- Capture re-encoded streams to disk
- RAW video and audio frame access

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Video Capture SDK .Net product page](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\Allied Vision Camera Demo\readme.es.md

# Video Capture SDK .Net - Allied Vision Camera Demo (WPF)

Allied Vision Camera Demo is an application that uses the Video Capture SDK .Net to preview or capture video from Allied Vision GigE/USB3/GenICam cameras.

## Features

- Play video from an Allied Vision camera source

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\Allied Vision Camera Demo\readme.md

# Video Capture SDK .Net - Allied Vision Camera Demo (WPF)

Allied Vision Camera Demo is an application that uses the Video Capture SDK .Net to preview or capture video from Allied Vision GigE/USB3/GenICam cameras.

## Features

- Play video from an Allied Vision camera source

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\Audio Capture Demo X WPF\readme.es.md

# VisioForge Video Capture SDK .Net

## Audio Capture Demo (C#/WPF, cross-platform)

The sample, powered by the VisioForge Video Capture SDK .Net, showcases a comprehensive example of implementing audio recording functionality within a Windows Presentation Foundation application.
This sample demonstrates how to enumerate audio devices, select input and output devices, configure various audio formats, and manage the recording process with real-time feedback on recording duration. It uses the SDK to handle audio capture, device enumeration, and audio format configuration, offering a versatile platform for developers to integrate sophisticated audio recording features into their applications. Through event handling and UI interactions, it provides a user-friendly interface for recording audio in formats such as MP3, WAV, and FLAC, emphasizing ease of use and flexibility in audio processing tasks.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\Audio Capture Demo X WPF\readme.md

# VisioForge Video Capture SDK .Net

## Audio Capture Demo (C#/WPF, cross-platform)

The sample, powered by the VisioForge Video Capture SDK .Net, showcases a comprehensive example of implementing audio recording functionality within a Windows Presentation Foundation application. This sample demonstrates how to enumerate audio devices, select input and output devices, configure various audio formats, and manage the recording process with real-time feedback on recording duration. It employs the SDK to handle audio capture, device enumeration, and audio format configuration, offering a versatile platform for developers to integrate sophisticated audio recording capabilities into their applications.
Through event handling and UI interactions, it provides a user-friendly interface for recording audio in formats like MP3, WAV, FLAC, and more, emphasizing ease of use and flexibility in audio processing tasks.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\Audio Mixer\readme.es.md

# VisioForge Media Blocks SDK .Net

## Audio Capture Demo (C#/WPF, cross-platform)

The sample, powered by the VisioForge Media Blocks SDK .Net, showcases a comprehensive example of implementing audio recording functionality within a WPF application.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-blocks-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\Audio Mixer\readme.md

# VisioForge Media Blocks SDK .Net

## Audio Capture Demo (C#/WPF, cross-platform)

The sample, powered by the VisioForge Media Blocks SDK .Net, showcases a comprehensive example of implementing audio recording functionality within a WPF application.
## Used blocks

- [SystemAudioSourceBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Sources/SystemAudioSourceBlock/) - captures audio from the system audio input device
- [AudioRendererBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/AudioRendering/) - renders audio
- [TeeBlock](https://www.visioforge.com/help/docs/dotnet/mediablocks/Special/TeeBlock/) - duplicates the audio stream for recording and previewing
- `MP3OutputBlock` - encodes and saves audio to an MP3 file

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/media-blocks-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\Basler Camera Demo\readme.es.md

# Video Capture SDK .Net - Basler Camera Demo (WPF)

Basler Camera Demo is an application that uses the Video Capture SDK .Net to preview or capture video from Basler GigE/USB3/GenICam cameras.

## Features

- Play video from a Basler camera source

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\Basler Camera Demo\readme.md

# Video Capture SDK .Net - Basler Camera Demo (WPF)

Basler Camera Demo is an application that uses the Video Capture SDK .Net to preview or capture video from Basler GigE/USB3/GenICam cameras.
## Features

- Play video from a Basler camera source

## Supported frameworks

- .Net 4.7.2
- .Net Core 3.1
- .Net 5
- .Net 6
- .Net 7
- .Net 8
- .Net 9

---

[Media Blocks SDK .Net product page](https://www.visioforge.com/media-blocks-sdk)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\Decklink Demo X\readme.es.md

# VisioForge Video Capture SDK .Net (Cross-platform engine)

## Decklink Demo X (C#/WPF)

The provided code is an SDK sample for a video capture and recording application using the VisioForge Video Capture SDK .Net. It demonstrates how to set up and configure video and audio sources, including devices and formats, using the Decklink integration for professional video capture cards. The sample includes functionality for configuring output formats (MP4, WebM), handling errors, and logging events within a WPF application. It shows how to start and stop video capture, adjust the volume, and save recordings with custom settings through a user-friendly interface. This example is valuable for developers looking to integrate advanced video capture features into their .NET applications.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\Decklink Demo X\readme.md

# VisioForge Video Capture SDK .Net (Cross-platform engine)

## Decklink Demo X (C#/WPF)

The provided code is an SDK sample for a video capture and recording application using the VisioForge Video Capture SDK .Net. It demonstrates how to set up and configure video and audio sources, including devices and formats, using the Decklink integration for professional video capture cards.
The sample includes functionality for configuring output formats (MP4, WebM), handling errors, and logging events within a WPF application. It showcases how to start and stop video capture, adjust volume, and save recordings with custom settings through a user-friendly interface. This example is valuable for developers looking to integrate advanced video capture features into their .NET applications. ## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\GenICam Capture\readme.es.md # VisioForge Video Capture SDK .Net ## GenICam Capture Demo (C#/WPF) La demo muestra cómo previsualizar o capturar vídeo de cámaras que soportan el protocolo GenICam y están conectadas mediante USB 3 o GigE. ## Versiones de .Net compatibles * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\GenICam Capture\readme.md # VisioForge Video Capture SDK .Net ## GenICam Capture Demo (C#/WPF) The demo shows how to preview or capture video from cameras that support the GenICam protocol and are connected via USB 3 or GigE. ## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\IP Capture\readme.es.md # VisioForge Video Capture SDK .Net ## IP Capture Demo (C#/WPF, cross-platform engine) El código proporcionado es un ejemplo completo de una aplicación Windows desarrollada utilizando el VisioForge Video Capture SDK .Net, que está diseñada para capturar secuencias de cámaras IP y grabarlas en varios formatos.
La aplicación cuenta con una interfaz de usuario para configurar los ajustes de captura, incluida la selección del formato de salida (MP4, AVI, WebM, MPEG-TS, MOV), el control ONVIF de la cámara para movimiento horizontal, vertical y zoom, y opciones para tomar instantáneas. También incluye cuadros de diálogo para configurar los ajustes de codificación de los distintos formatos de salida. La aplicación demuestra prácticas de programación asíncrona, manejo de eventos para capturar errores y desconexiones de la fuente de red, y el uso del SDK VisioForge para tareas de captura de vídeo. ## Versiones de .Net compatibles * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\IP Capture\readme.md # VisioForge Video Capture SDK .Net ## IP Capture Demo (C#/WPF, cross-platform engine) The provided code is a comprehensive example of a Windows application developed using the VisioForge Video Capture SDK .Net, which is designed for capturing IP camera streams and recording them in various formats. The application features a user interface for configuring capture settings, including output format selection (MP4, AVI, WebM, MPEG-TS, MOV), ONVIF camera control for pan, tilt, and zoom, and options for taking snapshots. It also includes dialogs for configuring encoding settings for different output formats. The application demonstrates asynchronous programming practices, event handling for capturing errors and network source disconnections, and the use of the VisioForge SDK for video capture tasks. 
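The IP-camera flow described above can be sketched in a few lines. This is an illustrative outline only: the type and member names used here (`VideoCaptureCoreX`, `RTSPSourceSettings`, `MP4Output`, `Outputs_Add`) follow the naming style of VisioForge's cross-platform samples but are assumptions; consult the demo's own source and the SDK API reference for the exact signatures in your version.

```csharp
// Illustrative sketch, not the demo's actual code: connect to an RTSP
// IP camera, preview it, and record to MP4. All VisioForge type and
// member names below are assumed from typical sample code.
var capture = new VideoCaptureCoreX(VideoView1); // VideoView1: WPF preview control

// Describe the IP camera source (placeholder URL and credentials).
var rtsp = await RTSPSourceSettings.CreateAsync(
    new Uri("rtsp://192.168.0.100:554/stream1"),
    "admin",
    "password",
    audioEnabled: true);
capture.Video_Source = rtsp;

// Add an MP4 file output alongside the on-screen preview.
capture.Outputs_Add(new MP4Output("ipcam.mp4"), true);

// Surface capture errors and network-source disconnects, as the demo does.
capture.OnError += (sender, args) => Debug.WriteLine(args.Message);

await capture.StartAsync();   // begin preview + recording
// ... later, on the Stop button:
await capture.StopAsync();    // finalize the MP4 file
await capture.DisposeAsync();
```

The demo adds ONVIF pan/tilt/zoom control and per-format encoder dialogs on top of this core start/stop loop.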
## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\NDI Source Demo\readme.es.md # Video Capture SDK .Net - NDI Source Demo (WPF, cross-platform engine) Este ejemplo de SDK muestra cómo integrar la transmisión de fuentes NDI (Network Device Interface) en una aplicación Windows utilizando VisioForge Video Capture SDK .Net. La aplicación, construida con WPF (.NET), muestra la inicialización del SDK, el listado de fuentes NDI disponibles y el manejo de la captura de vídeo con funcionalidad de inicio y parada. Las características clave incluyen la gestión de errores, la visualización del tiempo de grabación y la limpieza de recursos de forma asíncrona. El fragmento de código cubre la configuración del motor de captura de vídeo, la selección y utilización de fuentes NDI, y la actualización dinámica de las marcas de tiempo de grabación, ilustrando la facilidad de incorporar capacidades de captura de vídeo profesional en aplicaciones .NET. ## Características - Captura de vídeo desde la fuente NDI a un archivo MP4 - Previsualización de vídeo ## Versiones de .Net compatibles - .Net 4.7.2 - .Net Core 3.1 - .Net 5 - .Net 6 - .Net 7 - .Net 8 - .Net 9 --- [Video Capture SDK .Net product page](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\NDI Source Demo\readme.md # Video Capture SDK .Net - NDI Source Demo (WPF, cross-platform engine) This SDK sample demonstrates how to integrate NDI (Network Device Interface) source streaming into a Windows application using VisioForge Video Capture SDK .Net. The application, built with WPF (.NET), showcases initializing the SDK, listing available NDI sources, and handling video capture with start and stop functionality.
Key features include error handling, displaying the recording time, and cleaning up resources asynchronously. The code snippet covers the setup of the video capture engine, selection and utilization of NDI sources, and the dynamic update of recording timestamps, illustrating the ease of incorporating professional video capture capabilities into .NET applications. ## Features - Capture video from the NDI source to an MP4 file - Video preview ## Supported frameworks - .Net 4.7.2 - .Net Core 3.1 - .Net 5 - .Net 6 - .Net 7 - .Net 8 - .Net 9 --- [Video Capture SDK .Net product page](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\NDI Streamer Demo\readme.es.md # Video Capture SDK .Net - NDI Streamer Demo (WPF, cross-platform engine) Este ejemplo demuestra la integración y el uso de VisioForge Video Capture SDK .Net dentro de una aplicación WPF para la transmisión de contenidos de vídeo y audio utilizando el protocolo NDI. Muestra cómo enumerar y seleccionar dispositivos de entrada de vídeo y audio, configurar los ajustes de origen de vídeo y gestionar los ajustes de captura de audio. Además, el ejemplo incluye funciones para iniciar y detener la captura de vídeo, actualizar dinámicamente los elementos de la interfaz de usuario con información del dispositivo y gestionar errores. El uso de patrones de programación asíncronos para la supervisión de dispositivos y las operaciones de captura garantiza la capacidad de respuesta y la eficiencia en el manejo de medios.
## Características - Transmisión de vídeo NDI desde la fuente de captura de vídeo ## Versiones de .Net compatibles - .Net 4.7.2 - .Net Core 3.1 - .Net 5 - .Net 6 - .Net 7 - .Net 8 - .Net 9 --- [Video Capture SDK .Net product page](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\NDI Streamer Demo\readme.md # Video Capture SDK .Net - NDI Streamer Demo (WPF, cross-platform engine) This sample demonstrates the integration and use of the VisioForge Video Capture SDK .Net within a WPF application for streaming video and audio content using the NDI protocol. It showcases how to enumerate and select video and audio input devices, configure video source settings, and manage audio capture settings. Additionally, the sample includes functionality for starting and stopping video capture, dynamically updating UI elements with device information, and handling errors. The use of asynchronous programming patterns for device monitoring and capture operations ensures responsiveness and efficiency in media handling. ## Features - NDI video streaming from video capture source ## Supported frameworks - .Net 4.7.2 - .Net Core 3.1 - .Net 5 - .Net 6 - .Net 7 - .Net 8 - .Net 9 --- [Video Capture SDK .Net product page](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\Networks Streamer\readme.es.md # Video Capture SDK .Net - Networks Streamer Demo (WPF, cross-platform engine) Este ejemplo muestra la integración de VisioForge Video Capture SDK .Net dentro de una aplicación WPF diseñada para streaming a redes sociales. Demuestra cómo enumerar y seleccionar dispositivos de entrada y salida de vídeo/audio, configurar las fuentes de vídeo y audio, y transmitir a plataformas como YouTube, Facebook Live y servidores RTMP genéricos. 
El código incluye la gestión de eventos para la enumeración de dispositivos y errores de transmisión, así como actualizaciones de la interfaz de usuario para la selección de dispositivos y el estado de la transmisión. Este ejemplo es ideal para desarrolladores que buscan construir aplicaciones con streaming en directo a varias plataformas de medios sociales utilizando la tecnología de VisioForge. ## Características - Transmisión de vídeo a YouTube - Transmisión de vídeo a Facebook Live - Transmisión de vídeo a un servidor RTMP - Previsualización de vídeo ## Versiones de .Net compatibles - .Net 4.7.2 - .Net Core 3.1 - .Net 5 - .Net 6 - .Net 7 - .Net 8 - .Net 9 --- [Video Capture SDK .Net product page](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\Networks Streamer\readme.md # Video Capture SDK .Net - Networks Streamer Demo (WPF, cross-platform engine) This sample showcases the integration of VisioForge Video Capture SDK .Net within a WPF application designed for streaming to social networks. It demonstrates how to enumerate and select video/audio input and output devices, configure video and audio source settings, and stream to platforms like YouTube, Facebook Live, and generic RTMP servers. The code includes event handling for device enumeration and streaming errors, as well as UI updates for device selection and streaming status. This example is ideal for developers looking to build applications with live streaming to various social media platforms using VisioForge's technology. 
## Features - Video streaming to YouTube - Video streaming to Facebook Live - Video streaming to an RTMP server - Video preview ## Supported frameworks - .Net 4.7.2 - .Net Core 3.1 - .Net 5 - .Net 6 - .Net 7 - .Net 8 - .Net 9 --- [Video Capture SDK .Net product page](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\Overlay Manager Demo\readme.es.md # Video Capture SDK .Net - Overlay Manager Demo (WPF, cross-platform engine) El ejemplo proporcionado para VisioForge Video Capture SDK .Net muestra cómo implementar un gestor de captura y superposición de vídeo en una aplicación WPF. Abarca la inicialización del SDK, la enumeración de dispositivos de vídeo, la configuración de los ajustes de origen de vídeo y la gestión de superposiciones de vídeo como imágenes, texto, líneas, rectángulos y círculos. El ejemplo también permite iniciar y detener la captura de vídeo, ajustar dinámicamente la velocidad de fotogramas y gestionar las superposiciones en tiempo real. Muestra las capacidades del SDK para gestionar tareas complejas de procesamiento de vídeo, lo que lo convierte en un valioso recurso para los desarrolladores que crean aplicaciones multimedia. ## Características - Reproducir archivos multimedia - Añadir superposiciones al vídeo ## Versiones de .Net compatibles - .Net 4.7.2 - .Net Core 3.1 - .Net 5 - .Net 6 - .Net 7 - .Net 8 - .Net 9 --- [Video Capture SDK .Net product page](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\Overlay Manager Demo\readme.md # Video Capture SDK .Net - Overlay Manager Demo (WPF, cross-platform engine) The provided sample for the VisioForge Video Capture SDK .Net demonstrates how to implement a video capture and overlay manager in a WPF application.
It covers initializing the SDK, enumerating video devices, configuring video source settings, and managing video overlays like images, text, lines, rectangles, and circles. The sample also handles starting and stopping video capture, dynamically adjusting frame rates, and managing overlays in real time. It showcases the SDK's capabilities in handling complex video processing tasks, making it a valuable resource for developers building multimedia applications. ## Features - Play media files - Add overlays to the video ## Supported frameworks - .Net 4.7.2 - .Net Core 3.1 - .Net 5 - .Net 6 - .Net 7 - .Net 8 - .Net 9 --- [Video Capture SDK .Net product page](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\Screen_Capture\readme.es.md # VisioForge Video Capture SDK .Net ## Captura de Pantalla Demo X (C#/WPF, motor multiplataforma) El código proporcionado muestra un ejemplo de SDK .Net de captura de vídeo con muchas funciones utilizado para capturar actividades en pantalla, incluido el soporte para múltiples formatos de salida como MP4, AVI, WebM, MOV y MPEG-TS. Los usuarios pueden configurar las fuentes de vídeo y audio, seleccionar los formatos de salida y gestionar las dimensiones de la captura de pantalla y la visibilidad del cursor. Además, incluye funciones para la selección del dispositivo de captura de audio y la configuración del formato, mostrando una arquitectura integrada basada en eventos para la gestión de medios en tiempo real.
## Versiones de .Net compatibles * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\Screen_Capture\readme.md # VisioForge Video Capture SDK .Net ## Screen Capture Demo X (C#/WPF, cross-platform engine) The provided code showcases a feature-rich Video Capture SDK .Net sample used to capture screen activities, including support for multiple output formats such as MP4, AVI, WebM, MOV, and MPEG-TS. Users can configure video and audio source settings, select output formats, and manage screen capture dimensions and cursor visibility. Additionally, it includes functionality for audio capture device selection and format configuration, showcasing an integrated, event-driven architecture for real-time media handling. ## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\Simple Video Capture\readme.es.md # VisioForge Video Capture SDK .Net ## Demo simple de captura de vídeo (C#/WPF, motor multiplataforma) El código proporcionado es una sencilla aplicación de captura de vídeo realizada utilizando el VisioForge Video Capture SDK .Net, que está diseñado para trabajar con .NET Framework y WPF para la creación de aplicaciones multimedia. La aplicación inicializa el motor de captura de vídeo al arrancar, enumera los dispositivos de vídeo y audio y permite a los usuarios seleccionar los dispositivos de entrada y configurar los ajustes de salida para varios formatos como MP4, AVI, WebM, MPEG-TS y MOV. Incluye efectos de vídeo en tiempo real, como escala de grises, sepia, voltear, rotar y superponer texto o imágenes. 
La interfaz de usuario ofrece opciones para iniciar, pausar, reanudar y detener grabaciones, ajustar el volumen del audio y guardar instantáneas. Además, admite la configuración de formatos de fuente de vídeo, velocidades de fotogramas y formatos de entrada de audio, lo que demuestra la flexibilidad y las capacidades del SDK de VisioForge para proyectos de procesamiento y captura multimedia. ## Versiones de .Net compatibles * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\Simple Video Capture\readme.md # VisioForge Video Capture SDK .Net ## Simple Video Capture Demo (C#/WPF, cross-platform engine) The provided code is a simple video capture application made using the VisioForge Video Capture SDK .Net, which is designed to work with .NET Framework and WPF for creating multimedia applications. The application initializes the video capture engine on startup, enumerates video and audio devices, and allows users to select input devices and configure output settings for various formats like MP4, AVI, WebM, MPEG-TS, and MOV. It features real-time video effects, including grayscale, sepia, flip, rotate, and text or image overlays. The user interface provides options for starting, pausing, resuming, and stopping recordings, adjusting audio volume, and saving snapshots. Additionally, it supports configuring video source formats, frame rates, and audio input formats, showcasing the flexibility and capabilities of the VisioForge SDK for multimedia processing and capture projects. 
## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\USB3V-GigE Spinnaker\readme.es.md # Video Capture SDK .Net - USB3V-GigE Spinnaker (FLIR/Teledyne) Demo (WPF) Spinnaker Source Demo es una aplicación que utiliza Video Capture SDK .Net para previsualizar o capturar vídeo desde cámaras que soporten Spinnaker SDK y estén conectadas mediante USB 3 o GigE. ## Características - Reproducir vídeo desde una fuente compatible con Spinnaker SDK ## Versiones de .Net compatibles - .Net 4.7.2 - .Net Core 3.1 - .Net 5 - .Net 6 - .Net 7 - .Net 8 - .Net 9 --- [Video Capture SDK .Net product page](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\USB3V-GigE Spinnaker\readme.md # Video Capture SDK .Net - USB3V-GigE Spinnaker (FLIR/Teledyne) Demo (WPF) Spinnaker Source Demo is an application that uses the Video Capture SDK .Net to preview or capture video from cameras that support the Spinnaker SDK and are connected via USB 3 or GigE. ## Features - Play video from a Spinnaker SDK-supported source ## Supported frameworks - .Net 4.7.2 - .Net Core 3.1 - .Net 5 - .Net 6 - .Net 7 - .Net 8 - .Net 9 --- [Video Capture SDK .Net product page](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\Video Compositor Demo\readme.es.md # Video Capture SDK .Net - Video Compositor Demo (WPF, cross-platform engine) Este ejemplo muestra cómo utilizar VisioForge Video Capture SDK .Net para crear una aplicación de composición de vídeo en un entorno WPF.
Muestra cómo inicializar el SDK, configurar la captura de vídeo con varias fuentes (cámaras, pantallas), configurar los ajustes del mezclador de vídeo y gestionar las salidas de audio/vídeo a archivos o transmisiones en directo (YouTube, Facebook Live). La aplicación incluye controles de interfaz de usuario para iniciar/detener la grabación, ajustar las propiedades de la fuente (por ejemplo, transparencia, posición) y registrar errores. Utiliza programación asíncrona para una gestión eficaz de los recursos, lo que pone de relieve la capacidad del SDK para tareas complejas de procesamiento de vídeo en aplicaciones en tiempo real. ## Características - Mezcla de varias fuentes de vídeo - Previsualización de vídeo - Streaming en red ## Versiones de .Net compatibles - .Net 4.7.2 - .Net Core 3.1 - .Net 5 - .Net 6 - .Net 7 - .Net 8 - .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\Video Compositor Demo\readme.md # Video Capture SDK .Net - Video Compositor Demo (WPF, cross-platform engine) This sample demonstrates how to use the VisioForge Video Capture SDK .Net to create a video compositor application in a WPF environment. It showcases initializing the SDK, setting up video capture with various sources (cameras, screens), configuring video mixer settings, and managing audio/video outputs to files or live streams (YouTube, Facebook Live). The application includes UI controls for starting/stopping recording, adjusting source properties (e.g., transparency, position), and logging errors. It utilizes asynchronous programming for efficient resource management, highlighting the SDK's capability for complex video processing tasks in real-time applications. 
## Features - Video mixing of several video sources - Video preview - Network streaming ## Supported frameworks - .Net 4.7.2 - .Net Core 3.1 - .Net 5 - .Net 6 - .Net 7 - .Net 8 - .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\VNC Source Demo\readme.es.md # Video Capture SDK .Net - VNC Source Demo (WPF, cross-platform engine) Este ejemplo de SDK demuestra la integración del SDK de captura de vídeo VisioForge .Net en una aplicación WPF para capturar vídeo desde una fuente VNC. Inicializa el SDK de VisioForge, configura la captura de vídeo con gestión de errores y permite iniciar y detener el proceso de captura de vídeo. El código incluye una interfaz de usuario con botones para controlar el proceso de captura y campos para configurar los ajustes de la fuente VNC, como el host, el puerto, la URL y la contraseña. También demuestra el uso de programación asíncrona para gestionar el ciclo de vida de la captura de vídeo y la eliminación de recursos. ## Características - Reproducir vídeo desde una fuente VNC/RFB ## Versiones de .Net compatibles - .Net 4.7.2 - .Net Core 3.1 - .Net 5 - .Net 6 - .Net 7 - .Net 8 - .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\WPF\CSharp\VNC Source Demo\readme.md # Video Capture SDK .Net - VNC Source Demo (WPF, cross-platform engine) This SDK sample demonstrates the integration of the VisioForge Video Capture SDK .Net into a WPF application for capturing video from a VNC source. It initializes the VisioForge SDK, sets up video capture with error handling, and allows for starting and stopping the video capture process. The code includes a UI with buttons to control the capture process and fields for configuring the VNC source settings, such as host, port, URL, and password. 
It also demonstrates the use of asynchronous programming to manage the video capture lifecycle and the disposal of resources. ## Features - Play video from VNC/RFB source ## Supported frameworks - .Net 4.7.2 - .Net Core 3.1 - .Net 5 - .Net 6 - .Net 7 - .Net 8 - .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\ip-camera-capture-mp4\readme.es.md # Video Capture SDK .Net - IP camera capture to MP4 code snippet (C#/WinForms) Este fragmento de código demuestra cómo capturar vídeo de una cámara IP y guardarlo en un archivo MP4 utilizando Video Capture SDK .Net. ## Versiones de .Net compatibles * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\ip-camera-capture-mp4\readme.md # Video Capture SDK .Net - IP camera capture to MP4 code snippet (C#/WinForms) This code snippet demonstrates how to capture video from an IP camera and save it to an MP4 file using Video Capture SDK .Net. ## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\ip-camera-preview\readme.es.md # Video Capture SDK .Net - IP camera preview code snippet (C#/WinForms) Este fragmento de código muestra cómo previsualizar vídeo de una cámara IP utilizando Video Capture SDK .Net. 
## Versiones de .Net compatibles * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\ip-camera-preview\readme.md # Video Capture SDK .Net - IP camera preview code snippet (C#/WinForms) This code snippet demonstrates how to preview video from an IP camera using Video Capture SDK .Net. ## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\screen-capture-avi\readme.es.md # Video Capture SDK .Net - Screen capture to AVI code snippet (C#/WinForms) Este fragmento de código demuestra cómo capturar la pantalla y guardarla en un archivo AVI utilizando Video Capture SDK .Net. ## Versiones de .Net compatibles * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\screen-capture-avi\readme.md # Video Capture SDK .Net - Screen capture to AVI code snippet (C#/WinForms) This code snippet demonstrates how to capture the screen and save it to an AVI file using Video Capture SDK .Net. ## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\screen-capture-mp4\readme.es.md # Video Capture SDK .Net - Screen capture to MP4 code snippet (C#/WinForms) Este fragmento de código demuestra cómo capturar la pantalla y guardarla en un archivo MP4 utilizando Video Capture SDK .Net. 
## Versiones de .Net compatibles * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\screen-capture-mp4\readme.md # Video Capture SDK .Net - Screen capture to MP4 code snippet (C#/WinForms) This code snippet demonstrates how to capture the screen and save it to an MP4 file using Video Capture SDK .Net. ## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\screen-capture-webm\readme.es.md # Video Capture SDK .Net - Screen capture to WebM code snippet (C#/WinForms) Este fragmento de código demuestra cómo capturar la pantalla y guardarla en un archivo WebM utilizando Video Capture SDK .Net. ## Versiones de .Net compatibles * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\screen-capture-webm\readme.md # Video Capture SDK .Net - Screen capture to WebM code snippet (C#/WinForms) This code snippet demonstrates how to capture the screen and save it to a WebM file using Video Capture SDK .Net. ## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\screen-capture-wmv\readme.es.md # Video Capture SDK .Net - Screen capture to WMV code snippet (C#/WinForms) Este fragmento de código demuestra cómo capturar la pantalla y guardarla en un archivo WMV utilizando Video Capture SDK .Net.
## Versiones de .Net compatibles * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\screen-capture-wmv\readme.md # Video Capture SDK .Net - Screen capture to WMV code snippet (C#/WinForms) This code snippet demonstrates how to capture the screen and save it to a WMV file using Video Capture SDK .Net. ## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\speaker-capture\readme.es.md # Video Capture SDK .Net - Speaker capture code snippet (C#/WinForms) Este fragmento de código demuestra cómo capturar audio de un altavoz utilizando Video Capture SDK .Net. ## Versiones de .Net compatibles * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\speaker-capture\readme.md # Video Capture SDK .Net - Speaker capture code snippet (C#/WinForms) This code snippet demonstrates how to capture audio from a speaker using Video Capture SDK .Net. ## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\video-capture-image-overlay\readme.es.md # Video Capture SDK .Net - Captura de vídeo con superposición de imagen fragmento de código (C#/WinForms) Este fragmento de código muestra cómo capturar vídeo de una cámara web y superponer una imagen sobre el vídeo. 
## Versiones de .Net compatibles * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\video-capture-image-overlay\readme.md # Video Capture SDK .Net - Video capture with image overlay code snippet (C#/WinForms) This code snippet demonstrates how to capture video from a webcam and overlay an image on top of the video. ## Supported frameworks * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\video-capture-text-overlay\readme.es.md # Video Capture SDK .Net - Video capture with text overlay code snippet (C#/WinForms) Este fragmento de código muestra cómo capturar vídeo desde un dispositivo de captura de vídeo y superponer texto en el vídeo utilizando Video Capture SDK .Net. ## Versiones de .Net compatibles * .Net 4.7.2 * .Net Core 3.1 * .Net 5 * .Net 6 * .Net 7 * .Net 8 * .Net 9 --- [Visit the product page.](https://www.visioforge.com/video-capture-sdk-net) ---END OF PAGE--- # Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\video-capture-text-overlay\readme.md # Video Capture SDK .Net - Video capture with text overlay code snippet (C#/WinForms) This code snippet demonstrates how to capture video from a video capture device and overlay text on the video using Video Capture SDK .Net. 
## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\video-capture-webcam-avi\readme.es.md

# Video Capture SDK .Net - Video capture to AVI code snippet (C#/WinForms)

This code snippet demonstrates how to capture video from a video capture device and save it to an AVI file using Video Capture SDK .Net.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\video-capture-webcam-avi\readme.md

# Video Capture SDK .Net - Video capture to AVI code snippet (C#/WinForms)

This code snippet demonstrates how to capture video from a video capture device and save it to an AVI file using Video Capture SDK .Net.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\video-capture-webcam-mp4\readme.es.md

# Video Capture SDK .Net - Video capture to MP4 code snippet (C#/WinForms)

This code snippet demonstrates how to capture video from a webcam and save it to an MP4 file using Video Capture SDK .Net.
## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\video-capture-webcam-mp4\readme.md

# Video Capture SDK .Net - Video capture to MP4 code snippet (C#/WinForms)

This code snippet demonstrates how to capture video from a webcam and save it to an MP4 file using Video Capture SDK .Net.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\video-capture-webcam-wmv\readme.es.md

# Video Capture SDK .Net - Video capture to WMV code snippet (C#/WinForms)

This code snippet demonstrates how to capture video from a video capture device and save it to a WMV file using Video Capture SDK .Net.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\video-capture-webcam-wmv\readme.md

# Video Capture SDK .Net - Video capture to WMV code snippet (C#/WinForms)

This code snippet demonstrates how to capture video from a video capture device and save it to a WMV file using Video Capture SDK .Net.
## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\video-preview-webcam-frame-capture\readme.es.md

# Video Capture SDK .Net - Video preview from a webcam with a frame capture code snippet (C#/WinForms)

This code snippet demonstrates how to preview video from a webcam and capture frames from the video using Video Capture SDK .Net.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\video-preview-webcam-frame-capture\readme.md

# Video Capture SDK .Net - Video preview from a webcam with a frame capture code snippet (C#/WinForms)

This code snippet demonstrates how to preview video from a webcam and capture frames from the video using Video Capture SDK .Net.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\webcam-preview\readme.es.md

# Video Capture SDK .Net - Video preview from a webcam code snippet (C#/WinForms)

This code snippet demonstrates how to preview video from a webcam using Video Capture SDK .Net.
## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Capture SDK X\_CodeSnippets\webcam-preview\readme.md

# Video Capture SDK .Net - Video preview from a webcam code snippet (C#/WinForms)

This code snippet demonstrates how to preview video from a webcam using Video Capture SDK .Net.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-capture-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\Console\CSharp\Main Demo CLI\readme.es.md

# VisioForge Video Edit SDK .Net

## Main Demo (C#/Console)

The demo shows all the main features of Video Edit SDK .Net.

Features:

* video preview
* video editing and conversion
* apply video and audio effects
* network streaming
* save video and audio to various output formats
* apply OSD
* use Picture-in-Picture
* detect motion
* many other features are available

### Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\Console\CSharp\Main Demo CLI\readme.md

# VisioForge Video Edit SDK .Net

## Main Demo (C#/Console)

The demo shows all the main features of Video Edit SDK .Net.
Features:

* video preview
* video editing and conversion
* apply video and audio effects
* network streaming
* save video and audio to various output formats
* apply OSD
* use Picture-in-Picture
* detect motion
* many other features are available

### Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\Console\CSharp\Video From Images CLI\readme.es.md

# VisioForge Video Edit SDK .Net

## Video From Images CLI Demo

This SDK sample demonstrates how to create a video from a collection of images using the VisioForge Video Edit SDK .Net in a console application. The program scans a specified folder for images in common formats (JPG, PNG, BMP, GIF, TIFF), then arranges them sequentially to produce a video file. Users can customize output settings such as resolution, format (MP4, AVI, WMV), and frame rate. The application supports command-line arguments for easy automation and integration into larger workflows. Error handling and progress reporting are included to ensure smooth operation.

### Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\Console\CSharp\Video From Images CLI\readme.md

# VisioForge Video Edit SDK .Net

## Video From Images CLI Demo

This SDK sample demonstrates how to create a video from a collection of images using the VisioForge Video Edit SDK .Net in a console application. The program scans a specified folder for images in common formats (JPG, PNG, BMP, GIF, TIFF), then arranges them sequentially to produce a video file.
Users can customize output settings such as resolution, format (MP4, AVI, WMV), and frame rate. The application supports command-line arguments for easy automation and integration into larger workflows. Error handling and progress reporting are included to ensure smooth operation.

### Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\WinForms\CSharp\Audio Extractor\readme.es.md

# VisioForge Video Edit SDK .Net

## Audio Extractor Demo (C#/WinForms)

This SDK sample demonstrates how to create a simple audio extractor application using the VisioForge Video Edit SDK .Net in a Windows Forms application. It showcases the capabilities for selecting input video files, specifying output audio file locations, and extracting or re-encoding audio streams into popular formats such as MP3 or M4A. The application provides a user interface with options to start and stop the extraction process, display progress, and handle errors gracefully. It also includes functionality to open links to the required decoders, ensuring users have the tools needed to process different media types.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\WinForms\CSharp\Audio Extractor\readme.md

# VisioForge Video Edit SDK .Net

## Audio Extractor Demo (C#/WinForms)

This SDK sample demonstrates how to create a simple audio extractor application using the VisioForge Video Edit SDK .Net in a Windows Forms application.
It showcases the capabilities for selecting input video files, specifying output audio file locations, and extracting or re-encoding audio streams into popular formats like MP3 or M4A. The application provides a user interface with options to start and stop the extraction process, display progress, and handle errors gracefully. Additionally, it includes functionality to open links to necessary decoders, ensuring users have the tools needed to process various media types.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\WinForms\CSharp\File Encryptor\readme.es.md

# VisioForge Video Edit SDK .Net

## File Encryptor Demo (C#/WinForms)

This SDK sample showcases the integration of the VisioForge Video Edit SDK .Net to create a file encryption application within a Windows Forms environment. The application, named File Encryptor, demonstrates both fast encryption for specific codecs (H264 for video and AAC for audio) and full re-encoding capabilities for other formats. It features a user-friendly interface for selecting input and output files, displays encryption progress, and provides feedback on completion or errors. The sample leverages advanced SDK features such as conditional encryption logic and media information retrieval.
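The fast-path decision described above can be sketched independently of the SDK. The helper below is an illustrative assumption, not the demo's actual code: the real application reads codec information through the SDK's media information API, while here codec names are passed in as plain strings.

```csharp
using System;

// Illustrative sketch only: codec names as strings are an assumption;
// the real demo obtains them from the SDK's media information retrieval.
public static class EncryptionPlanner
{
    // Fast encryption is possible only when the source already uses
    // H264 video and AAC audio; anything else must be fully re-encoded.
    public static bool CanUseFastEncryption(string videoCodec, string audioCodec)
    {
        return string.Equals(videoCodec, "H264", StringComparison.OrdinalIgnoreCase)
            && string.Equals(audioCodec, "AAC", StringComparison.OrdinalIgnoreCase);
    }
}
```

A caller would branch on this result to pick the fast encryption path or the full re-encoding path before starting the job.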
## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\WinForms\CSharp\File Encryptor\readme.md

# VisioForge Video Edit SDK .Net

## File Encryptor Demo (C#/WinForms)

This SDK sample showcases the integration of VisioForge Video Edit SDK .Net for creating a file encryption application within a Windows Forms environment. The application, named File Encryptor, demonstrates both fast encryption for specific codecs (H264 for video and AAC for audio) and full re-encoding capabilities for other formats. It features a user-friendly interface allowing for the selection of input and output files, displays encryption progress, and provides feedback upon completion or errors. The sample leverages advanced features of the SDK such as conditional encryption logic and media information retrieval.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\WinForms\CSharp\Main Demo\readme.es.md

# VisioForge Video Edit SDK .Net

## Main Demo (C#/WinForms)

The demo shows the general features of Video Edit SDK .Net.
You can:

* preview video
* edit and convert video
* apply video and audio effects
* stream over the network
* save video and audio to various output formats
* apply OSD
* use Picture-in-Picture
* detect motion
* use many other available features

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\WinForms\CSharp\Main Demo\readme.md

# VisioForge Video Edit SDK .Net

## Main Demo (C#/WinForms)

The demo shows the general features of Video Edit SDK .Net.

You can:

* preview video
* edit and convert video
* apply video and audio effects
* stream over the network
* save video and audio to various output formats
* apply OSD
* use Picture-in-Picture
* detect motion
* use many other available features

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\WinForms\CSharp\Video from images\readme.es.md

# VisioForge Video Edit SDK .Net

## Video From Images Demo (C#/WinForms)

The provided code is a comprehensive example of a Windows Forms application using the VisioForge Video Edit SDK .NET to create videos from images. It demonstrates the initialization of the video editing environment, including loading various media files (images and audio), configuring video output settings for different formats (e.g., AVI, MP4, WMV, GIF), applying video effects (e.g., resize, flip, grayscale), and handling events such as progress updates and errors.
The application offers a GUI for adding input files, configuring output settings, and controlling the video editing process (start, stop). This sample is ideal for developers looking to integrate advanced video editing features into their .NET applications, providing a solid foundation for customization and expansion.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\WinForms\CSharp\Video from images\readme.md

# VisioForge Video Edit SDK .Net

## Video From Images Demo (C#/WinForms)

The provided code is a comprehensive example of a Windows Forms application using the VisioForge Video Edit SDK .NET to create videos from images. It demonstrates the initialization of the video editing environment, including loading various media files (images and audio), configuring video output settings for different formats (e.g., AVI, MP4, WMV, GIF), applying video effects (e.g., resize, flip, grayscale), and handling events such as progress updates and errors. The application offers a GUI for adding input files, configuring output settings, and controlling the video editing process (start, stop). This sample is ideal for developers looking to integrate advanced video editing features into their .NET applications, providing a solid foundation for customization and expansion.
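The image-loading step can be illustrated without the SDK itself. The snippet below is a simplified sketch, not the demo's actual code: it only shows one way to collect and order the input images (the extension list matches the formats the demos mention) before they would be handed to the editing engine as input files.

```csharp
using System;
using System.IO;
using System.Linq;

public static class ImageCollector
{
    // Image formats the demos accept (JPG, PNG, BMP, GIF, TIFF).
    private static readonly string[] Extensions =
        { ".jpg", ".jpeg", ".png", ".bmp", ".gif", ".tif", ".tiff" };

    // Scan a folder and return matching image files in a stable,
    // name-sorted order so they become sequential video frames.
    public static string[] Collect(string folder)
    {
        return Directory.EnumerateFiles(folder)
            .Where(f => Extensions.Contains(Path.GetExtension(f).ToLowerInvariant()))
            .OrderBy(f => Path.GetFileName(f), StringComparer.OrdinalIgnoreCase)
            .ToArray();
    }
}
```

Each returned path would then be added to the editor's input list, typically with a per-image display duration.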
## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\WinForms\CSharp\Video from images in memory\readme.es.md

# VisioForge Video Edit SDK .Net

## Video From Images In Memory Demo (C#/WinForms)

The SDK sample provides a comprehensive solution for creating and editing videos from images using the VisioForge Video Edit SDK .Net. It includes a wide range of output format settings dialogs, such as MP4, AVI, WMV, and GIF, among others, enabling customization of video output according to specific needs. The application supports adding various video effects and adjusting properties like brightness, contrast, and saturation directly from the user interface. Additionally, it incorporates functionality to start, stop, and monitor the progress of video processing tasks, showcasing the SDK's capabilities for integrating advanced video editing features into .NET applications.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\WinForms\CSharp\Video from images in memory\readme.md

# VisioForge Video Edit SDK .Net

## Video From Images In Memory Demo (C#/WinForms)

The SDK sample provides a comprehensive solution for creating and editing videos from images using the VisioForge Video Edit SDK .Net. It includes a wide range of output format settings dialogs, such as MP4, AVI, WMV, and GIF, among others, enabling customization of video output according to specific needs. The application supports adding various video effects and adjusting properties like brightness, contrast, saturation, and more, directly within the user interface.
Additionally, it incorporates functionality to start, stop, and monitor the progress of video processing tasks, showcasing the SDK's capabilities for integrating advanced video editing features into .NET applications.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\WinForms\CSharp\Video Join Demo\readme.es.md

# VisioForge Video Edit SDK .Net

## Video Join Demo (C#/WinForms)

The provided code is a comprehensive example from the VisioForge Video Edit SDK .Net, demonstrating how to create a Windows Forms application for video editing. It showcases the setup and use of various output formats for video and audio processing, such as MP4, AVI, WMV, MP3, and GIF, among others. The application allows users to add input files, configure output settings, and control the video editing process, including starting, stopping, and monitoring progress. The code includes event handling for errors, progress updates, and completion, ensuring a responsive and user-friendly interface for video editing tasks.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\WinForms\CSharp\Video Join Demo\readme.md

# VisioForge Video Edit SDK .Net

## Video Join Demo (C#/WinForms)

The provided code is a comprehensive example from the VisioForge Video Edit SDK .Net, demonstrating how to create a Windows Forms application for video editing. It showcases the setup and use of various output formats for video and audio processing, such as MP4, AVI, WMV, MP3, and GIF, among others.
The application allows users to add input files, configure output settings, and control the video editing process, including starting, stopping, and monitoring progress. The code includes event handling for errors, progress updates, and completion, ensuring a responsive and user-friendly interface for video editing tasks.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\WinForms\VB.Net\Main Demo\readme.es.md

# VisioForge Video Edit SDK .Net

## Main Demo (VB.Net/WinForms)

The demo shows the major capabilities of Video Edit SDK .Net:

* video preview
* video editing and conversion
* apply video and audio effects
* network streaming
* save video and audio to various output formats
* apply OSD
* use Picture-in-Picture
* detect motion
* many other features are available

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\WinForms\VB.Net\Main Demo\readme.md

# VisioForge Video Edit SDK .Net

## Main Demo (VB.Net/WinForms)

The demo shows the major capabilities of Video Edit SDK .Net:

* video preview
* video editing and conversion
* apply video and audio effects
* network streaming
* save video and audio to various output formats
* apply OSD
* use Picture-in-Picture
* detect motion
* many other features are available

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\WPF\CSharp\Main Demo\readme.es.md

# VisioForge Video Edit SDK .Net

## Main Demo (C#/WPF)

The demo shows the general features of Video Edit SDK .Net:

* video preview
* video editing and conversion
* apply video and audio effects
* network streaming
* save video and audio to various output formats
* apply OSD
* use Picture-in-Picture
* detect motion
* many other features are available

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\WPF\CSharp\Main Demo\readme.md

# VisioForge Video Edit SDK .Net

## Main Demo (C#/WPF)

The demo shows the general features of Video Edit SDK .Net:

* video preview
* video editing and conversion
* apply video and audio effects
* network streaming
* save video and audio to various output formats
* apply OSD
* use Picture-in-Picture
* detect motion
* many other features are available

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\WPF\CSharp\Multiple Audio Tracks Demo\readme.es.md

# VisioForge Video Edit SDK .Net

## Multiple Audio Tracks Demo (C#/WPF)

The Multiple Audio Tracks Demo, built with the VisioForge Video Edit SDK .NET, showcases a WPF application designed to manipulate video and audio files. This example demonstrates how to open video and audio files, select output destinations, and handle multiple audio tracks in a single video file. It features an intuitive GUI for adding a video and two separate audio tracks, merging them into a single output file with configurable format options.
The application also includes real-time progress feedback and error handling, illustrating the SDK's capabilities for complex video editing tasks such as adding, editing, and exporting multimedia content.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK\WPF\CSharp\Multiple Audio Tracks Demo\readme.md

# VisioForge Video Edit SDK .Net

## Multiple Audio Tracks Demo (C#/WPF)

The Multiple Audio Tracks Demo, utilizing the VisioForge Video Edit SDK .NET, showcases a WPF application designed to manipulate video and audio files. This example demonstrates how to open video and audio files, select output destinations, and handle multiple audio tracks in a single video file. It features an intuitive GUI for adding video and two separate audio tracks, merging them into a single output file with configurable format options. The application also includes real-time progress feedback and error handling, illustrating the SDK's capabilities for complex video editing tasks such as adding, editing, and exporting multimedia content.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK X\Avalonia\VideoJoin\readme.es.md

# VisioForge Video Edit SDK .Net

## Video Join Demo (cross-platform engine)

The provided code is for a video editing application built using Avalonia, a cross-platform .NET UI framework, and the VisioForge Video Edit SDK .Net.
This application allows users to join multiple video files, offering features such as selecting input files, setting the output format, adjusting frame rates, and monitoring progress through a graphical interface. It supports various output formats such as MP4, WebM, AVI, MKV, and more. The application also includes error handling and debugging options, showcasing the integration of VisioForge's video editing capabilities with Avalonia's UI components for a seamless video processing experience.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK X\Avalonia\VideoJoin\readme.md

# VisioForge Video Edit SDK .Net

## Video Join Demo (cross-platform engine)

The provided code is for a video editing application built using Avalonia, a cross-platform .NET UI framework, and the VisioForge Video Edit SDK .Net. This application allows users to join multiple video files, offering features such as selecting input files, setting output format, adjusting frame rates, and monitoring progress through a graphical interface. It supports various output formats like MP4, WebM, AVI, MKV, and more. The application also includes error handling and debugging options, showcasing the integration of VisioForge's video editing capabilities with Avalonia's UI components for a seamless video processing experience.
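The output-format selection the demo exposes through its UI can be shown in isolation. The helper below is a hypothetical sketch, not the demo's code: it simply maps an output filename to one of the container names the demo lists (MP4, WebM, AVI, MKV); the SDK's own output classes would then be configured from a choice like this.

```csharp
using System;
using System.IO;

public static class OutputFormatPicker
{
    // Map an output filename to one of the container formats the demo
    // supports; default to MP4 for anything unrecognized (an assumption).
    public static string FromFileName(string fileName)
    {
        switch (Path.GetExtension(fileName).ToLowerInvariant())
        {
            case ".webm": return "WebM";
            case ".avi":  return "AVI";
            case ".mkv":  return "MKV";
            case ".mp4":
            default:      return "MP4";
        }
    }
}
```

Driving the choice from the extension keeps the UI and the join pipeline agreeing on the container without a separate format drop-down state.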
## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK X\Console\CSharp\Video From Images X CLI\readme.es.md

# VisioForge Video Edit SDK .Net

## Video From Images X CLI Demo

This SDK sample demonstrates how to create a video from a collection of images using VisioForge Video Edit SDK .Net in a console application. The program scans a specified directory for image files in various formats (JPG, JPEG, PNG, BMP, GIF, TIF) and then sequentially adds them to a video file with customizable output settings such as video size, frame rate, and format (MP4, AVI, WMV). The process involves initializing the video editor, configuring video properties, adding images as video frames, setting the output format, and handling progress, error, and completion events. The sample includes command-line argument parsing for flexible input and output configuration.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK X\Console\CSharp\Video From Images X CLI\readme.md

# VisioForge Video Edit SDK .Net

## Video From Images X CLI Demo

This SDK sample demonstrates how to create a video from a collection of images using VisioForge Video Edit SDK .Net in a console application. The program scans a specified directory for image files of various formats (JPG, JPEG, PNG, BMP, GIF, TIF), and then sequentially adds them to a video file with customizable output settings such as video size, frame rate, and format (MP4, AVI, WMV).
The process involves initializing the video editor, configuring video properties, adding images as video frames, setting the output format, and handling events for progress, error, and completion. The sample includes command-line argument parsing for flexible input and output configuration.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK X\WinForms\CSharp\Main Demo X\readme.es.md

# VisioForge Video Edit SDK .Net

## Main Demo (C#/WinForms, cross-platform engine)

The demo shows the main features of Video Edit SDK .Net:

* video preview
* video editing and conversion
* apply video and audio effects
* network streaming
* save video and audio to various output formats
* apply OSD
* use Picture-in-Picture
* detect motion
* many other features are available

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK X\WinForms\CSharp\Main Demo X\readme.md

# VisioForge Video Edit SDK .Net

## Main Demo (C#/WinForms, cross-platform engine)

The demo shows the main features of Video Edit SDK .Net:

* video preview
* video editing and conversion
* apply video and audio effects
* network streaming
* save video and audio to various output formats
* apply OSD
* use Picture-in-Picture
* detect motion
* many other features are available

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK X\WinForms\CSharp\Video from images X\readme.es.md

# VisioForge Video Edit SDK .Net

## Video From Images Demo (C#/WinForms, cross-platform engine)

This sample demonstrates how to use the VisioForge Video Edit SDK .Net to create a video from a series of images. It showcases the initialization of the `VideoEditCoreX` engine, adding images as input files, configuring output parameters such as video size and frame rate, and handling events such as progress updates and errors. The UI includes options to select input images, adjust video settings, and start or stop the video creation process. Additionally, the sample provides a way to view video tutorials and shows how to handle the application's closure by properly disposing of the SDK resources.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK X\WinForms\CSharp\Video from images X\readme.md

# VisioForge Video Edit SDK .Net

## Video From Images Demo (C#/WinForms, cross-platform engine)

This sample demonstrates how to use the VisioForge Video Edit SDK .Net to create a video from a series of images. It showcases the initialization of the `VideoEditCoreX` engine, adding images as input files, configuring output parameters like video size and frame rate, and handling events such as progress updates and errors. The UI includes options to select input images, adjust video settings, and start or stop the video creation process. Additionally, the sample provides a way to view video tutorials and specifies how to handle the application's closure by properly disposing of the SDK resources.
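The shutdown step mentioned above — releasing the engine when the application closes — follows the standard `IDisposable` pattern. The fragment below is a generic sketch with a stand-in engine type; the class and helper names are illustrative assumptions, not the demo's actual code, which disposes its engine instance from the form's closing handler.

```csharp
using System;

// Stand-in for an SDK engine that owns native resources (illustrative only;
// the real demo releases its engine instance in the same way on close).
public sealed class StubEngine : IDisposable
{
    public bool Disposed { get; private set; }

    public void Dispose() => Disposed = true;
}

public static class ShutdownHelper
{
    // Release the engine exactly once when the window closes; safe to call
    // from a closing handler even if the engine was never created.
    public static void ReleaseEngine(ref StubEngine engine)
    {
        engine?.Dispose();
        engine = null;
    }
}
```

Nulling the field after disposal prevents a second event invocation from touching an already-released engine.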
## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK X\WinForms\CSharp\Video Join Demo X\readme.es.md

# VisioForge Video Edit SDK .Net

## Video Join Demo (C#/WinForms, cross-platform engine)

The sample provides a complete solution for editing and joining video, audio, and image files within a Windows Forms application. Leveraging the VisioForge Video Edit SDK, it offers a rich feature set, including support for multiple file formats, frame rate adjustment, and customizable output settings. Developers can easily integrate file input selection, real-time progress feedback, and error handling through event-driven programming. The sample demonstrates initializing the editing engine, adding media files, configuring output formats, and controlling the editing process through a user-friendly interface.

## Supported .Net versions

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\codebase\_DEMOS\Video Edit SDK X\WinForms\CSharp\Video Join Demo X\readme.md

# VisioForge Video Edit SDK .Net

## Video Join Demo (C#/WinForms, cross-platform engine)

The sample provides a comprehensive solution for editing and joining video, audio, and image files within a Windows Forms application. Leveraging the VisioForge Video Edit SDK, it offers a rich set of features, including support for multiple file formats, frame rate adjustment, and customizable output settings. Developers can easily integrate file input selection, real-time progress feedback, and error handling through event-driven programming.
The sample demonstrates initializing the editing engine, adding media files, configuring output formats, and controlling the editing process through a user-friendly interface.

## Supported frameworks

* .Net 4.7.2
* .Net Core 3.1
* .Net 5
* .Net 6
* .Net 7
* .Net 8
* .Net 9

---

[Visit the product page.](https://www.visioforge.com/video-edit-sdk-net)

---END OF PAGE---

# Local File: .\delphi\index.md

---
title: Delphi Media Framework for Video Processing
description: Powerful Delphi/ActiveX libraries for video playback, capture, and editing. Build professional media applications with our All-in-One Media Framework supporting Delphi 6 through 11 and beyond, with full x64 compatibility and ActiveX integration.
sidebar_label: All-in-One Media Framework (Delphi/ActiveX)
order: 18
icon: ../static/delphi.svg
route: /docs/delphi/
---

# All-in-One Media Framework

All-in-One Media Framework is a set of Delphi/ActiveX libraries for video processing, playback, and capture. These libraries help developers create professional video editing, playback, and capture applications with minimal effort and maximum performance.

The framework provides a comprehensive solution for media handling in Delphi applications, offering high-performance video processing capabilities that would otherwise require extensive low-level programming. Developers can implement complex video workflows with a simple component-based architecture.
You can find the following library documentation here:

## Libraries

- [TVFMediaPlayer](mediaplayer/index.md) - Full-featured media player component with playlist support, frame-accurate seeking, and advanced playback controls
- [TVFVideoCapture](videocapture/index.md) - Powerful video capture component supporting webcams, capture cards, IP cameras, and screen recording
- [TVFVideoEdit](videoedit/index.md) - Professional video editing component with timeline support, transitions, filters, and output to multiple formats

## Implementation Examples

The framework includes numerous examples demonstrating how to implement common media tasks:

- Video players with custom controls and visualizations
- Multi-camera recording applications
- Video editing software with timeline support
- Format conversion utilities
- Streaming media applications

## General Information

ActiveX packages can be used in multiple programming languages and development environments, including Visual C++, Visual Basic, and C++ Builder. These components extend your software capabilities, accelerating development and improving performance. With ActiveX integration, you can incorporate existing software components into your projects, boosting efficiency and functionality.

Our framework is compatible with all Delphi versions from Delphi 6 to Delphi 11 and beyond, making it suitable for both legacy projects and new development. The components maintain a consistent API across different Delphi versions, simplifying migration between different IDE versions.
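The component-based workflow can be sketched in a few lines of Delphi. The property and method names below (`FilenameOrURL`, `Play`) follow the patterns used in VisioForge's own samples, but treat them as illustrative assumptions rather than authoritative API documentation:

```pascal
procedure TMainForm.btnPlayClick(Sender: TObject);
begin
  // MediaPlayer1: a TVFMediaPlayer component dropped on the form at design time.
  // Property and method names are assumed from vendor samples.
  MediaPlayer1.FilenameOrURL := 'C:\Videos\sample.mp4'; // clip to play
  MediaPlayer1.Play; // start playback in the embedded video window
end;
```

Because the components install into the IDE, the same configuration can also be done visually in the Object Inspector instead of in code.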
## Technical Specifications

- **Supported Media Formats**: MP4, AVI, MOV, MKV, MPEG, WMV, and many others
- **Audio Support**: AAC, MP3, PCM, WMA, and other popular audio codecs
- **Video Codecs**: H.264, H.265/HEVC, MPEG-4, VP9, AV1, and more
- **Capture Sources**: Webcams, HDMI capture cards, IP cameras, screen capture
- **Hardware Acceleration**: NVIDIA NVENC, Intel Quick Sync, AMD AMF

## x64 Support Limitations

With Delphi XE2 and later, you can develop 64-bit applications. Our framework fully supports these 64-bit applications, allowing you to leverage modern computing power and handle larger memory requirements. 64-bit support enables processing of higher resolution videos and more complex editing operations that would be impossible in 32-bit environments.

Microsoft Visual Basic 6 does not support 64-bit applications. If you're using Visual Basic 6, you'll need to use the 32-bit version of our framework due to VB6's inherent limitations. While 32-bit applications can access up to 4GB of memory with proper configuration, for demanding video applications, we recommend using Delphi or other development environments with 64-bit support.

## Development Best Practices

When integrating the framework into your applications, consider these best practices:

- Initialize components at design time when possible for better IDE integration
- Use hardware acceleration for demanding operations like encoding and decoding
- Implement proper error handling for media operations
- Consider memory management for large media files
- Test with various media sources to ensure compatibility

---

For more information about the framework, visit the [All-in-One Media Framework (Delphi/ActiveX)](https://www.visioforge.com/all-in-one-media-framework) product page.

---END OF PAGE---

# Local File: .\delphi\general\index.md

---
title: Delphi Libraries for Professional Multimedia Dev
description: Discover powerful Delphi/ActiveX libraries for building advanced multimedia applications. Our components enable developers to create high-performance video capture, media playback, and editing solutions with minimal coding effort. Browse documentation, examples, and troubleshooting guides.
sidebar_label: General information
---

# Delphi/ActiveX Libraries for Multimedia Development

Welcome to our developer documentation hub for Delphi/ActiveX multimedia libraries. This resource provides in-depth technical information, code examples, and implementation guides for developers working with our specialized components.

## Core Library Benefits

Our libraries empower Delphi developers to create sophisticated multimedia applications with minimal coding effort. The components are engineered for maximum performance and reliability in professional development environments.

Key advantages include:

- Simplified implementation of complex multimedia features
- Optimized performance for resource-intensive operations
- Cross-version compatibility with multiple Delphi releases
- Extensive customization options for specialized requirements

## Documentation Organization

### Technical Reference Materials

Each library section contains detailed API references, implementation examples, and recommended practices. Navigate to the specific library documentation for complete information about:

- Component properties and attributes
- Method signatures and parameters
- Event handlers and callback functions
- Type definitions and constants

### Code Examples and Tutorials

Our documentation includes practical code snippets and complete implementation examples to accelerate your development process. These examples demonstrate effective techniques for common multimedia programming scenarios.
## Installation Troubleshooting Guide

When integrating our libraries into your development environment, you might encounter these known technical issues:

### 64-bit Architecture Compatibility

Delphi's 64-bit compilation environment requires special configuration in some cases:

- [Resolving Delphi 64-bit package installation problems](install-64bit.md)
- Handling memory alignment requirements in 64-bit environments
- Addressing pointer size differences between architectures

### Resource File Management

Proper resource handling is essential for stable operation:

- [Fixing Delphi package installation issues with .otares files](install-otares.md)
- Resolving resource locking during development
- Managing resource file paths in deployed applications

## Getting Started

To begin implementing our libraries in your projects, follow the library-specific installation guides and review the sample applications. Our documentation provides step-by-step instructions to help you achieve optimal results.

---END OF PAGE---

# Local File: .\delphi\general\install-64bit.md

---
title: Delphi 64-bit Package Installation Guide
description: Master Delphi 64-bit package installation and overcome common challenges with our detailed walkthrough. Learn how to properly configure library paths, manage runtime packages, and ensure seamless compatibility in your Delphi development projects.
---

# Mastering Delphi 64-bit Package Installation

## Introduction to 64-bit Development in Delphi

The evolution to 64-bit computing represents a significant advancement for Delphi developers, opening doors to enhanced performance, expanded memory addressing capabilities, and improved resource utilization. Since the introduction of 64-bit support in Delphi XE2, developers have gained the powerful ability to compile native 64-bit Windows applications.
This capability enables software to harness modern hardware architectures, access substantially larger memory spaces, and deliver optimized performance for data-intensive operations. However, this technological progression introduces a distinctive set of complexities, particularly regarding the installation and management of component packages (`.bpl` files). Many Delphi developers encounter perplexing obstacles when attempting to integrate 64-bit packages into their development workflow, leading to frustration and lost productivity. This in-depth guide explores these challenges thoroughly and provides meticulously detailed, actionable solutions.

The fundamental issue originates from a critical architectural characteristic: **the Delphi Integrated Development Environment (IDE) remains a 32-bit application**, even in the most recent releases. This architectural discrepancy between the 32-bit IDE and the 64-bit compilation target creates numerous misunderstandings and technical difficulties related to package management. Understanding this architectural limitation constitutes the essential first step toward establishing a seamless development experience.

We will thoroughly examine why the 32-bit IDE requires 32-bit design-time packages, explore proper project configuration techniques for both 32-bit and 64-bit targets, clarify the critical function of runtime packages, and outline extensive testing methodologies to ensure your applications perform flawlessly across both architectural environments.

## The Architectural Limitation: Why the 32-bit IDE Requires 32-bit Design-Time Packages

### Understanding the IDE's Architecture

The Delphi IDE serves as the principal environment for visual component design, code editing, debugging operations, and comprehensive project management.
When designers place components onto forms using the Form Designer, modify properties through the Object Inspector, or utilize specialized component editors, the IDE must load and execute code contained within the component's design-time package.

Because `bds.exe` (the Delphi IDE executable) operates as a 32-bit process, it functions exclusively within the 32-bit memory address space and must adhere to the constraints of 32-bit execution environments. The IDE physically cannot load or execute 64-bit code directly—this represents a hardware and operating system limitation, not merely a software restriction. Any attempt to load a 64-bit DLL (or in Delphi terminology, a 64-bit `.bpl` package) into a 32-bit process will result in immediate failure, typically manifesting as error messages like "Can't load package %s" or obscure operating system error codes.

### Critical Design-Time Requirements

For the IDE to function properly during design activities—enabling visual component manipulation, property configuration, and utilization of design-time features—it *must* load the **32-bit (x86)** version of component packages. This requirement is non-negotiable due to the fundamental architecture of the IDE and operating system memory management principles.

This architectural limitation frequently leads to confusion among developers, creating misconceptions that only 32-bit packages are necessary, or generating questions about why separate 64-bit packages exist if the IDE cannot utilize them. The critical distinction lies in understanding the separation between **design time** operations (occurring within the 32-bit IDE) and **compile/run time** processes (where applications can target either 32-bit or 64-bit architectures).
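Concretely, what the IDE executes when it loads a design-time package is the package's component registration code, a standard Delphi pattern. A minimal sketch (the component and palette-page names are placeholders):

```pascal
unit MyComponentReg;

interface

procedure Register; // called by the IDE when the design-time package is loaded

implementation

uses
  System.Classes, MyComponent; // MyComponent is a placeholder unit name

procedure Register;
begin
  // Adds TMyComponent to the Tool Palette under the 'Samples' page.
  // Because this code runs inside the 32-bit IDE process, the package
  // that contains it must be compiled for Win32.
  RegisterComponents('Samples', [TMyComponent]);
end;

end.
```

The same component's *runtime* code, by contrast, is compiled separately for each platform your application targets.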
## Step-by-Step Implementation: Installing 32-bit Design-Time Packages

### Essential First Step: Installing 32-bit Components

Based on the architectural explanation above, the mandatory initial step always involves installing the 32-bit version of component packages into the Delphi IDE. This process establishes the foundation for all subsequent development activities.

1. **Acquire Necessary Package Files:** Ensure you possess both 32-bit and 64-bit compiled package files (`.bpl` and `.dcp`). The 32-bit files typically carry identifier suffixes such as `_x86`, `_Win32`, or may lack platform specifiers in older Delphi versions. Conversely, 64-bit packages normally include `_x64` or `_Win64` designations. These files typically generate automatically when building component library projects targeting both Win32 and Win64 platforms. When using third-party components, reputable vendors should supply both architectural versions.
2. **Launch Development Environment:** Start the Delphi IDE with appropriate user permissions.
3. **Access Package Installation Interface:** Navigate through the menu system to `Component > Install Packages...`.
4. **Initiate Package Addition:** Click the "Add..." button to begin the installation process.
5. **Locate 32-bit Package Files:** Browse to the directory containing your **32-bit** compiled package files (`.bpl`). Carefully select the 32-bit `.bpl` file and click "Open" to proceed.
6. **Complete Installation Process:** The package should appear in the "Design packages" list, typically enabled by default. Confirm the installation by clicking "OK".

### Verification and Troubleshooting

The IDE will attempt to load the 32-bit package. When successful, your components should appear in the Tool Palette, enabling immediate use in the Form Designer.
If the IDE fails to load the package, verify that you selected the correct 32-bit `.bpl` file and ensure that all dependency packages required by your target package are properly installed and accessible.

**Critical Warning:** Never attempt to install 64-bit `.bpl` files using the `Component > Install Packages...` menu option. Such attempts will invariably fail because the 32-bit IDE architecture cannot load 64-bit code modules.

## Advanced Configuration: Setting Project Library Paths for Dual Platform Development

### Configuring Compiler Search Paths

While the IDE utilizes 32-bit packages during design-time operations, the Delphi compiler requires precise information about where to locate appropriate files (`.dcu`, `.dcp`, `.obj`) for your specific target platform during compilation (either 32-bit or 64-bit). These settings are configured through project options, specifically within the library path configuration section. Importantly, these settings must be established separately for each target platform.

1. **Access Project Configuration:** Navigate to `Project > Options...` in the IDE menu.
2. **Select Appropriate Platform:** It is absolutely crucial to configure paths separately for each target platform. Utilize the "Target Platform" dropdown menu located at the top of the Project Options dialog. Begin configuration with the "32-bit Windows" selection.
3. **Navigate to Library Configuration Section:** In the options tree displayed on the left side, select `Delphi Compiler > Library` to access path settings.
4. **Configure 32-bit Library Paths:** Within the "Library path" field, click the ellipsis (...) button to open the path editor. Add the directory containing your compiled **32-bit** units (`.dcu` files) and the **32-bit** package's `.dcp` file for the components you've installed. Ensure this path specifically references the 32-bit output directory of your component library.
5. **Switch to 64-bit Configuration:** Change the "Target Platform" dropdown selection to "64-bit Windows". Notice that the "Library path" field might display different content or appear empty.
6. **Configure 64-bit Library Paths:** Repeat the previous path configuration process, but this time add directories containing your compiled **64-bit** units (`.dcu` files) and the **64-bit** package's `.dcp` file. This path *must* differ from the 32-bit path and correctly reference the 64-bit output directory.
7. **Review Additional Path Settings:** While the Library path configuration is essential for locating `.dcu` and `.dcp` files, also examine the `Browsing path` settings (used by code insight features) and verify the `DCP output directory` location is properly configured if you are building packages yourself. Configure these paths for both 32-bit and 64-bit platforms as well.
8. **Save Configuration Changes:** Click "OK" to preserve the project options settings.

### Avoiding Common Configuration Errors

**Frequent Mistake:** Many developers forget to switch the "Target Platform" dropdown *before* setting the path for that platform. Configuring the 64-bit path while "32-bit Windows" remains selected (or vice-versa) represents a common source of compilation errors later in the development process.

By correctly establishing these platform-specific library paths, you provide the compiler with precise information about where to locate necessary `.dcu` and `.dcp` files for the architecture currently under construction.

## Runtime Package Management Strategies

### Deciding on Linking Approaches

Beyond instructing the compiler where to find units during compilation, you must determine how your final executable will link against component libraries. This critical decision is controlled through the "Runtime Packages" settings section. You have two principal options:

1. **Static Linking Approach:** If you leave the "Link with runtime packages" option unchecked (or remove all packages from the list), the compiler will directly incorporate necessary code and resources from your components into the final `.exe` file. This approach produces larger executable files but eliminates the requirement to distribute separate `.bpl` files alongside your application.
2. **Dynamic Linking (Runtime Packages) Approach:** If you enable "Link with runtime packages" and specify required packages, the compiler will *not* embed component code into your `.exe`. Instead, your application will dynamically load necessary `.bpl` files during execution. This strategy creates smaller executable files but requires deploying corresponding 32-bit or 64-bit `.bpl` files with your application distribution.

### Detailed Configuration Process

1. **Access Project Options:** Navigate to `Project > Options...` in the IDE menu.
2. **Select Target Platform:** Choose either "32-bit Windows" or "64-bit Windows" from the platform dropdown.
3. **Navigate to Package Settings:** Select `Packages > Runtime Packages` in the options navigation tree.
4. **Configure Linking Method:** Enable or disable the "Link with runtime packages" option based on your preferred linking approach determined earlier.
5. **Specify Required Packages:** When utilizing runtime packages, ensure the list contains the correct base names of packages your application requires (e.g., `MyComponentPackage`). Do *not* include platform suffixes or file extensions in these entries. Delphi automatically appends appropriate platform identifiers and loads the correct `_x86.bpl` or `_x64.bpl` files (or equivalent naming based on Delphi version/settings) during runtime.
6. **Configure Secondary Platform:** Switch the "Target Platform" selection and configure runtime package settings identically for the alternative platform.
Typically, the decision to use or not use runtime packages remains consistent across both platforms, but package lists might differ if utilizing platform-specific libraries.

7. **Preserve Configuration:** Click "OK" to save the settings.

### Deployment Considerations

**Critical Deployment Requirement:** If you choose dynamic linking with runtime packages, remember that you *must* distribute the correct architectural version (32-bit or 64-bit) of those `.bpl` files with your application. The 32-bit executable requires 32-bit `.bpl` files, while the 64-bit executable needs 64-bit `.bpl` files. Place these files either in the same directory as the `.exe` or in locations accessible through the system's PATH environment variable.

## Comprehensive Testing and Verification Methodologies

### Multi-platform Verification

Configuration alone cannot guarantee success. Thorough testing becomes essential to confirm that everything functions as expected across both target platforms.

1. **Multi-platform Compilation:** Build your project explicitly for both "32-bit Windows" and "64-bit Windows" target platforms. Address any compiler errors that emerge during this process. Errors occurring during compilation frequently indicate incorrectly configured library paths (detailed in Step 2).
2. **32-bit Execution Testing:** Execute the compiled 32-bit application. Thoroughly test all functionality that depends on the components in question. Specifically look for:
   * Proper visual appearance and interactive behavior of components.
   * Absence of exceptions during component instantiation or method invocation.
   * If using runtime packages, verify the application launches without "Package XYZ not found" error messages.
3. **64-bit Execution Testing:** Execute the compiled 64-bit application. Perform identical tests as conducted with the 32-bit version. Pay particular attention to:
   * Any behavioral differences compared to the 32-bit version.
   * Runtime errors such as Access Violations, which might indicate underlying 64-bit compatibility issues in the component code or application logic (e.g., incorrect pointer arithmetic, integer size assumptions).
   * For runtime packages, check again for missing package errors, ensuring 64-bit `.bpl` files are properly accessible.
4. **Edge Case Evaluation:** Include testing scenarios that explore boundary conditions, particularly regarding memory usage if that represents a motivation for transitioning to 64-bit. Load extensive datasets and perform complex operations involving the components to stress-test the implementation.

### Interpreting Test Results

Any discrepancies or errors encountered during runtime on one platform but not the other strongly suggest either a problem in package configuration (Steps 2 or 3) or potential 64-bit compatibility issues within the component or application code itself. Such issues require careful diagnosis and targeted resolution.

## Advanced Troubleshooting Guide

### Resolving Common Installation Issues

* **"Package XYZ.bpl can't be installed because it is not a design time package."**: This error typically indicates an attempt to install a package via `Component > Install Packages` that lacks necessary design-time registrations or configuration flags. Verify that the package project is correctly configured as a design-time package or combined design-time & runtime package.
* **"Can't load package XYZ.bpl. %1 is not a valid Windows application." / "The specified module could not be found."**: This almost certainly indicates an attempt to install a **64-bit** BPL into the 32-bit IDE via `Component > Install Packages`. Remember to install only 32-bit BPL files through this interface. The "module not found" variant may also occur if the package has dependencies that aren't properly installed or cannot be located.
* **[Compiler Error] F1026 File not found: 'ComponentUnit.dcu'**: This error occurs during compilation (not at design time).
  It indicates the compiler cannot locate the required `.dcu` file for the currently selected target platform. Carefully review your `Project Options > Delphi Compiler > Library > Library path` settings for the *specific platform* you are currently compiling (Step 2). Ensure the path correctly references the appropriate directory (32-bit or 64-bit) containing the necessary `.dcu` files.
* **[Linker Error] E2202 Required package 'XYZ' not found**: Similar to F1026, but occurring during the linking phase. This frequently indicates the `.dcp` file for the package cannot be found. Verify the Library Path (Step 2) includes the directory containing the correct platform's `.dcp` file. Additionally, ensure the package name appears correctly in `Project Options > Packages > Runtime Packages` if utilizing dynamic linking (Step 3).
* **Runtime Error: "Package XYZ not found"**: This indicates your application was compiled to use runtime packages, but the required `.bpl` file (matching the application's architecture) cannot be located during application startup. Ensure the correct 32-bit or 64-bit `.bpl` files are deployed alongside your `.exe` file (as described in Step 3).
* **Runtime Access Violations (AVs) only in 64-bit:** This typically indicates 64-bit compatibility issues in the code (either in your application or the component implementation). Common sources include:
  * Pointer arithmetic assuming `SizeOf(Pointer)=4` (valid only in 32-bit code).
  * Incorrect use of `Integer` instead of `NativeInt`/`NativeUInt` for handles or pointer-sized values.
  * Direct calls to Windows API functions using incorrect data types for 64-bit environments.
  * Data structure alignment issues.

  Debugging the 64-bit application becomes necessary to identify the specific cause of these violations.

## Working with Third-Party Component Packages

### Best Practices for External Components

The principles outlined throughout this guide apply equally to third-party components.
Reputable component vendors typically provide:

1. Detailed instructions for proper installation procedures.
2. Separate 32-bit and 64-bit compiled `.bpl`, `.dcp`, and `.dcu` files.
3. An installation utility that handles file placement in appropriate locations and potentially automates the installation of 32-bit design-time packages into the IDE.

If an installer is provided, utilize it as your first approach. However, always validate project options (Library Paths, Runtime Packages) afterward, as installers may not perfectly configure paths for every possible project configuration or Delphi version. If you receive only raw library files without an installer, follow Steps 1-3 manually, carefully identifying and configuring paths for both 32-bit and 64-bit versions supplied by the vendor. When encountering issues, consult the vendor's documentation or contact their technical support team for assistance.

## Summary and Recommendations

### Key Implementation Strategies

Successfully managing Delphi packages for both 32-bit and 64-bit development fundamentally depends on understanding the 32-bit nature of the IDE and meticulously configuring project options for each target platform independently. Always install the 32-bit package for design-time use, then carefully establish platform-specific Library Paths and Runtime Package settings to ensure the compiler and your final application can locate and utilize the correct files for the target architecture.

While this approach introduces additional complexity compared to purely 32-bit development, the structured methodology enables you to leverage the substantial benefits of 64-bit compilation while maintaining a fully functional design-time experience within the familiar Delphi IDE environment. Consistent testing across both platforms represents the final, crucial verification step to guarantee robust, reliable applications that perform optimally in both 32-bit and 64-bit environments.

---

Need additional information?
Please [contact support](https://support.visioforge.com/) for assistance with specific scenarios or component issues.

---END OF PAGE---

# Local File: .\delphi\general\install-otares.md

---
title: Fixing .otares File Errors in Delphi Packages
description: Step-by-step solutions for resolving missing .otares file errors when installing Delphi packages. Learn how to troubleshoot resource file issues, fix package compilation errors, and implement practical solutions for Delphi developers facing resource file problems.
sidebar_label: Fixing .otares errors in Delphi
---

# Fixing .otares File Errors in Delphi Packages

## How to Solve the .otares File Not Found Error in Delphi

When working with Delphi packages, developers frequently encounter the frustrating .otares file not found error that can completely halt your development workflow. This practical guide explains the problem, identifies common causes, and provides tested solutions to get your projects back on track.

### What is an .otares File?

To effectively troubleshoot this issue, you need to understand the role of .otares files in Delphi:

- Resource files specific to Delphi development environments
- Contain compiled resources including images, icons, and binary assets
- Generated during package compilation processes
- Critical for packages with visual components or resource-dependent features

### Typical Error Messages

You'll likely encounter these errors during compilation or installation:

```text
[dcc32 Error] E1026 File not found: 'Package_Name.otares'
[dcc32 Error] E1026 Could not locate resource file 'Component_Package.otares'
[dcc32 Error] Package compilation failed due to missing .otares file
```

### When This Issue Typically Occurs

These errors commonly appear when:

1. Installing third-party component packages
2. Upgrading to newer Delphi versions
3. Moving projects between development machines
4. Collaborating with team members on shared projects

### Why .otares File Errors Happen

Several factors can trigger these errors:

1. **Missing Resource Files**: The .otares file isn't in the expected location
2. **Incorrect Path References**: Package configuration references the wrong location
3. **Version Compatibility Issues**: Resource file compiled for a different Delphi version
4. **Corrupted Resources**: The file exists but is damaged
5. **Permission Problems**: Environment lacks access rights to the resource location

### Step-by-Step Solution Guide

Follow these practical steps to resolve .otares-related issues:

1. **Find and Examine the .dpk File**
   - Navigate to your package's source directory
   - Open the .dpk file in the Delphi IDE or a text editor
   - Review all resource references
   - Focus on `$R` directives

2. **Identify Problematic Resource Directives**
   - Search for lines starting with `$R` or `{$R}`
   - These lines specify resource file inclusions
   - Example of problematic directives:

   ```pascal
   {$R 'Component_Package.otares'}
   {$R '.\resources\ComponentResources.otares'}
   ```

3. **Apply the Fix**

   **Comment out the problematic resource reference:**

   ```pascal
   // Original line
   {$R 'Component_Package.otares'}

   // Modified version
   // {$R 'Component_Package.otares'}
   ```

4. **Rebuild the Package**
   - Save all changes to the .dpk file
   - Restart the Delphi IDE to ensure changes are recognized
   - Clean the project (Project → Clean)
   - Rebuild the package (Project → Build)
   - If successful, install the package

### Advanced Solutions for Persistent Issues

When basic fixes don't work, try these advanced approaches:

1. **Recreate Resource Files**
   - Locate the original source files
   - Use the Resource Compiler to rebuild the .otares file
   - Update package references to the new file

2. **Check Package Dependencies**
   - Look for circular dependencies
   - Verify installation order is correct
   - Ensure version compatibility

3. **Verify Environment Configuration**
   - Check the BDSCOMMONDIR setting
   - Verify PATH variables for resource locations
   - Confirm library paths in IDE options

---

For personalized assistance with this issue, [contact our support team](https://support.visioforge.com/) and our technical experts will guide you through resolving your specific package installation problems.

---END OF PAGE---

# Local File: .\delphi\mediaplayer\changelog.md

---
title: Media Player Library Updates and Features
description: Comprehensive documentation of media player enhancements, including 4K support, encryption, video effects, streaming capabilities, and performance optimizations. Track the evolution of features from version 3.0 to the latest 10.0 release.
sidebar_label: Changelog
---

# TVFMediaPlayer Library Changelog

This document details the evolution of the TVFMediaPlayer library, chronicling the significant features, enhancements, optimizations, and bug fixes introduced across various versions. It serves as a comprehensive reference for developers tracking the library's progress and understanding the capabilities added over time.

## Version 10.0: Enhanced Media Handling and Customization

Version 10.0 represents a significant step forward, focusing on improved media introspection, logging, customization, and compatibility.

### Core Enhancements

* **Enhanced Media Information Reader:** This version significantly boosts the capabilities of the media information reader. It enables faster, more accurate extraction of metadata from an extensive array of media file types. Developers gain reliable access to critical details like duration, resolution, codec specifics, bitrates, and embedded tags, which streamlines media management and enhances the display capabilities within applications.
* **Improved Logging Capabilities:** Logging has been substantially refined, offering developers more granular control.
Configuration options now include distinct log levels (Debug, Info, Warning, Error) and flexible output destinations such as files, the console, or custom endpoints. This facilitates more effective issue diagnosis during development and robust monitoring of application behavior in production, ultimately leading to quicker troubleshooting and increased application stability. * **Standard Metadata Tag Support:** A cornerstone of this release is the introduction of comprehensive support for reading standard metadata tags embedded within popular video and audio containers. This includes formats like MP4, WMV, MP3, AAC, M4A, and Ogg Vorbis. Applications utilizing TVFMediaPlayer can now seamlessly extract and leverage common tags such as title, artist, album, genre, year, and cover art, thereby enriching the user experience by providing valuable context for the media being played. ### Capture and Effects Improvements * **Configurable Auto-Split Filenames:** The new `SeparateCapture_Filename_Mask` property provides fine-grained control over filenames when using the auto-split capture feature based on duration or size. This allows for customized naming conventions, improving organization and workflow for segmented recordings. * **JSON Settings Serialization:** Configuration settings for the media player can now be easily serialized to and deserialized from the widely-used JSON format. This simplifies saving and loading player configurations, enabling persistent settings and easier integration with configuration management systems. * **Custom Video Effects Pipeline:** Flexibility in video processing is enhanced with the ability to insert custom video effects using third-party filters identified by their CLSID. These filters can be strategically placed either before or after the main effects filter or sample grabber, allowing for sophisticated, tailored video manipulation pipelines. 
* **Optimized Video Effects:** Video effects processing has been optimized to take full advantage of the latest generations of Intel CPUs, resulting in smoother playback and lower resource consumption when applying effects. ### Source and Compatibility Fixes * **MP3 Splitter for Playback Issues:** An MP3 splitter has been integrated to specifically address and resolve playback inconsistencies encountered with certain non-standard or problematic MP3 files, ensuring broader compatibility. * **Updated VLC Source Filter:** The underlying VLC source filter has been updated to libVLC version 2.2.2.0. This update brings notable improvements, particularly in handling RTMP and HTTPS streams, and resolves previously identified memory leaks, contributing to enhanced stability and broader streaming protocol support. * **Pan and Blur Effect Fixes:** Specific issues related to the Pan effect in x64 builds and the Blur effect have been addressed and resolved, ensuring consistent visual effect behavior across different architectures. * **FFMPEG Source Memory Leak Resolved:** A memory leak associated with the FFMPEG source component has been identified and fixed, improving long-term stability and resource management during playback. ## Version 9.2: Engine Updates and Reader Enhancements This interim release focused on updating core components and further refining the media information capabilities. * **Updated VLC Engine:** The integrated VLC engine was updated to libVLC version 2.2.1.0, incorporating upstream fixes and improvements from the VLC project for better stability and format compatibility. * **Enhanced Media Information Reader:** Building upon previous improvements, the media information reader received further enhancements for broader file support and more accurate metadata extraction. * **Updated FFMPEG Engine:** The FFMPEG engine components were updated, ensuring compatibility with newer codecs and formats while incorporating performance optimizations. 
## Version 9.1: Advanced Security Integration Version 9.1 introduced robust security features through integration with the Video Encryption SDK. * **Video Encryption SDK v9 Support:** This version added compatibility with the Video Encryption SDK v9. This enables developers to implement strong AES-256 encryption for their video content, using either separate key files or embedded binary data as keys, significantly enhancing content protection capabilities. ## Version 9.0: Audio Enhancements and Logo Flexibility Version 9.0 brought significant improvements to audio handling and visual branding options. * **Animated GIF Logo Support:** The capability to use image logos was expanded to include support for animated GIFs, allowing for more dynamic and engaging visual branding within the video playback interface. * **Audio Enhancements:** A suite of audio enhancement features was introduced, including audio normalization to ensure consistent volume levels, automatic gain control (AGC) to dynamically adjust volume, and manual gain controls for precise audio level adjustments. * **Percentage-Based Audio Volume:** The API for controlling audio volume was modernized to use a percentage-based system (0-100%), providing a more intuitive and standardized way to manage audio levels compared to previous methods. ## Version 8.6: Decoder Expansion and API Additions This release focused on expanding codec support, adding flexibility through custom filters, and refining the API. * **H264 CPU/Intel QuickSync Decoder:** A highly optimized H264 video decoder was added, leveraging both CPU resources and Intel QuickSync hardware acceleration where available. This significantly improves performance for decoding one of the most common video codecs. * **Custom DirectShow Video Filter Support:** Developers gained the ability to integrate their own custom DirectShow video filters into the playback graph, allowing for highly specialized video processing tasks. 
* **`OnNewFilePlaybackStarted` Event:** A new event, `OnNewFilePlaybackStarted`, was introduced. This event fires specifically when a new file begins playing within a playlist context, enabling applications to react precisely to transitions between media items. * **Updated Decoders:** The Ogg Vorbis audio decoder and WebM video decoders were updated to their latest versions, ensuring compatibility and performance improvements. * **Frame Grabber API Update:** The API for grabbing individual video frames was updated, potentially offering improved performance or flexibility. * **Bug Fixes:** Various unspecified bug fixes were implemented to improve overall stability and reliability. ## Version 8.5: Rotation, 4K Readiness, and Rendering Options Version 8.5 introduced innovative video manipulation features and prepared the engine for ultra-high-definition content. * **On-the-Fly Video Rotation:** A new video effect was added, enabling real-time rotation of the video stream during playback (e.g., 90, 180, 270 degrees). * **Updated FFMPEG Source:** The FFMPEG source component was updated, likely incorporating support for newer formats or improving performance. * **4K-Ready Video Effects:** Existing video effects were optimized and tested to ensure they perform efficiently with 4K resolution video content. * **VMR-9/EVR Zoom Shift Bug Fix:** A specific bug related to unexpected image shifting when using zoom with the VMR-9 or EVR video renderers was corrected. * **Direct2D Video Renderer (Beta):** A new video renderer based on Direct2D was introduced as a beta feature. This renderer included support for live video rotation and aimed to leverage modern graphics APIs for potentially improved performance and quality. * **Bug Fixes:** Included various general bug fixes to enhance stability. ## Version 8.4: Decoder Updates and Stability This was primarily a maintenance release focused on updating core components. 
* **Updated FFMPEG Decoder:** The FFMPEG decoder components were updated, likely incorporating fixes and improvements from the FFMPEG project. * **Bug Fixes:** Addressed various unspecified bugs for improved stability. ## Version 8.3: Stability Release This release focused solely on addressing bugs identified in previous versions. * **Bug Fixes:** Implemented various fixes to enhance the overall reliability and stability of the library. ## Version 8.0: Introducing the VLC Engine Version 8.0 marked a significant architectural addition by integrating the powerful VLC engine. * **VLC Engine Integration:** The renowned VLC engine was integrated as an alternative playback backend for video and audio files. This brought VLC's extensive format support and robust streaming capabilities to TVFMediaPlayer applications. * **Bug Fixes:** Included various general bug fixes. ## Version 7.x Series: Effects, Encryption, and Playlists The Version 7 series introduced key features related to playback control, security, and visual effects. ### Version 7.20 * **Reverse Playback:** Added the capability to play video files in reverse, opening up creative possibilities and specialized application use cases. * **Bug Fixes:** Addressed various bugs. ### Version 7.12 * **Video Encryption Support:** Initial support for video encryption was added, providing basic content protection mechanisms. * **Bug Fixes:** Included general stability improvements. ### Version 7.7 * **Fade-In/Fade-Out Effect:** A common and useful video transition effect, fade-in/fade-out, was added to the available video effects. * **Playlist Support:** Functionality for creating and managing playlists was introduced, allowing sequences of media files to be played automatically. * **Bug Fixes:** Addressed various issues. ### Version 7.5 * **Improved Chroma Key:** The chroma key (green screen) effect was enhanced for better quality and more precise control. 
* **Enhanced Text Logo:** The feature for overlaying text logos onto the video was improved. * **Modified Video Effects API:** The API for applying video effects underwent modifications, potentially for improved usability or to accommodate new features. * **Bug Fixes:** Included various stability fixes. ### Version 7.0 * **Windows 8 RTM Support:** Ensured compatibility with the release version of Windows 8. * **Enhanced Video Effects:** Further improvements were made to the quality and performance of existing video effects. * **New FFMPEG Playback Engine:** Introduced a new playback engine based on FFMPEG components, offering an alternative to the default DirectShow-based playback and expanding format compatibility. ## Version 6.x Series: Windows 8 Compatibility and Optimizations The Version 6 series focused on adapting to the then-new Windows 8 operating system and improving performance. ### Version 6.3 * **Windows 8 Customer Preview Support:** Added compatibility for the pre-release Customer Preview version of Windows 8. * **Improved Video Effects:** Continued refinement of video effect performance and quality. ### Version 6.0 * **Enhanced OpenCL Support:** Improved utilization of OpenCL for GPU acceleration tasks, potentially boosting performance for effects or decoding on compatible hardware. * **Windows 8 Developer Preview Support:** Added early support for the Developer Preview version of Windows 8. * **Improved Video Effects:** General enhancements to the video effects subsystem. ## Version 3.x Series: Early Features and Optimizations The Version 3 series laid groundwork features and focused on CPU-specific optimizations. ### Version 3.9 * **New Installers:** Introduced a new main installer and separate redistributable installers for easier deployment. * **Minor Bug Fixes:** Addressed minor outstanding issues. ### Version 3.7 * **Improved Video Effects:** Enhancements made to the video effects features. 
* **New Demo Applications:** Added new demo applications to showcase library capabilities. * **Netbook CPU Optimizations:** Included specific performance optimizations tailored for Intel Core II/Atom and AMD netbook processors. * **Minor Bug Fixes:** General stability improvements. ### Version 3.5 * **Improved Video Effects:** Continued work on enhancing video effects. * **Intel Core i7 Optimizations:** Added new performance optimizations specifically for the then-new Intel Core i7 CPU architecture. ### Version 3.0 * **Motion Detection:** Introduced a motion detection feature, enabling applications to react to changes within the video stream. * **Chroma Key:** Added initial chroma key (green screen) functionality. * **MMS/WMV Source Support:** Included support for streaming using the MMS protocol and playing WMV (Windows Media Video) files. * **CPU Optimizations:** Added performance optimizations targeted at Intel Atom and Core i3/i5/i7 processors. * **Direct Stream Processing:** Enabled the capability to directly access and process decoded video and audio stream data, offering advanced manipulation possibilities. ---END OF PAGE--- # Local File: .\delphi\mediaplayer\deployment.md --- title: Media Player Library Deployment for Delphi & ActiveX description: Comprehensive guide for deploying media player components in Delphi and ActiveX applications. Learn both automated and manual installation methods, including codec setup, DirectShow filters, and environment configuration. sidebar_label: Deployment Guide --- # Deployment Guide for TVFMediaPlayer Deploying applications built with the TVFMediaPlayer library requires ensuring that all necessary components are correctly installed and configured on the target machine. This guide provides detailed instructions for both automated and manual deployment methods, catering to different scenarios and technical requirements. 
Whether you prefer the simplicity of silent installers or the granular control of manual setup, this document covers the essential steps to successfully deploy your Delphi or ActiveX media player application. ## Understanding Deployment Requirements Before deploying your application, it's crucial to understand the dependencies of the TVFMediaPlayer library. The library relies on several core components, including base runtimes, specific codecs (like FFMPEG or VLC for certain sources), and Microsoft Visual C++ Redistributables. The deployment method you choose will determine how these dependencies are handled. ### Core Components * **Base Library:** Contains the essential engine and DirectShow filters for basic playback functionality. * **Codec Packages:** Optional but often necessary for supporting a wide range of media formats and network streams (e.g., IP cameras). FFMPEG and VLC are common choices provided. * **Runtime Dependencies:** Microsoft Visual C++ Redistributable packages are required for the core library components to function correctly. Choosing the right deployment strategy depends on factors like user privileges on the target machine, the need for unattended installation, and the specific features of your application (e.g., which media sources it needs to support). ## Method 1: Automated Installation (Admin Rights Required) Using the provided silent installers is the most straightforward method for deploying the TVFMediaPlayer library components. These installers handle the registration of necessary files and ensure all dependencies are correctly placed. This method requires administrative privileges on the target machine as it involves system-level changes like registering COM components and potentially modifying the system PATH. ### Available Installers VisioForge provides separate installers for the base library and optional codec packages, with versions for both Delphi and ActiveX, and for x86 and x64 architectures. 
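For unattended rollouts, the individual installers can be chained from a single deployment script run with administrator rights. The sketch below is only illustrative: it uses the installer filenames listed in this guide and assumes the `/S` silent switch (verify the exact switch for your installer build, as noted under Installer Usage).

```bash
REM Illustrative silent deployment (Windows, run as Administrator).
REM Installer filenames are taken from this guide; /S is an assumed silent switch.

REM 1. Mandatory base package (Delphi x86 shown here)
redist_media_player_base_delphi.exe /S

REM 2. Optional FFMPEG package for file/IP camera playback
redist_media_player_ffmpeg.exe /S
```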
#### Base Package (Mandatory) This package installs the core TVFMediaPlayer components and essential DirectShow filters. It's always required, regardless of the media sources your application uses. Choose the installer corresponding to your development environment (Delphi or ActiveX) and target architecture (x86 or x64). * **Delphi:** * [x86 Installer](http://files.visioforge.com/redists_delphi/redist_media_player_base_delphi.exe) * [x64 Installer](http://files.visioforge.com/redists_delphi/redist_media_player_base_delphi_x64.exe) * **ActiveX:** * [x86 Installer](http://files.visioforge.com/redists_delphi/redist_media_player_base_ax.exe) * [x64 Installer](http://files.visioforge.com/redists_delphi/redist_media_player_base_ax_x64.exe) #### FFMPEG Package (Optional - For File/IP Camera Sources) If your application needs to play local files or stream from IP cameras using the FFMPEG engine, you must deploy this package. FFMPEG provides a wide range of codec support. * **FFMPEG:** * [x86 Installer](http://files.visioforge.com/redists_delphi/redist_media_player_ffmpeg.exe) * *Note: An x64 FFMPEG installer is not listed here; consult the VisioForge documentation if your application requires x64 FFMPEG support.* #### VLC Source Package (Optional - For File/IP Camera Sources) As an alternative or addition to FFMPEG, you can use the VLC engine for file and IP camera sources. This requires deploying the VLC package. Ensure you select the correct architecture. * **VLC:** * [x86 Installer](https://files.visioforge.com/redists_net/redist_dotnet_vlc_x86.exe) * [x64 Installer](https://files.visioforge.com/redists_net/redist_dotnet_vlc_x64.exe) ### Installer Usage These installers are designed for silent execution, making them suitable for inclusion in larger application setup routines or for deployment via scripts. Run the executable(s) with administrator privileges on the target machine.
```bash # Example: Running the base Delphi x86 installer silently redist_media_player_base_delphi.exe /S ``` *(Note: The exact silent switch might vary; consult the installer documentation or use standard switches like `/S`, `/silent`, or `/q` if `/S` doesn't work).* ## Method 2: Manual Installation (Admin Rights Recommended) Manual installation offers more control but requires careful execution of each step. This method is suitable when automated installers cannot be used, or when deploying to environments with specific restrictions. While some steps might be achievable without full admin rights, registering COM components typically requires elevation. ### Prerequisites Before copying library files, ensure the necessary runtime dependencies are present on the target system. #### Install VC++ 2010 SP1 Redistributable The TVFMediaPlayer library relies on the Microsoft Visual C++ 2010 SP1 runtime. Install the appropriate version (x86 or x64) for your application's target architecture. * **VC++ 2010 SP1:** * [x86 Redistributable](http://files.visioforge.com/shared/vcredist_2010_x86.exe) * [x64 Redistributable](http://files.visioforge.com/shared/vcredist_2010_x64.exe) Run these installers before proceeding with the library file deployment. ### Deploying Core Library Files Follow these steps to manually install the base library components: 1. **Copy Core DLLs:** Locate the `Redist\Filters` folder within your TVFMediaPlayer installation directory. Copy all the DLL files from this folder to a deployment directory on the target machine. A common practice is to place these DLLs in the same folder as your application's executable. 2. **Register DirectShow Filters:** The core functionality relies on several DirectShow filters (`.ax` files). These must be registered with the Windows operating system using Component Object Model (COM) registration. 
* **Identify Filters:** The key filters to register are: * `VisioForge_Audio_Effects_4.ax` * `VisioForge_Dump.ax` * `VisioForge_RGB2YUV.ax` * `VisioForge_Video_Effects_Pro.ax` * `VisioForge_YUV2RGB.ax` * *(Note: Other `.ax` files might be present; register all `.ax` files found in the `Redist\Filters` directory).* * **Registration Method:** Use the `regsvr32.exe` command-line tool, which is part of Windows. Open a Command Prompt **as Administrator** and run the command for each `.ax` file. ```bash # Example: Registering a filter (run from the directory containing the .ax file) regsvr32.exe VisioForge_Video_Effects_Pro.ax ``` Alternatively, VisioForge provides a utility `reg_special.exe` in the redistributables. Copy this utility to the folder containing the `.ax` files and run it with administrator privileges to register all filters in that directory automatically. Refer to Microsoft's documentation for troubleshooting `regsvr32.exe` errors: [How to use the Regsvr32 tool](https://support.microsoft.com/en-us/help/249873/how-to-use-the-regsvr32-tool-and-troubleshoot-regsvr32-error-messages). 3. **Update System PATH (Optional but Recommended):** If the filter DLLs and `.ax` files are placed in a directory separate from your application's executable, you must add the path to this directory to the system's `PATH` environment variable. This allows the operating system and your application to locate these essential files. Failure to do this can result in "DLL not found" or filter registration errors. ### Deploying Optional Packages Manually #### FFMPEG Deployment 1. **Copy Files:** Copy the entire contents of the `Redist\FFMPEG` folder from your TVFMediaPlayer installation to a deployment directory on the target machine (e.g., a subfolder within your application's installation directory). 2. **Update System PATH:** Add the full path to the folder where you copied the FFMPEG files to the Windows system `PATH` environment variable.
This is crucial for the library to find and load the FFMPEG components. #### VLC Deployment (Example: x86) 1. **Copy Files:** Copy the entire contents of the `Redist\VLC` folder (specifically the x86 version if applicable) to a deployment directory. 2. **Register VLC Filter:** Locate the VLC source filter (`.ax` file) within the copied VLC files and register it using `regsvr32.exe` with administrator privileges. 3. **Set Environment Variable:** Create a new system environment variable named `VLC_PLUGIN_PATH`. Set its value to the full path of the `plugins` subfolder within the directory where you copied the VLC files (e.g., `C:\YourApp\VLC\plugins`). This tells the VLC engine where to find its necessary plugin modules. ## Verification and Troubleshooting After deployment, thoroughly test your application on the target machine. * Check basic playback functionality. * Test any specific features that rely on optional packages (FFMPEG or VLC), such as playing various file formats or connecting to IP cameras. * If errors occur, double-check: * Admin rights during installation/registration. * Correct installation of VC++ Redistributables. * Successful registration of all `.ax` files (check `regsvr32.exe` output). * Accurate configuration of `PATH` and `VLC_PLUGIN_PATH` environment variables. * Correct architecture (x86/x64) match between your application, the library components, and runtime dependencies. --- Need further assistance? Contact [VisioForge Support](https://support.visioforge.com/). Explore more examples on our [GitHub](https://github.com/visioforge/). ---END OF PAGE--- # Local File: .\delphi\mediaplayer\file-multiple-video-streams.md --- title: Play Multiple Video Streams from Single File description: Learn how to handle and play video files containing multiple video streams, including different camera angles and resolutions.
Includes code examples for Delphi, C++, and VB6 with detailed implementation steps and best practices. sidebar_label: How do I play a video file with multiple video streams? --- # Playing Video Files with Multiple Video Streams ## Understanding Multiple Video Streams ### What Are Multiple Video Streams? Multiple video streams refer to different video tracks contained within a single media file. These streams can vary in several ways: - Different camera angles of the same scene - Alternate versions with varying resolutions or bitrates - Primary and secondary content (such as picture-in-picture) - Different aspect ratios or formats of the same content - Versions with or without special effects or graphics ### Supported File Formats Many popular container formats support multiple video streams, including: - **Matroska (MKV)**: Widely recognized for its flexibility and robust support for multiple streams - **MP4/MPEG-4**: Common in both professional and consumer applications - **AVI**: Although older, still widely used in some contexts - **WebM**: Popular for web-based applications - **TS/MTS**: Used in broadcast applications and consumer video cameras Each format has its own characteristics and limitations regarding how it handles multiple video streams, but the `TVFMediaPlayer` component provides a unified approach to working with them. ## Implementing Multiple Video Stream Playback ### Setting Up the Media Player The first step is to properly initialize the `TVFMediaPlayer` object. 
This involves creating the instance, configuring basic properties, and preparing it for playback: ```pascal // Define and create the MediaPlayer object var MediaPlayer1: TVFMediaPlayer; begin MediaPlayer1 := TVFMediaPlayer.Create(Self); // Set container size and position if needed MediaPlayer1.Parent := Panel1; // Assuming Panel1 is your container MediaPlayer1.Align := alClient; // Configure initial state MediaPlayer1.DoubleBuffered := True; MediaPlayer1.AutoPlay := False; // We'll control playback explicitly ``` ### Configuring the Media Source Next, we need to specify the media file and configure how it should be loaded: ```pascal // Set the file name - use full path for reliability MediaPlayer1.FilenameOrURL := 'C:\Videos\multistream-video.mkv'; // Enable audio playback (default DirectSound audio renderer will be used) MediaPlayer1.Audio_Play := True; // Configure audio settings if needed MediaPlayer1.Audio_Volume := 85; // Set volume to 85% // Set the source mode to DirectShow // Other options include SM_File_FFMPEG or SM_File_VLC MediaPlayer1.Source_Mode := SM_File_DS; ``` ### Selecting and Switching Video Streams The key to working with multiple video streams is the `Source_VideoStreamIndex` property. 
This zero-based index allows you to select which video stream should be rendered: ```pascal // Set video stream index to 1 (second stream, as index is zero-based) MediaPlayer1.Source_VideoStreamIndex := 1; // Start playback MediaPlayer1.Play(); ``` ## C++ MFC Implementation ### Setting Up the Media Player Here's how to implement multiple video stream playback using C++ with MFC: ```cpp // In your header file (MyDlg.h) private: CVFMediaPlayer* m_pMediaPlayer; // In your implementation file (MyDlg.cpp) BOOL CMyDlg::OnInitDialog() { CDialog::OnInitDialog(); // Create the MediaPlayer instance m_pMediaPlayer = new CVFMediaPlayer(); // Initialize the control CWnd* pContainer = GetDlgItem(IDC_PLAYER_CONTAINER); // Your container control m_pMediaPlayer->Create(NULL, NULL, WS_CHILD | WS_VISIBLE, CRect(0, 0, 0, 0), pContainer, 1001); // Configure display settings m_pMediaPlayer->SetWindowPos(NULL, 0, 0, pContainer->GetClientRect().Width(), pContainer->GetClientRect().Height(), SWP_NOZORDER); m_pMediaPlayer->PutDoubleBuffered(TRUE); m_pMediaPlayer->PutAutoPlay(FALSE); return TRUE; } ``` ### Configuring the Media Source ```cpp void CMyDlg::PlayMultiStreamVideo() { // Set the file path and configure source m_pMediaPlayer->PutFilenameOrURL(_T("C:\\Videos\\multistream-video.mkv")); // Configure audio m_pMediaPlayer->PutAudio_Play(TRUE); m_pMediaPlayer->PutAudio_Volume(85); // Set source mode to DirectShow m_pMediaPlayer->PutSource_Mode(SM_File_DS); // Select the second video stream (index 1) m_pMediaPlayer->PutSource_VideoStreamIndex(1); // Start playback m_pMediaPlayer->Play(); } // Don't forget to clean up void CMyDlg::OnDestroy() { if (m_pMediaPlayer != NULL) { m_pMediaPlayer->DestroyWindow(); delete m_pMediaPlayer; m_pMediaPlayer = NULL; } CDialog::OnDestroy(); } ``` ## VB6 Implementation Here's how to implement multiple video stream playback in Visual Basic 6: ```vb ' Declare the MediaPlayer object at form level Private WithEvents MediaPlayer1 As TVFMediaPlayer Private Sub 
Form_Load() ' Create the MediaPlayer instance Set MediaPlayer1 = New TVFMediaPlayer ' Set container properties MediaPlayer1.CreateControl MediaPlayer1.Parent = Frame1 ' Assuming Frame1 is your container MediaPlayer1.Left = 0 MediaPlayer1.Top = 0 MediaPlayer1.Width = Frame1.ScaleWidth MediaPlayer1.Height = Frame1.ScaleHeight ' Configure initial state MediaPlayer1.DoubleBuffered = True MediaPlayer1.AutoPlay = False End Sub Private Sub btnPlay_Click() ' Set the file name - use full path for reliability MediaPlayer1.FilenameOrURL = "C:\Videos\multistream-video.mkv" ' Enable audio playback MediaPlayer1.Audio_Play = True MediaPlayer1.Audio_Volume = 85 ' Set volume to 85% ' Set the source mode to DirectShow MediaPlayer1.Source_Mode = SM_File_DS ' Select the second video stream (index 1) MediaPlayer1.Source_VideoStreamIndex = 1 ' Start playback MediaPlayer1.Play End Sub Private Sub Form_Unload(Cancel As Integer) ' Clean up resources Set MediaPlayer1 = Nothing End Sub ``` ## Conclusion The ability to play video files with multiple streams opens up numerous possibilities for creating rich, interactive multimedia experiences. The `TVFMediaPlayer` component provides a straightforward approach to implementing this functionality, with flexible options to suit different application requirements. By following the techniques outlined in this guide, you can effectively incorporate multiple video stream support into your applications, enhancing user experience and expanding the capabilities of your multimedia projects. --- Please get in touch with [support](https://support.visioforge.com/) if you need assistance with this functionality. Visit our [GitHub](https://github.com/visioforge/) page for additional code samples and implementation examples. ---END OF PAGE--- # Local File: .\delphi\mediaplayer\index.md --- title: Media Player SDK for Delphi and ActiveX Development description: Comprehensive media playback SDK for Delphi and ActiveX applications. 
Features rich format support, advanced playback controls, video processing, network streaming, and seamless integration capabilities for Windows development. sidebar_label: TVFMediaPlayer --- # TVFMediaPlayer: Feature-Rich Media Playback for Delphi & ActiveX ## Introduction to TVFMediaPlayer The VisioForge TVFMediaPlayer library stands as a powerful and versatile solution designed for developers working with Delphi (VCL) and ActiveX-compatible environments (like .NET WinForms/WPF, VB6). It provides a robust framework for integrating sophisticated multimedia playback capabilities directly into custom applications. Whether you're building a simple video viewer, a complex media center application, a surveillance system interface, or interactive training software, TVFMediaPlayer offers the tools needed to handle a diverse range of audio and video requirements. At its core, the library abstracts the complexities of various media codecs and streaming protocols, presenting a unified and relatively straightforward API. This allows developers to focus on application logic rather than low-level multimedia handling. The library emphasizes performance, stability, and extensive format support, making it a reliable choice for demanding playback scenarios. ## Core Features and Capabilities TVFMediaPlayer is packed with features designed to address common and advanced media playback needs. ### Extensive Format and Codec Support One of the library's most significant strengths is its ability to play back a vast array of media formats. This is achieved through flexible backend support: * **System Codecs:** Leverages codecs already installed on the Windows operating system (DirectShow/Media Foundation). Ideal for common formats like AVI, WMV, and MP3 when appropriate decoders are present. * **FFmpeg:** Integrates the renowned FFmpeg libraries, providing built-in support for a huge number of video and audio codecs and container formats without requiring external installations. 
This ensures broad compatibility out-of-the-box. * **VLC Engine (libVLC):** Option to utilize the VLC engine, known for its excellent handling of various stream types and potentially problematic files. This multi-pronged approach ensures that your application can handle almost any media file or stream thrown at it, minimizing compatibility issues for end-users. ### Advanced Playback Control Beyond basic Play, Pause, Stop, and Seek operations, TVFMediaPlayer offers fine-grained control: * **Variable Playback Rate:** Adjust playback speed (faster or slower) while optionally maintaining audio pitch. * **Frame-Stepping:** Navigate video content frame by frame, essential for analysis or precise editing tasks. * **Volume and Audio Control:** Adjust volume, mute audio, and potentially select specific audio tracks if multiple are available. * **Seamless Looping:** Configure specific segments or the entire media file to loop continuously. ### Video Processing and Enhancement Enhance the visual experience and extract information from video streams: * **Overlays:** Easily add text, images (with transparency), or even graphical elements on top of the video playback. Useful for watermarking, displaying information, or custom controls. * **Video Effects:** Apply real-time video effects such as brightness, contrast, saturation, hue adjustments, grayscale, inversion, and potentially more complex filters. * **Frame Capture:** Capture snapshots of the currently playing video frame and save them to various image formats (BMP, JPG, PNG). This is useful for thumbnail generation, analysis, or documentation. * **Zoom and Pan:** Allow users to digitally zoom into specific areas of the video and pan the view. ### Audio Processing and Enhancements Refine the audio output: * **Audio Equalizer:** Provide users with a multi-band equalizer to tailor the audio output to their preferences or environment. 
* **Audio Enhancements:** Features like volume boosting beyond standard levels might be available. * **Track Selection:** Explicitly select from multiple available audio tracks within a media file. ### Network Stream Playback Effortlessly play streams from network sources: * **Supported Protocols:** Handles common streaming protocols like HTTP, HTTPS, HLS (HTTP Live Streaming), RTSP, RTMP, and MMS. * **Buffering Control:** Manage buffering settings to balance startup latency and playback smoothness, crucial for varying network conditions. ### Specialized Playback Features * **Multi-Stream Files:** Uniquely handles video files containing multiple video streams (e.g., different camera angles), allowing seamless switching between them during playback. * **DVD and Blu-ray:** Supports playback from DVD and Blu-ray discs, including menu navigation and chapter selection (requires appropriate system support and potentially decryption libraries for commercial discs). * **Subtitle Integration:** Load and display subtitles from external files (like SRT, ASS, SSA, VobSub) or embedded subtitle tracks. Customize font, size, color, and position. ## Integration and Development TVFMediaPlayer is designed for ease of integration into Delphi (VCL) and ActiveX host applications. ### Delphi Integration (VCL) For Delphi developers, the library typically provides native VCL components. These components can be dropped onto a form in the IDE, and their properties and events can be configured visually and programmatically. This component-based approach significantly speeds up development compared to using raw APIs. ### ActiveX Integration The ActiveX control allows the media player to be embedded in any environment supporting ActiveX technology. This includes older platforms like Visual Basic 6, as well as .NET applications (Windows Forms, WPF) and even some web pages (though ActiveX in browsers is largely deprecated for security reasons). 
The ActiveX control exposes properties, methods, and events similar to the native Delphi components. ## Licensing Model VisioForge typically offers flexible licensing: * **Trial Version:** A fully functional trial version is usually available, allowing developers to evaluate the library thoroughly. Trial versions often overlay a watermark or display a nag screen. * **Full License:** Purchasing a full license removes trial limitations. Full licenses offer free updates and priority support for one year. This ensures that developers have ongoing access to improvements and technical assistance. It's crucial to consult the official VisioForge website or licensing documentation for precise terms and conditions. ## Resources and Further Information To delve deeper into the capabilities and usage of the TVFMediaPlayer library, explore the following official resources: * **Product Page:** [VisioForge Media Player SDK](https://www.visioforge.com/all-in-one-media-framework) * **API Documentation:** [Delphi Media Player API Reference](https://api.visioforge.com/delphi/media_player_sdk/index.html) * **Changelog:** [View recent updates and fixes](changelog.md) * **Installation Guide:** [Steps for setting up the library](install/index.md) * **Deployment:** [Information on distributing your application](deployment.md) * **License Agreement:** [End User License Agreement](../../eula.md) ## Tutorials and Code Samples Practical examples demonstrate how to implement specific features: * [How to play a video file with several video streams?](file-multiple-video-streams.md) * *(More tutorials can be added here as they become available)* By leveraging the extensive features and flexible integration options of TVFMediaPlayer, developers can create compelling multimedia applications with rich playback experiences across various Windows platforms. 
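As a concrete starting point, basic playback in Delphi takes only a few lines. The following is a minimal sketch: `VFMediaPlayer1` is an assumed `TVFMediaPlayer` instance placed on a form, and the member names (`Filename`, `Play`) follow the code samples used elsewhere in this documentation; consult the API reference for the exact surface of your SDK version:

```delphi
procedure TMainForm.btnPlayClick(Sender: TObject);
begin
  // Point the player at a local media file (network URLs are also supported)
  VFMediaPlayer1.Filename := 'C:\Videos\sample.mp4';

  // Start playback; pause, stop, and seek operations follow the same call style
  VFMediaPlayer1.Play();
end;
```

The overlays, effects, and stream-selection features described above are then configured through additional properties and events on the same component.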
---END OF PAGE--- # Local File: .\delphi\mediaplayer\install\builder.md --- title: TVFMediaPlayer Installation in C++ Builder description: A detailed walkthrough on installing the TVFMediaPlayer component within Embarcadero C++ Builder (versions 5, 6, 2006, and later) sidebar_label: C++ Builder --- # Installing TVFMediaPlayer in C++ Builder Welcome to the detailed guide for integrating the powerful TVFMediaPlayer library into your Embarcadero C++ Builder development environment. This document covers the installation process for legacy versions like C++ Builder 5 and 6, as well as modern versions from 2006 onwards. We will explore the necessary prerequisites, step-by-step installation procedures for different IDE versions, considerations for 32-bit (x86) and 64-bit (x64) architectures, and common troubleshooting steps. ## Introduction to TVFMediaPlayer and VisioForge Media Framework TVFMediaPlayer is a versatile multimedia component developed by VisioForge. It's part of the larger VisioForge Media Framework, designed to provide developers with a robust set of tools for handling audio and video playback, capture, processing, and streaming within their applications. TVFMediaPlayer specifically focuses on playback capabilities, supporting a wide array of formats and offering extensive control over media rendering. The component is delivered as an ActiveX control, making it easily integrable into environments that support COM technology, such as C++ Builder. Utilizing ActiveX allows for visual design-time integration and straightforward programmatic access to the player's features. ## Prerequisites Before proceeding with the installation, ensure your development environment meets the following requirements: 1. **Supported C++ Builder Version:** You need a working installation of Embarcadero C++ Builder. 
This guide covers: * C++ Builder 5 * C++ Builder 6 * C++ Builder 2006 * C++ Builder 2007, 2009, 2010, XE series (XE to XE8), 10.x series (Seattle, Berlin, Tokyo, Rio, Sydney), 11.x (Alexandria), and later versions. While the core process remains similar for newer versions, minor UI variations might exist. 2. **Operating System:** A compatible Windows operating system (Windows 7 or later, including Windows 8, 10, 11, and corresponding Server versions). Ensure your OS matches the target architecture (32-bit or 64-bit) of your C++ Builder projects. 3. **Administrative Privileges:** The installation of the VisioForge Media Framework and the registration of ActiveX controls typically require administrative privileges on your machine. Ensure you are running the installer and C++ Builder with sufficient permissions, especially if User Account Control (UAC) is enabled. 4. **Dependencies:** The VisioForge installer usually bundles necessary runtime dependencies (like specific DirectX or Media Foundation components). However, keeping your Windows system updated is generally recommended. ## Step 1: Download the All-in-One Media Framework The TVFMediaPlayer component is distributed as part of the VisioForge All-in-One Media Framework SDK. You must download the correct version: * **Target:** Download the **ActiveX** version of the SDK. Do not download the .NET or VCL versions, as they are intended for different development environments. * **Source:** Obtain the installer directly from the official VisioForge website. Navigate to the [product page](https://www.visioforge.com/all-in-one-media-framework) and locate the download link for the ActiveX SDK. Ensure you are downloading the latest stable release unless you have specific requirements for an older version. ## Step 2: Install the VisioForge Media Framework Once the download is complete, proceed with the installation: 1. **Locate the Installer:** Find the downloaded executable file. 2. 
**Run as Administrator:** Right-click the installer file and select "Run as administrator". This is crucial for ensuring the ActiveX controls are correctly registered in the Windows Registry. 3. **Follow the Wizard:** The installation wizard will guide you through the process. * Accept the license agreement. * Choose the installation directory (the default location is usually suitable). * Select the components to install. Ensure that the core framework and the MediaPlayer components are selected. Typically, the default selection is sufficient. * The installer will copy the necessary files (DLLs, AX files, etc.) and register the ActiveX controls on your system. 4. **Completion:** Once the installation finishes, click "Finish". The TVFMediaPlayer ActiveX control is now available on your system, ready to be imported into the C++ Builder IDE. ## Step 3: Import the TVFMediaPlayer ActiveX Control into C++ Builder The method for importing the ActiveX control differs slightly between older and newer versions of C++ Builder. ### A. For C++ Builder 5 and 6 These classic versions have a straightforward import mechanism: 1. **Launch C++ Builder:** Open your C++ Builder 5 or 6 IDE. 2. **Open or Create a Project:** You can import the control into an existing project or a new one. The import process adds the component to the IDE's palette, making it available for all projects. 3. **Import ActiveX Control:** Navigate to the main menu and select `Component` → `Import ActiveX Controls...`. ![C++ Builder 5/6 - Component Menu](mpbcb5_1.webp) 4. **Select the Control:** A dialog box will appear listing all registered ActiveX controls on your system. Scroll through the list and find `VisioForge Media Player` (it might also be listed as `VFMediaPlayer Class` or similar, depending on registry details). Check the box next to it. ![C++ Builder 5/6 - Select Control](mpbcb5_2.webp) 5. **Install:** Click the `Install...` button. 6. 
**Package Creation/Selection:** C++ Builder will prompt you to install the component into a package. You can choose an existing package (like `dclusr.dpk`) or create a new one. For simplicity, adding it to the default user package is often sufficient. Click `OK`. 7. **Confirmation:** A confirmation dialog will ask if you want to rebuild the package. Click `Yes`. ![C++ Builder 5/6 - Rebuild Package Confirmation](mpbcb5_3.webp) 8. **Compilation and Installation:** C++ Builder will compile the package containing the wrapper code for the ActiveX control. Upon successful compilation, a message will confirm the installation. Click `OK`. ![C++ Builder 5/6 - Installation Successful](mpbcb5_4.webp) 9. **Component Palette:** The TVFMediaPlayer component should now appear on the C++ Builder Component Palette, likely under a tab named `ActiveX` or `VisioForge`. You can now drag and drop it onto your forms like any other standard VCL component. ### B. For C++ Builder 2006 and Later (including XE, 10.x, 11.x) Modern C++ Builder versions use a more structured component import process, typically involving creating or using a dedicated design-time package: 1. **Launch C++ Builder:** Open your C++ Builder IDE (2006 or newer). 2. **Create a New Package:** It's generally best practice to install third-party components into their own package. * Go to `File` → `New` → `Other...`. * In the `New Items` dialog, navigate to `C++Builder Projects` (or similar category) and select `Package`. Click `OK`. ![C++ Builder 2006+ - New Package](mpbcb2006_1.webp) 3. **Import Component:** With the new package project active (e.g., `Package1.cbproj`), go to the main menu and select `Component` → `Import Component...`. ![C++ Builder 2006+ - Component Menu](mpbcb2006_2.webp) 4. **Select Import Type:** In the `Import Component` wizard, choose the `Import ActiveX Control` option and click `Next >`. ![C++ Builder 2006+ - Select Import Type](mpbcb2006_3.webp) 5. 
**Select the Control:** Similar to the older versions, find `VisioForge Media Player` in the list of registered controls, select it, and click `Next >`. ![C++ Builder 2006+ - Select Control](mpbcb2006_4.webp)
6. **Component Details:** The wizard will display details about the control. You can typically accept the defaults for `Palette Page` (e.g., `ActiveX`), `Unit Dir Name`, and `Search Path`. Click `Next >`. *Note: Some developers prefer to create a dedicated "VisioForge" palette page.*
7. **Package Selection:** Choose the action `Add unit to <PackageName>.cbproj` (where `<PackageName>` is the name of the package you created in step 2). Click `Finish`. ![C++ Builder 2006+ - Choose Package Action](mpbcb2006_6.webp)
8. **Save the Package:** C++ Builder will generate the necessary wrapper unit (e.g., `VFMediaPlayerLib_TLB.cpp` / `.h`). Save the package project (`.cbproj`) and the associated files when prompted. Choose a meaningful name and location for your package (e.g., `VisioForgeMediaPlayerPkg`). ![C++ Builder 2006+ - Save Package](mpbcb2006_7.webp)
9. **Compile and Install the Package:**
   * In the `Project Manager` pane, right-click on the package project's `.bpl` file (e.g., `VisioForgeMediaPlayerPkg.bpl`).
   * Select `Compile` to ensure the wrapper code builds correctly.
   * After a successful compilation, right-click the `.bpl` file again and select `Install`.
10. **Confirmation:** The IDE will install the package, making the TVFMediaPlayer component available on the specified Component Palette page (e.g., `ActiveX`).

## Step 4: Using the TVFMediaPlayer Component

After successful installation, you can use the component in your C++ Builder applications:

1. **Design-Time:** Open a form in the Form Designer.
Locate the `TVFMediaPlayer` component on the Component Palette (usually on the `ActiveX` or `VisioForge` tab). Drag and drop it onto your form. You can resize and position it as needed. Use the Object Inspector to configure its basic properties.
2. **Run-Time:** Access the component's methods and properties programmatically in your C++ code. For example, to load and play a file:

```cpp
// Assuming MediaPlayer1 is the name of the TVFMediaPlayer component on your form
MediaPlayer1->Filename = "C:\\path\\to\\your\\video.mp4";
MediaPlayer1->Play();
```

3. **Event Handling:** Use the Object Inspector's `Events` tab to assign handlers to various player events (e.g., `OnPlay`, `OnStop`, `OnError`).

## Architecture Considerations (x86 vs. x64)

The VisioForge Media Framework provides both 32-bit (x86) and 64-bit (x64) versions of its libraries and ActiveX controls. It's crucial to match the component architecture with your C++ Builder project's target platform:

* **32-bit Projects (Win32 Target Platform):** Use the x86 version of the TVFMediaPlayer ActiveX control. The standard installation typically registers the x86 version correctly. When importing/installing the component package (especially in modern IDEs), ensure you are building and installing the package for the Win32 platform.
* **64-bit Projects (Win64 Target Platform):** Use the x64 version of the TVFMediaPlayer ActiveX control. The VisioForge installer should register both versions.
* **IDE Design-Time:** Importantly, the C++ Builder IDE itself is often a 32-bit application (even in recent versions). This means that for visual form design, the IDE needs to load the **x86** version of the ActiveX control.
* **Compilation/Runtime:** When you compile your project for the Win64 target platform, the application will require the **x64** version of the control at runtime.
* **Package Management:** In modern C++ Builder versions, you might need to: 1.
Create and install a design-time package targeting Win32 (using the x86 control) for use in the IDE. 2. Ensure the corresponding runtime package (or necessary library files) for Win64 are correctly configured in your project's build settings and deployed with your 64-bit application. Consult the VisioForge documentation and C++ Builder's platform management features for specifics. Some developers manage separate packages for Win32 and Win64 targets. **Recommendation:** While legacy C++ Builder versions are covered, VisioForge strongly recommends using modern versions of C++ Builder (XE series or later). These versions offer better support for 64-bit development, improved IDE features, and compatibility with current Windows operating systems and VisioForge SDK updates. Support for C++ Builder 5/6 might be limited. ## Troubleshooting Common Issues * **Control Not Found in Import List:** Ensure the VisioForge Media Framework (ActiveX version) was installed correctly with administrative privileges. Try reinstalling the framework. Manually registering the `.ocx` or `.ax` file using `regsvr32` (run from an Administrator command prompt) might be necessary in rare cases (e.g., `regsvr32 "C:\Program Files (x86)\VisioForge\Media Framework\VFMediaPlayer.ax"` - adjust path as needed). * **Package Installation Fails:** Check the build output for errors. Ensure the package project settings (paths, target platform) are correct. Verify you have write permissions to the C++ Builder library/package directories. * **Component Works in IDE but Fails at Runtime (or vice-versa):** This often points to an architecture mismatch (x86 vs. x64). Review the "Architecture Considerations" section carefully. Ensure the correct version (32-bit or 64-bit) of the VisioForge runtime files is accessible to your compiled application. Deploy the required VisioForge redistributables with your application if necessary. 
* **Errors During Playback (`CreateObject` fails, etc.):** Double-check that the `Filename` property points to a valid, accessible media file. Ensure the necessary codecs for the media format are installed on the system (though VisioForge often includes internal decoders or utilizes Media Foundation/DirectShow). Check the VisioForge `OnError` event for specific error codes or messages. ## Conclusion Integrating TVFMediaPlayer into C++ Builder provides a powerful solution for adding media playback to your applications. By following the appropriate steps for your IDE version, carefully managing x86/x64 architectures, and understanding the package system, you can successfully incorporate this component. Remember to consult the official VisioForge documentation and examples for more advanced usage and API details. --- For further assistance or specific issues not covered here, please contact VisioForge [support](https://support.visioforge.com/). Explore more advanced examples and source code on the VisioForge [GitHub](https://github.com/visioforge/) repository. ---END OF PAGE--- # Local File: .\delphi\mediaplayer\install\delphi.md --- title: TVFMediaPlayer Installation in Delphi description: A detailed walkthrough on installing the TVFMediaPlayer library in various Delphi versions (6, 7, 2005, and later), covering prerequisites, configuration, verification, and troubleshooting. sidebar_label: Delphi Installation --- # Installing TVFMediaPlayer in Delphi Welcome to the detailed guide for installing the VisioForge Media Player SDK, specifically the `TVFMediaPlayer` component, into your Delphi development environment. This guide covers installations for classic Delphi versions like Delphi 6 and 7, as well as modern versions from Delphi 2005 onwards, including the latest releases supporting 64-bit development. 
## Understanding TVFMediaPlayer `TVFMediaPlayer` is a powerful VCL component from VisioForge designed for seamless integration of video and audio playback capabilities into Delphi applications. It simplifies tasks such as playing various media formats, capturing snapshots, controlling playback speed, managing audio streams, and much more. Built upon a robust media engine, it offers high performance and extensive format support, making it a versatile choice for multimedia application development in Delphi. This guide assumes you have a working installation of Embarcadero Delphi or a compatible older version (Borland Delphi). ## Step 1: Prerequisites and Downloading the Framework Before proceeding with the installation, ensure your development environment meets the necessary prerequisites. Primarily, you need a licensed or trial version of Delphi installed on your Windows machine. The `TVFMediaPlayer` component is distributed as part of the VisioForge All-in-One Media Framework. This framework bundles various VisioForge SDKs, providing a comprehensive toolkit for media handling. 1. **Navigate to the Product Page:** Open your web browser and go to the official VisioForge [All-in-One Media Framework product page](https://www.visioforge.com/all-in-one-media-framework). 2. **Select the Delphi Version:** Locate the download section specifically for Delphi. VisioForge typically offers versions tailored for different development platforms. 3. **Download:** Click the download link to obtain the installer executable (`.exe`) file. Save this file to a known location on your computer, such as your Downloads folder. The downloaded file contains not only the `TVFMediaPlayer` component but also other related libraries, source code (if applicable based on licensing), necessary runtime files, and documentation. ## Step 2: Running the Installer Once the download is complete, you need to run the installer to place the necessary SDK files onto your system. 1. 
**Locate the Installer:** Navigate to the folder where you saved the downloaded `.exe` file. 2. **Run as Administrator:** Right-click the installer file and select "Run as administrator". This is crucial because the installer needs to register components and potentially write to system directories, requiring elevated privileges. 3. **Follow On-Screen Instructions:** The installer wizard will guide you through the process. Typically, this involves: * Accepting the license agreement. * Choosing the installation directory (the default location is usually appropriate, e.g., within `C:\Program Files (x86)\VisioForge\` or similar). Note this path, as you'll need it later. * Selecting components to install (ensure the Media Player SDK is selected). * Confirming the installation. 4. **Complete Installation:** Allow the installer to finish copying files and performing necessary setup tasks. This process unpacks the SDK, including source files (`.pas`), pre-compiled units (`.dcu`), package files (`.dpk`, `.bpl`), and potentially required DLLs. ## Step 3: Integrating with the Delphi IDE After running the main installer, the next critical step is integrating the `TVFMediaPlayer` component into the Delphi IDE so you can use it visually in the form designer and reference its units in your code. The process differs slightly between older (Delphi 6/7) and newer (Delphi 2005+) versions. **Important:** For all Delphi versions, it's recommended to run the Delphi IDE itself **as administrator** during the package installation process. This helps avoid potential permission issues when compiling and registering the component package. ### Installation in Delphi 6 / Delphi 7 These older versions require manual configuration of paths and package installation. 1. **Launch Delphi (as Administrator):** Start your Delphi 6 or Delphi 7 IDE with administrative privileges. 2. **Open IDE Options:** Go to the `Tools` menu and select `Environment Options`. 3. 
**Configure Library Path:** * Navigate to the `Library` tab. * In the `Library path` field, click the ellipsis (`...`) button. * Click the `Add` or `New` button (icon might vary) and browse to the `Source` directory within the VisioForge installation path you noted earlier (e.g., `C:\Program Files (x86)\VisioForge\Media Player SDK\Source`). Add this path. This tells Delphi where to find the `.pas` source files if needed during compilation or debugging. * Click `OK` to close the path editor. 4. **Configure Browsing Path:** * While still in the `Library` tab, locate the `Browsing path` field (it might be combined or separate depending on the exact Delphi version/update). * Add the same `Source` directory path here as well. This helps the IDE locate files for features like code completion and navigation. * Click `OK` to save the Environment Options. 5. **Open the Package File:** * Go to the `File` menu and select `Open...`. * Navigate to the `Packages\Delphi7` (or `Delphi6`) subfolder within the VisioForge installation directory (e.g., `C:\Program Files (x86)\VisioForge\Media Player SDK\Packages\Delphi7`). * Locate the runtime package file, often named something like `VFMediaPlayerD7_R.dpk` (the 'R' usually denotes runtime). Open it. * Repeat the process to open the design-time package, often named `VFMediaPlayerD7_D.dpk` (the 'D' denotes design-time). 6. **Compile the Runtime Package:** * Ensure the runtime package (`*_R.dpk`) is the active project in the Project Manager. * Click the `Compile` button in the Project Manager window (or use the corresponding menu option, e.g., `Project -> Compile`). Resolve any compilation errors if they occur (though typically unnecessary with official packages). 7. **Compile and Install the Design-Time Package:** * Make the design-time package (`*_D.dpk`) the active project. * Click the `Compile` button. * Once compiled successfully, click the `Install` button in the Project Manager. 8. 
**Confirmation:** You should see a confirmation message indicating that the package(s) were installed. The `TVFMediaPlayer` component (and potentially others from the SDK) should now appear on the Delphi component palette, likely under a "VisioForge" or similar category tab. *Note on Architecture:* Delphi 6/7 are strictly 32-bit (x86) environments. Therefore, you will only be installing and using the 32-bit version of the `TVFMediaPlayer` component. The SDK might contain 64-bit files, but they are not applicable here. ### Installation in Delphi 2005 and Later (XE, 10.x, 11.x, 12.x) Modern Delphi versions offer a more streamlined process and robust support for multiple platforms (Win32, Win64). 1. **Launch Delphi (as Administrator):** Start your Delphi IDE (e.g., Delphi 11 Alexandria, Delphi 12 Athens) with administrative privileges. 2. **Open IDE Options:** Go to `Tools -> Options`. 3. **Configure Library Path:** * In the Options dialog, navigate to `Language -> Delphi -> Library` (the exact path might slightly vary between versions). * Select the target platform for which you want to configure the path (e.g., `Windows 32-bit`, `Windows 64-bit`). It's recommended to configure both if you plan to build for both architectures. * Click the ellipsis (`...`) button next to the `Library path` field. * Add the path to the appropriate `Source` directory within the VisioForge installation (e.g., `C:\Program Files (x86)\VisioForge\Media Player SDK\Source`). * Click `Add` and then `OK`. Repeat for the other platform if desired. 4. **Configure Browsing Path (Optional but Recommended):** * Under the same `Library` section, add the `Source` path to the `Browsing path` field as well. * Click `OK` to save the Options. 5. **Open the Package File:** * Go to `File -> Open Project...`. * Navigate to the `Packages` directory within the VisioForge installation. Find the subfolder corresponding to your Delphi version (e.g., `Delphi11`, `Delphi12`). 
* Open the appropriate design-time package file (e.g., `VFMediaPlayerD11_D.dpk`). Modern Delphi often manages runtime/design-time dependencies more automatically, so you might only need to explicitly open the design-time package. 6. **Compile and Install:** * In the Project Manager, right-click on the package project (`.dpk` file). * Select `Compile` from the context menu. * Once compiled successfully, right-click again and select `Install`. 7. **Confirmation:** Delphi will confirm the installation, and the components will appear on the palette. *Note on Architecture:* Modern Delphi supports both 32-bit (Win32) and 64-bit (Win64) targets. The VisioForge SDK typically provides pre-compiled units (`.dcu`) for both. When you compile and install the package, Delphi usually handles registering it for the currently active platform. You can switch platforms in the Project Manager and rebuild/reinstall if necessary, although often the IDE handles this association correctly after the initial install. ## Step 4: Project Configuration After installing the component package into the IDE, you need to ensure your individual *projects* can find the necessary VisioForge files at compile and runtime. 1. **Project Options:** Open your Delphi project (`.dpr` file). Go to `Project -> Options`. 2. **Library Path:** Navigate to `Delphi Compiler -> Search path` (or similar depending on version). 3. **Add SDK Path:** For each target platform (`Windows 32-bit`, `Windows 64-bit`) you intend to use: * Add the path to the VisioForge `Source` directory (e.g., `C:\Program Files (x86)\VisioForge\Media Player SDK\Source`). This ensures the compiler can find the `.pas` files or required `.dcu` files. Sometimes, pre-compiled `.dcu` files are provided in platform-specific subdirectories (e.g., `DCU\Win32`, `DCU\Win64`); if so, add those specific paths instead of or in addition to the main `Source` path. Check the VisioForge documentation or installation structure for specifics. 4. 
**Save Changes:** Click `OK` or `Save` to apply the project options.

Setting the project search path correctly is crucial. If the compiler complains about not finding units like `VisioForge_MediaPlayer_Engine` or similar, incorrect or missing search paths are the most common cause.

## Step 5: Verification

To confirm the installation was successful:

1. **Check Component Palette:** Look for the "VisioForge" tab (or similar) on the component palette in the Delphi IDE. You should see the `TVFMediaPlayer` icon.
2. **Create a Test Application:**
   * Create a new VCL Forms Application (`File -> New -> VCL Forms Application - Delphi`).
   * Drag and drop the `TVFMediaPlayer` component from the palette onto the main form.
   * If the component appears on the form without errors, the design-time installation is likely correct.
   * Add a simple button. In its `OnClick` event handler, add a basic line of code to interact with the player, for example:

     ```delphi
     procedure TForm1.Button1Click(Sender: TObject);
     begin
       // Ensure VFMediaPlayer1 is the name of your component instance
       VFMediaPlayer1.Filename := 'C:\path\to\your\test_video.mp4'; // Replace with an actual media file path
       VFMediaPlayer1.Play();
     end;
     ```

   * Compile the project (`Project -> Compile`). If it compiles without "File not found" errors related to VisioForge units, the path configuration is likely correct.
   * Run the application. If it runs and you can play the media file using the button, the runtime setup is working.

## Common Installation Problems and Troubleshooting

While the process is generally straightforward, occasional issues can arise:

* **IDE Permissions:** Forgetting to run the Delphi IDE as administrator during package installation can lead to errors writing to registry or system folders, preventing component registration. **Solution:** Close Delphi, restart it as administrator, and try the package installation steps again.
* **Path Configuration Errors:** Incorrect paths in either the IDE `Library Path` or the project's `Search Path` are common. **Solution:** Double-check that the paths point *exactly* to the VisioForge SDK's `Source` (or relevant `DCU`) directory. Ensure paths are correct for the specific target platform (Win32/Win64).
* **Package Compilation Errors:** Sometimes, conflicts with other installed packages or issues within the package source itself can cause compilation failures. **Solution:** Ensure you are using the correct package version for your specific Delphi version. Consult VisioForge support or forums if errors persist.
* **64-bit Specific Issues:** Installing packages for the 64-bit platform can sometimes present unique challenges, especially in older Delphi versions that first introduced Win64 support. Refer to the linked article [Delphi 64-bit package installation problem](../../general/install-64bit.md) for specific known issues and workarounds.
* **`.otares` File Issues:** Some Delphi versions utilize `.otares` files for resources. Problems during package installation related to these files can occur. See the linked article [Delphi package installation problem with .otares](../../general/install-otares.md).
* **Missing Runtime DLLs:** The `TVFMediaPlayer` component often relies on underlying DLLs (e.g., FFmpeg components) for its functionality. While the main installer usually handles these, ensure they are correctly placed either in your application's output directory, a directory in the system PATH, or the System32/SysWOW64 folders as appropriate. Deployment requires distributing these necessary DLLs with your application. Check the VisioForge documentation for a list of required runtime files.

## Further Steps and Resources

With `TVFMediaPlayer` successfully installed, you can now explore its extensive features.
* **Explore Properties and Events:** Use the Delphi Object Inspector to examine the numerous properties and events available for the `TVFMediaPlayer` component.
* **Consult Documentation:** Refer to the official VisioForge documentation installed with the SDK or available online for detailed API references and usage examples.
* **Code Samples:** Visit the VisioForge [GitHub repository](https://github.com/visioforge/) to find demo projects and code snippets showcasing various functionalities.
* **Seek Support:** If you encounter persistent issues or have specific questions not covered here, contact [VisioForge support](https://support.visioforge.com/) for assistance.

---

Please get in touch with [support](https://support.visioforge.com/) to get help with this tutorial. Visit our [GitHub](https://github.com/visioforge/) page to get more code samples.

---END OF PAGE---

# Local File: .\delphi\mediaplayer\install\index.md

---
title: Comprehensive Guide to Installing TVFMediaPlayer Library
description: Detailed instructions on installing the TVFMediaPlayer library in Delphi, C++ Builder, Visual Basic 6, Visual Studio, and other ActiveX-compatible environments.
sidebar_label: Installation Guide
---

# Comprehensive TVFMediaPlayer Library Installation Guide

Welcome to the detailed installation guide for the VisioForge TVFMediaPlayer library, a core component of the powerful All-in-One Media Framework. This guide provides comprehensive steps for installing the library across various Integrated Development Environments (IDEs), ensuring you can leverage its rich media playback capabilities effectively in your projects.

The TVFMediaPlayer library offers developers robust tools for integrating audio and video playback, processing, and streaming functionalities into their applications. It is available in two primary forms to cater to different development ecosystems:

1.
**Native Delphi Package:** Optimized specifically for Embarcadero Delphi developers, offering seamless integration, design-time support, and leveraging the full potential of the VCL framework.
2. **ActiveX Control (OCX):** Designed for broad compatibility, allowing integration into environments that support ActiveX technology, such as C++ Builder, Microsoft Visual Basic 6 (VB6), Microsoft Visual Studio (for C#, VB.NET, C++ MFC projects), and other ActiveX containers.

This dual availability ensures that whether you are working within the Delphi ecosystem or utilizing other popular development tools, you can harness the power of TVFMediaPlayer.

## Before You Begin: System Requirements and Prerequisites

Before proceeding with the installation, ensure your development environment meets the necessary requirements:

* **Operating System:** Windows 7, 8, 8.1, 10, 11, or Windows Server 2012 R2 and newer (both x86 and x64 versions are supported).
* **Development Environment:** A compatible IDE such as:
  * Embarcadero Delphi (refer to the specific framework version for compatible Delphi releases, typically XE2 or newer).
  * Embarcadero C++ Builder (refer to the specific framework version for compatibility).
  * Microsoft Visual Studio 2010 or newer (for C#, VB.NET, C++ MFC development using ActiveX).
  * Microsoft Visual Basic 6 (requires the IDE installed).
  * Any other IDE or development tool capable of hosting ActiveX controls.
* **Dependencies:**
  * **DirectX:** Microsoft DirectX 9 or later is generally required. While modern Windows versions include compatible DirectX runtimes, ensure they are up to date.
  * **.NET Framework (for .NET usage):** If using the ActiveX control within .NET applications (C#, VB.NET), ensure the .NET Framework version targeted by your project is installed.
* **Administrator Privileges:** Running the installer typically requires administrator rights to register components and write to system directories.
## Step-by-Step General Installation Process

The core installation process involves downloading the All-in-One Media Framework installer and running it. Follow these steps carefully:

1. **Download the Framework:**
   * Navigate to the official [All-in-One Media Framework product page](https://www.visioforge.com/all-in-one-media-framework) on the VisioForge website.
   * Locate the downloads section. You might find different versions (e.g., Trial, Full) or builds. Download the latest stable release suitable for your needs. Pay attention to whether you need the Delphi-specific package installer or the general ActiveX installer if they are provided separately (often, one installer contains both).
   * Save the installer executable (`.exe`) file to a convenient location on your computer.
2. **Run the Installer:**
   * Locate the downloaded setup file (e.g., `visioforge_media_framework_setup.exe`).
   * Right-click the file and select "Run as administrator" to ensure necessary permissions.
   * If prompted by User Account Control (UAC), confirm that you want to allow the installer to make changes to your device.
3. **Follow the Installation Wizard:**
   * **Welcome Screen:** The installer will launch, typically starting with a welcome message. Click "Next" to proceed.
   * **License Agreement:** Read the End-User License Agreement (EULA) carefully. You must accept the terms to continue the installation. Select the appropriate option and click "Next".
   * **Select Destination Location:** Choose the directory where the framework files, examples, and documentation will be installed. The default location is usually within `C:\Program Files (x86)\VisioForge\` or similar. You can browse for a different path if needed. Click "Next".
   * **Select Components (If Applicable):** Some installers might allow you to choose which components to install (e.g., specific framework features, documentation, examples for different languages).
Ensure the core Media Player components and any relevant examples (Delphi, C#, VB.NET, C++, VB6) are selected. Click "Next".
   * **Select Start Menu Folder:** Choose the name for the Start Menu folder where shortcuts will be created. Click "Next".
   * **Ready to Install:** Review your selected options. If everything is correct, click "Install" to begin the file copying and system registration process.
   * **Installation Progress:** The wizard will show the progress of the installation. This may take a few minutes. During this phase, the necessary DLLs and OCX files are copied, and the ActiveX control is registered in the Windows Registry.
   * **Completion:** Once the installation is finished, you will see a completion screen. It might offer options to view documentation or launch an example project. Click "Finish" to exit the wizard.
4. **Post-Installation Verification:**
   * Navigate to the installation directory you selected (e.g., `C:\Program Files (x86)\VisioForge\Media Framework\`).
   * Verify that the core library files (`.dll`, `.ocx`), documentation (`.chm` or `Docs` folder), and example projects (`Examples` folder) are present.
   * Check the Start Menu folder for shortcuts to documentation and examples.
   * It's highly recommended to compile and run one of the provided sample projects for your specific IDE to confirm the installation was successful and the components are correctly registered and accessible.

## IDE-Specific Integration

After the general installation, you need to integrate the TVFMediaPlayer library into your chosen development environment.

### Delphi (Native Packages)

Using the native Delphi packages provides the best experience for Delphi developers, including design-time component integration.

* **Detailed Guide:** For comprehensive instructions specific to Delphi, including adding the library path and installing the design-time and runtime packages (`.dpk` files), please refer to the dedicated **[Delphi Installation Guide](delphi.md)**.
* **Key Benefits:** Direct component palette access, property inspectors, event handlers integrated within the IDE, and optimized performance for VCL applications.

### ActiveX Integration (C++ Builder, VB6, Visual Studio, etc.)

If you are not using Delphi or prefer the ActiveX approach, you'll need to add the `TVFMediaPlayer.ocx` control to your project.

#### C++ Builder

Integrating the ActiveX control in C++ Builder involves importing it into the IDE.

* **Detailed Guide:** Refer to the **[C++ Builder Installation Guide](builder.md)** for step-by-step instructions on importing the ActiveX control, which typically involves using the IDE's "Import Component" or "Import ActiveX Control" feature to generate the necessary wrapper code.
* **Process Overview:** This usually involves navigating `Component -> Import Component...`, selecting "Import ActiveX Control", finding the "VisioForge Media Player SDK" (or similar name) in the list of registered controls, and letting the IDE generate the corresponding C++ wrapper classes that allow you to interact with the control.

#### Visual Basic 6 (VB6)

VB6 relies heavily on ActiveX technology, making integration straightforward.

1. **Open Project:** Launch Visual Basic 6 and open your existing project or create a new one.
2. **Access Components Dialog:** Go to the main menu and select `Project -> Components...`. This will open the Components dialog box, listing registered controls.
3. **Locate and Select Control:** Scroll through the list under the "Controls" tab. Look for an entry like "VisioForge Media Player SDK Control" or similar (the exact name might vary slightly depending on the version). Check the box next to it.
4. **Add via Browse (If Not Listed):** If the control is not listed (perhaps due to a registration issue), click the "Browse..." button. Navigate to the VisioForge installation directory (specifically the `Redist\AnyCPU` or similar subfolder containing `TVFMediaPlayer.ocx`) and select the `.ocx` file. Click "Open".
This should register and add the control to the list. Ensure its checkbox is ticked.
5. **Confirm:** Click "OK" or "Apply" in the Components dialog.
6. **Use Control:** The TVFMediaPlayer icon should now appear in your VB6 Toolbox. You can click and drag it onto your forms to use it visually. You can then interact with its properties and methods via code.

#### Visual Studio (C#, VB.NET, C++ MFC)

Visual Studio manages ActiveX controls through the COM interoperability layer.

1. **Open Project:** Launch Visual Studio and open your Windows Forms (C# or VB.NET), WPF, or MFC project.
2. **Open Toolbox:** Ensure the Toolbox is visible (`View -> Toolbox`).
3. **Add Control to Toolbox:**
   * Right-click inside the Toolbox, preferably within a relevant tab like "General" or "All Windows Forms", or create a new tab (e.g., "VisioForge").
   * Select "Choose Items...".
   * Wait for the "Choose Toolbox Items" dialog to load. This can sometimes take a moment as it scans registered components.
   * Navigate to the "COM Components" tab.
   * Scroll through the list and look for "VisioForge Media Player SDK Control" or a similar name. Check the box next to it.
   * **Add via Browse (If Not Listed):** If you cannot find it, click the "Browse..." button. Navigate to the VisioForge installation directory (usually the `Redist\AnyCPU` subfolder) and select the `TVFMediaPlayer.ocx` file. Click "Open". This should add it to the list; make sure its checkbox is now selected.
   * Click "OK".
4. **Use Control:** The TVFMediaPlayer control icon will now be available in your Visual Studio Toolbox. Drag and drop it onto your form (Windows Forms) or use it programmatically (WPF, MFC). Visual Studio will automatically generate the necessary Interop assemblies (wrappers) to allow managed code (.NET) or C++ to interact with the COM-based ActiveX control.

## Troubleshooting Common Installation Issues

Encountering problems during installation?
Here are some common issues and solutions:

* **Control Not Registered / Not Appearing in IDE:**
  * Ensure the installer was run with administrator privileges.
  * Try manually registering the OCX file. Open an **Administrator Command Prompt**, navigate to the directory containing `TVFMediaPlayer.ocx` (e.g., `cd "C:\Program Files (x86)\VisioForge\Media Framework\Redist\AnyCPU"`), and run `regsvr32 TVFMediaPlayer.ocx`. A success message should appear.
  * Check for conflicts with other media libraries or older VisioForge versions. Consider uninstalling previous versions first.
* **Installation Fails or Rolls Back:**
  * Ensure you meet all system requirements, including DirectX and .NET versions.
  * Temporarily disable antivirus software, which might interfere with the registration process. Remember to re-enable it afterward.
  * Check for sufficient disk space on the target drive.
* **Issues in Specific IDEs:**
  * **Delphi:** Ensure the library path is correctly added in `Tools -> Options -> Library Path` and that the correct `BPL` files are installed. Rebuilding the packages might help.
  * **Visual Studio:** Delete the `obj` and `bin` folders in your project, delete any existing Interop assemblies related to VisioForge, remove the control reference, restart Visual Studio, and try adding the control again. Ensure your project targets a compatible .NET Framework version if applicable.

## Updating the Framework

To update to a newer version of the All-in-One Media Framework:

1. **Check for Compatibility:** Review the release notes for the new version to understand changes and potential compatibility issues with your existing projects.
2. **Backup Projects:** Always back up your projects before updating a major library dependency.
3. **Uninstall Existing Version (Recommended):** It's generally best practice to uninstall the current version via the Windows Control Panel ("Add or Remove Programs" or "Apps & features") before installing the new one.
This helps prevent file conflicts or registration issues.
4. **Download and Install:** Download the new version's installer and follow the standard installation procedure outlined earlier in this guide.
5. **Recompile Projects:** Open your projects in their respective IDEs. You may need to remove and re-add references or components if the underlying interfaces have changed significantly (though this is less common with minor updates). Recompile your entire project.
6. **Test Thoroughly:** Test your application extensively to ensure all media functionalities work as expected with the updated library.

## Uninstallation

To remove the TVFMediaPlayer library and the All-in-One Media Framework:

1. **Close IDEs:** Ensure all development environments that might be using the library files are closed.
2. **Use Windows Uninstaller:**
   * Go to the Windows Control Panel or Settings app.
   * Navigate to "Programs and Features" or "Apps & features".
   * Locate "VisioForge Media Framework" (or a similar name) in the list of installed programs.
   * Select it and click "Uninstall".
   * Follow the prompts in the uninstallation wizard. This process should remove the installed files and attempt to unregister the ActiveX control.
3. **Manual Cleanup (Optional):** In some rare cases, or if you want to ensure a complete removal, you might manually check and delete:
   * The main installation directory (e.g., `C:\Program Files (x86)\VisioForge\`).
   * Any remaining configuration files or registry entries (advanced users only; proceed with caution).
   * Interop assemblies generated within your project folders (`obj`, `bin`).

## Licensing and Activation

The All-in-One Media Framework typically operates under a commercial license, often with a trial period.

* **Trial Version:** The downloaded installer might initially function as a trial, which may have limitations (e.g., nag screens, time limits, restricted features).
* **Purchasing a License:** To unlock the full capabilities and use the framework in production applications, you must purchase a license from the VisioForge website.
* **Activation:** After purchase, you will usually receive a license key or instructions on how to activate the software. This might involve entering the key into a specific property of the control at runtime or using a license activation tool provided by VisioForge. Refer to the documentation accompanying your purchased license for exact details.

## Getting Support

If you encounter issues not covered here or need further assistance:

* **Official Documentation:** Check the `Docs` folder in your installation directory or the online documentation on the VisioForge website. The `CHM` help file often contains detailed API references and usage examples.
* **Sample Projects:** Explore the example projects provided for your IDE. They demonstrate common use cases and correct implementation techniques.
* **VisioForge Support:** Visit the support section on the VisioForge website. This may include forums, a knowledge base, or direct contact options for licensed users.

## Conclusion

Installing the TVFMediaPlayer library, whether as a native Delphi package or an ActiveX control, is a straightforward process when following these detailed steps. By ensuring system requirements are met, carefully executing the installation wizard, and correctly integrating the components into your chosen IDE, you can quickly begin developing powerful multimedia applications. Remember to consult the specific IDE guides (Delphi, C++ Builder) linked herein and the official documentation for deeper insights and advanced configurations.

With the framework successfully installed, you are well-equipped to explore the extensive features of the VisioForge All-in-One Media Framework.
---END OF PAGE---

# Local File: .\delphi\mediaplayer\install\visual-basic-6.md

---
title: Installing TVFMediaPlayer Library in Visual Basic 6
description: A detailed guide on how to install, integrate, and utilize the TVFMediaPlayer library within Microsoft Visual Basic 6. Learn about ActiveX integration, handling 32-bit limitations, implementing basic playback functionality, and proper deployment practices.
sidebar_label: Visual Basic 6
---

# Integrating TVFMediaPlayer with Visual Basic 6: A Comprehensive Guide

Microsoft Visual Basic 6 (VB6), despite its age, remains a relevant platform for many legacy applications. Its simplicity and rapid application development (RAD) capabilities made it incredibly popular. One way to extend the functionality of VB6 applications, particularly in multimedia processing, is by leveraging ActiveX controls. The TVFMediaPlayer library, developed by VisioForge, offers a powerful suite of multimedia features accessible to VB6 developers through its ActiveX interface.

This guide provides a comprehensive walkthrough for installing, configuring, and utilizing the TVFMediaPlayer library within a Visual Basic 6 project. We will cover the nuances of working with ActiveX in VB6, address the inherent 32-bit limitations, and provide practical steps for integration and basic usage.

## Understanding ActiveX and VB6 Compatibility

ActiveX controls are reusable software components based on Microsoft's Component Object Model (COM) technology. They allow developers to add specific functionality to applications without writing the underlying code from scratch. Visual Basic 6 has excellent built-in support for ActiveX, enabling developers to easily incorporate third-party controls like TVFMediaPlayer into their projects via a graphical interface.
This seamless integration means that VB6 developers can access the advanced multimedia capabilities of the VisioForge library (such as video playback, audio manipulation, screen capture, and network streaming) directly within the familiar VB6 IDE.

### The 32-bit Constraint

A crucial point to understand is that Visual Basic 6 is strictly a 32-bit development environment. It was created during an era when 64-bit computing was not mainstream for desktop applications. Consequently, VB6 cannot create or directly interact with 64-bit components or processes.

This limitation dictates that only the 32-bit (x86) version of the TVFMediaPlayer ActiveX control can be used with VB6. While modern systems are predominantly 64-bit, Windows maintains a compatibility layer (WoW64, Windows 32-bit on Windows 64-bit) that allows 32-bit applications like those built with VB6, and the 32-bit ActiveX controls they use, to run correctly on 64-bit operating systems.

Despite being confined to a 32-bit architecture, the TVFMediaPlayer library is optimized to deliver robust and reliable performance. Developers can confidently build sophisticated multimedia applications in VB6, leveraging the full feature set provided by the 32-bit control.

## Prerequisites

Before you begin the installation process, ensure you have the following:

1. **Microsoft Visual Basic 6:** A working installation of the VB6 IDE, including the necessary service packs (typically SP6).
2. **SDK:** Download the latest version of the SDK that includes the ActiveX components. Ensure you download the installer appropriate for your needs (often a combined x86/x64 installer, but only the x86 components will be registered for VB6 use).
3. **Administrator Privileges:** Installing the SDK and registering the ActiveX control typically requires administrator rights on the development machine.
## Step-by-Step Installation and Integration

Follow these steps to integrate the TVFMediaPlayer control into your Visual Basic 6 project:

### **Step 1: Install the TVFMediaPlayer control**

Run the downloaded VisioForge SDK installer and follow the on-screen prompts. The installer will copy the necessary library files (`.ocx`, `.dll`) to your system and register the ActiveX control in the Windows Registry. Note the installation directory, though typically the registration process makes the control available system-wide.

### **Step 2: Create or Open a VB6 Project**

Launch the Visual Basic 6 IDE. You can either start a new Standard EXE project or open an existing one where you wish to add multimedia capabilities.

![screenshot 1](mpvb6_1.webp)

*Caption: Creating a new Standard EXE project in Visual Basic 6.*

### **Step 3: Add the TVFMediaPlayer Component**

To make the ActiveX control available in your project's Toolbox, you need to add it through the "Components" dialog.

* Go to the `Project` menu and select `Components...`. Alternatively, right-click on the Toolbox and choose `Components...`.

![screenshot 2](mpvb6_2.webp)

*Caption: Accessing the Components dialog from the Project menu.*

* The "Components" dialog lists all registered ActiveX controls on your system. Scroll down the list under the "Controls" tab.
* Locate and check the box next to "VisioForge Media Player" (the exact name might vary slightly depending on the installed version).

![screenshot 3](mpvb6_3.webp)

*Caption: Selecting the 'VisioForge Media Player' control in the Components dialog.*

* Click `OK` or `Apply`.

### **Step 4: Use the Control in Your Project**

After adding the component, its icon will appear in the VB6 Toolbox.
![screenshot 4](mpvb6_4.webp)

*Caption: The TVFMediaPlayer control added to the Visual Basic 6 Toolbox.*

You can now select the TVFMediaPlayer icon from the Toolbox and draw it onto any form in your project, just like any standard VB6 control (e.g., Button, TextBox). This creates an instance of the media player object on your form. You can resize and position it as needed using the form designer.

#### **Basic Usage: Controlling the Player**

Once the TVFMediaPlayer control (`VFMediaPlayer1` by default, if it is the first one added) is on your form, you can interact with it programmatically using VB6 code.

## Deployment Considerations

When you distribute a VB6 application that uses the TVFMediaPlayer control, you must ensure the necessary runtime files are included and correctly registered on the target user's machine.

1. **Required Files:** Identify the specific `.ocx` file for the TVFMediaPlayer control and any dependent `.dll` files provided by the VisioForge SDK. These files need to be shipped with your application installer.
2. **Registration:** The ActiveX control (`.ocx` file) must be registered in the Windows Registry on the target machine. Standard installer tools (such as Inno Setup, InstallShield, or even the older VB6 packaging tools) usually provide mechanisms to register ActiveX controls during installation. Alternatively, the `regsvr32.exe` command-line utility can be used manually or via a script:

   ```bash
   regsvr32.exe "C:\Program Files (x86)\YourApp\VFMediaPlayer.ocx"
   ```

   Remember to use the correct path and run the command with administrator privileges. Since it is a 32-bit control, on a 64-bit system you typically use the `regsvr32.exe` found in the `C:\Windows\SysWOW64` directory, although the system often handles this redirection automatically.
3. **Licensing:** Ensure you comply with the VisioForge licensing terms for deployment. Some versions might require a runtime license key to be set programmatically within your application.
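As a minimal sketch of the "Basic Usage" flow described above: the snippet below assumes the default control name `VFMediaPlayer1` and uses only the `Filename`, `Play`, and `Stop` members mentioned elsewhere in this guide; the button names and the exact member signatures are hypothetical, so check the shipped VB6 samples for the authoritative API.

```vb
Private Sub btnPlay_Click()
    On Error GoTo PlayError  ' basic error handling around player calls
    ' Assumes the control instance kept its default name, VFMediaPlayer1
    VFMediaPlayer1.Filename = "C:\path\to\your\test_video.mp4" ' replace with a real media file
    VFMediaPlayer1.Play
    Exit Sub
PlayError:
    MsgBox "Playback failed: " & Err.Description
End Sub

Private Sub btnStop_Click()
    VFMediaPlayer1.Stop
End Sub
```

Wrapping the calls in `On Error` is also how you would diagnose the registration and codec issues listed in the troubleshooting section below.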
## Troubleshooting Common Issues

* **Control Not Appearing in Components:**
  * Ensure the VisioForge SDK was installed correctly with administrator rights.
  * Try manually registering the `.ocx` file using `regsvr32.exe` from an elevated command prompt.
  * Verify you are looking for the correct name in the Components list.
* **"Runtime Error '429': ActiveX component can't create object":**
  * This usually indicates the control is not properly registered on the machine where the application is running. Re-register the `.ocx` file.
  * Ensure all dependent DLLs are present in the application's directory or a system path.
* **Playback Issues (No Video/Audio, Errors):**
  * Verify the path to the media file is correct and accessible.
  * Ensure the necessary codecs are installed on the system (though TVFMediaPlayer often includes internal decoders or uses DirectShow/Media Foundation).
  * Check the VisioForge documentation for specific error codes or properties that might give more detail.
  * Implement proper error handling around player methods (`Play`, `Stop`, property setting) to diagnose issues.

## Beyond VB6: Modernization

While TVFMediaPlayer provides a bridge for adding modern multimedia features to legacy VB6 applications, organizations should also consider long-term strategies. Migrating VB6 applications to newer platforms such as .NET (using C# or VB.NET) or web-based technologies can offer significant advantages in performance, security, maintainability, and access to the latest development tools and libraries. VisioForge also offers .NET-native versions of its libraries, which would be the preferred choice in a modernized application.

## Conclusion

The TVFMediaPlayer library, through its ActiveX control, offers a powerful and accessible way for Visual Basic 6 developers to incorporate advanced multimedia functionality into their applications.
By understanding the installation process, the 32-bit limitations, basic control usage, and the deployment requirements outlined in this guide, developers can effectively leverage VisioForge technology to enhance their VB6 projects. While VB6 is a legacy platform, tools like TVFMediaPlayer help extend its useful life for specific application needs.

---

For further assistance or more complex scenarios, please get in touch with [VisioForge support](https://support.visioforge.com/). Explore the extensive code samples available on the VisioForge [GitHub repository](https://github.com/visioforge/) for more advanced examples and techniques.

---END OF PAGE---

# Local File: .\delphi\mediaplayer\install\visual-studio.md

---
title: Installing TVFMediaPlayer ActiveX in Visual Studio
description: Learn how to integrate TVFMediaPlayer ActiveX control into Visual Studio projects with step-by-step instructions. Covers C++, C#, and VB.NET implementation, troubleshooting tips, and explains why migrating to native .NET SDK is recommended for modern development.
sidebar_label: Visual Studio
---

# Installing TVFMediaPlayer ActiveX in Visual Studio 2010 and Later

This guide provides detailed instructions for integrating the VisioForge Media Player (`TVFMediaPlayer`) ActiveX control into your Microsoft Visual Studio projects (version 2010 and newer). We will cover the necessary steps for C++, C#, and Visual Basic .NET environments, explain the underlying mechanisms, and discuss important considerations, including why migrating to the native .NET SDK is highly recommended for modern development.

## Understanding ActiveX and its Role in Modern Development

ActiveX, a technology developed by Microsoft, allows software components (controls) to interact with one another regardless of the language they were originally written in. It is based on the Component Object Model (COM).
In the context of Visual Studio, ActiveX controls can be embedded within application forms to provide specific functionality, such as media playback in the case of `TVFMediaPlayer`.

While historically significant, ActiveX usage has declined, especially within the .NET ecosystem. Modern .NET frameworks offer more integrated, robust, and secure ways to incorporate UI components and functionality. However, legacy applications or specific interoperability scenarios might still necessitate the use of ActiveX controls.

When you use an ActiveX control in a .NET project (C# or VB.NET), Visual Studio does not interact with it directly. Instead, it automatically generates **Runtime Callable Wrappers (RCW)**. These wrappers are .NET assemblies that act as intermediaries, translating .NET calls into COM calls that the ActiveX control understands, and vice versa. This process allows managed (.NET) code to utilize unmanaged (COM/ActiveX) components.

## Prerequisites

Before you begin, ensure you have the following:

1. **Microsoft Visual Studio:** Version 2010 or a later edition installed.
2. **TVFMediaPlayer ActiveX Control:** The VisioForge Media Player ActiveX control must be properly installed and registered on your development machine. You can typically download this from the VisioForge website or a distributor. **Crucially**, you might need both the 32-bit (x86) and 64-bit (x64) versions registered, even if you are only developing a 64-bit application. Visual Studio's designer often runs as a 32-bit process and requires the x86 version to display the control visually at design time. The runtime will use the version corresponding to your project's target architecture (x86 or x64).
3. **Project:** An existing or new C++, C#, or VB.NET project where you intend to use the media player.
## Step-by-Step Installation in Visual Studio The process involves adding the `TVFMediaPlayer` control to the Visual Studio Toolbox, which then allows you to drag and drop it onto your application's forms or windows. ### **Step 1: Create or Open Your Project** Launch Visual Studio and create a new project or open an existing one. The example screenshots below use a C# Windows Forms application, but the steps are analogous for C++ (MFC, perhaps) and VB.NET WinForms. * For C# WinForms: `File -> New -> Project -> Visual C# -> Windows Forms App (.NET Framework)` * For VB.NET WinForms: `File -> New -> Project -> Visual Basic -> Windows Forms App (.NET Framework)` * For C++ MFC: `File -> New -> Project -> Visual C++ -> MFC/ATL -> MFC App` ![Create New C# WinForms Project](mpvs2003_1.webp) ![Empty WinForms Designer](mpvs2003_11.webp) ### **Step 2: Open the Toolbox** If the Toolbox is not visible, you can open it via the `View` menu (`View -> Toolbox` or `Ctrl+Alt+X`). The Toolbox contains standard UI controls and components. ### **Step 3: Add the ActiveX Control to the Toolbox** To make the `TVFMediaPlayer` control available, you need to add it to the Toolbox: 1. Right-click within an empty area of the Toolbox (e.g., under the "General" tab or create a new tab). 2. Select "Choose Items..." from the context menu. ![Choose Items Menu in Toolbox](mpvs2003_2.webp) ### **Step 4: Select the TVFMediaPlayer Control** 1. The "Choose Toolbox Items" dialog will appear. Navigate to the "COM Components" tab. This tab lists all registered ActiveX controls on your system. 2. Scroll through the list or use the filter box to find the "VisioForge Media Player" control (the exact name might vary slightly based on the installed version). 3. Check the checkbox next to the control's name. 4. Click "OK". 
![Selecting VisioForge Media Player in COM Components](mpvs2003_3.webp) Visual Studio will now add the control to your Toolbox and, if you are in a C# or VB.Net project, it will generate the necessary RCW assemblies (often named `AxInterop.VisioForgeMediaPlayerLib.dll` and `Interop.VisioForgeMediaPlayerLib.dll`) and add references to them in your project. ### **Step 5: Add the Control to Your Form** 1. Locate the newly added "VisioForge Media Player" icon in the Toolbox. 2. Click and drag the icon onto your application's form or design surface. ![Dragging Control from Toolbox to Form](mpvs2003_40.webp) An instance of the `TVFMediaPlayer` control will appear on your form. You can resize and position it as needed using the designer. ![Media Player Control Added to Form](mpvs2003_41.webp) ### **Step 6: Interacting with the Control (Code)** You can now interact with the media player control programmatically through its properties, methods, and events. Select the control in the designer, and use the Properties window (`F4`) to configure its appearance and basic behavior. To control playback, handle events, etc., you'll write code. 
Here's a simple C# example to load and play a video file when a button is clicked: ```csharp // Assuming your TVFMediaPlayer control is named 'axMediaPlayer1' // and you have a button named 'buttonPlay' private void buttonPlay_Click(object sender, EventArgs e) { // Prompt user to select a video file OpenFileDialog openFileDialog = new OpenFileDialog(); openFileDialog.Filter = "Media Files|*.mp4;*.avi;*.mov;*.wmv|All Files|*.*"; if (openFileDialog.ShowDialog() == DialogResult.OK) { try { // Set the filename for the ActiveX control axMediaPlayer1.Filename = openFileDialog.FileName; // Start playback axMediaPlayer1.Play(); } catch (Exception ex) { MessageBox.Show($"Error playing file: {ex.Message}"); } } } // Example of handling an event (e.g., playback completed) private void axMediaPlayer1_OnStop(object sender, EventArgs e) { MessageBox.Show("Playback stopped or finished."); } // Remember to attach the event handler, usually in the Form's Load event or constructor public Form1() { InitializeComponent(); axMediaPlayer1.OnStop += axMediaPlayer1_OnStop; // Attach the event handler } ``` Similar code can be written in VB.NET, accessing the same properties (`Filename`, `Play()`) and events (`OnStop`). In C++, you would typically use COM interfaces directly or MFC wrappers if using that framework. ## Important: The Case for the Native .NET SDK While the steps above show how to use the ActiveX control, **for all new .NET development (C#, VB.NET), we strongly recommend using the native VisioForge Media Player SDK for .NET.** The ActiveX approach, while functional, carries several significant disadvantages in the modern .NET world: 1. **Complexity:** Relies on COM Interop and RCW generation, adding layers of abstraction that can sometimes be fragile or lead to unexpected behavior. 2. **Performance:** COM Interop can introduce performance overhead compared to native .NET code. 3. 
**Deployment:** Requires proper registration of the ActiveX control (x86 and potentially x64) on the end-user's machine using `regsvr32`, which can complicate deployment and require administrative privileges. Native .NET libraries are typically deployed just by copying files (XCopy deployment) or via NuGet. 4. **Limited Integration:** ActiveX controls don't integrate as seamlessly with modern .NET UI frameworks like WPF or MAUI. While they can sometimes be hosted, it's often awkward and limited compared to native controls. 5. **Bitness Mismatches:** Managing x86/x64 versions and ensuring the correct one is used by the application and the VS designer can be error-prone. 6. **Technology Age:** ActiveX is a legacy technology with limited ongoing evolution compared to the rapidly advancing .NET platform. **Advantages of the Native .NET SDK:** * **Native Controls:** Provides dedicated, optimized controls for WinForms, WPF, and MAUI. * **Full .NET Integration:** Leverages the full power of the .NET framework, including async/await, LINQ, modern event patterns, and easier data binding. * **Simplified Deployment:** Usually involves just referencing the SDK assemblies or NuGet packages. No COM registration needed. * **Enhanced Features:** Often includes more features, better performance, and more granular control than the corresponding ActiveX version. * **Improved Stability & Maintainability:** Native code is generally easier to debug and maintain, and less prone to interop issues. * **Future-Proofing:** Aligns your application with modern .NET development practices. You can find the native [.Net version of the SDK here](https://www.visioforge.com/media-player-sdk-net). It offers a significantly superior development experience and results for .NET applications. ## Troubleshooting Common Issues * **Control Not Appearing in "COM Components":** Ensure the `TVFMediaPlayer` ActiveX control is correctly installed and registered. 
Try running the registration command (`regsvr32 `) manually as an administrator. Remember to register both x86 and x64 versions if available and needed. * **Error Adding Control to Form:** This often points to a mismatch between the Visual Studio designer process (usually x86) and the registered control version. Make sure the x86 version is registered. * **Runtime Errors (File Not Found, Class Not Registered):** Verify the control (correct bitness for your app's target) is registered on the target machine where the application is run. Check project references to ensure the Interop assemblies are correctly included. * **Events Not Firing:** Double-check that event handlers are correctly attached to the control's events in your code. ## Conclusion Integrating the `TVFMediaPlayer` ActiveX control into Visual Studio 2010+ is achievable by adding it via the "Choose Toolbox Items" dialog. Visual Studio handles the generation of wrapper assemblies for .NET projects, allowing interaction via standard properties, methods, and events. However, due to the complexities, limitations, and deployment challenges associated with ActiveX/COM Interop in the .NET environment, **it is strongly advised to use the native VisioForge Media Player SDK for .NET for any new WinForms, WPF, or MAUI development.** The native SDK provides a more robust, performant, and developer-friendly experience aligned with modern application development practices. --- Need further assistance? Please contact [VisioForge Support](https://support.visioforge.com/) or explore more examples on our [GitHub](https://github.com/visioforge/) page. ---END OF PAGE--- # Local File: .\delphi\videocapture\audio-capture-mp3.md --- title: Audio Capture to MP3 Files in Delphi, C++ MFC & VB6 description: Complete step-by-step guide for developers implementing MP3 audio capture functionality in Delphi, C++ MFC, and VB6 applications. 
Learn how to configure LAME encoder settings, manage channels, handle bitrates and create high-quality audio recordings. sidebar_label: Audio capture to MP3 file --- # Audio Capture to MP3 Files in Delphi, C++ MFC & VB6 ## Introduction Audio capture capabilities are essential for many modern applications, from voice recording tools to multimedia creation software. This guide walks through the implementation of MP3 audio capture functionality in Delphi, C++ MFC, and VB6 applications using the VideoCapture component. MP3 remains one of the most widely used audio formats due to its excellent compression and broad compatibility. By implementing proper MP3 audio capture in your applications, you can provide users with efficient, high-quality audio recording capabilities. ## Prerequisites Before implementing MP3 audio capture, ensure you have: - Development environment with Delphi, Visual C++ (for MFC), or Visual Basic 6 - VideoCapture component properly installed and referenced in your project - Basic understanding of audio encoding concepts - Required permissions for audio device access in your application ## LAME Encoder Configuration The LAME MP3 encoder provides extensive customization options for audio quality, bitrate management, and channel configuration. Properly configuring these settings is crucial for achieving the desired audio quality while managing file size. 
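Because these settings trade audio quality against file size, it helps to estimate output size before choosing a bitrate. A rough sketch, assuming constant bitrate (CBR); VBR output will vary with the audio content:

```python
def mp3_size_bytes(bitrate_kbps: int, duration_seconds: float) -> int:
    """Approximate CBR MP3 file size: the bitrate is bits per second of audio."""
    return int(bitrate_kbps * 1000 / 8 * duration_seconds)

# A 5-minute recording at 128 kbps:
size = mp3_size_bytes(128, 5 * 60)
print(f"{size / 1_000_000:.1f} MB")  # roughly 4.8 MB
```

Doubling the bitrate doubles the estimate, which is why voice-oriented applications often choose lower CBR values than music-oriented ones.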
### Configuring Basic Encoding Parameters The following code snippets demonstrate how to configure basic LAME encoding parameters: ```pascal // Delphi VideoCapture1.Audio_LAME_CBR_Bitrate := StrToInt(cbLameCBRBitrate.Items[cbLameCBRBitrate.ItemIndex]); VideoCapture1.Audio_LAME_VBR_Min_Bitrate := StrToInt(cbLameVBRMin.Items[cbLameVBRMin.ItemIndex]); VideoCapture1.Audio_LAME_VBR_Max_Bitrate := StrToInt(cbLameVBRMax.Items[cbLameVBRMax.ItemIndex]); VideoCapture1.Audio_LAME_Sample_Rate := StrToInt(cbLameSampleRate.Items[cbLameSampleRate.ItemIndex]); VideoCapture1.Audio_LAME_VBR_Quality := tbLameVBRQuality.Position; VideoCapture1.Audio_LAME_Encoding_Quality := tbLameEncodingQuality.Position; ``` ```cpp // C++ MFC m_VideoCapture.Audio_LAME_CBR_Bitrate = _ttoi(m_cbLameCBRBitrate.GetItemData(m_cbLameCBRBitrate.GetCurSel())); m_VideoCapture.Audio_LAME_VBR_Min_Bitrate = _ttoi(m_cbLameVBRMin.GetItemData(m_cbLameVBRMin.GetCurSel())); m_VideoCapture.Audio_LAME_VBR_Max_Bitrate = _ttoi(m_cbLameVBRMax.GetItemData(m_cbLameVBRMax.GetCurSel())); m_VideoCapture.Audio_LAME_Sample_Rate = _ttoi(m_cbLameSampleRate.GetItemData(m_cbLameSampleRate.GetCurSel())); m_VideoCapture.Audio_LAME_VBR_Quality = m_tbLameVBRQuality.GetPos(); m_VideoCapture.Audio_LAME_Encoding_Quality = m_tbLameEncodingQuality.GetPos(); ``` ```vb ' VB6 VideoCapture1.Audio_LAME_CBR_Bitrate = CInt(cbLameCBRBitrate.List(cbLameCBRBitrate.ListIndex)) VideoCapture1.Audio_LAME_VBR_Min_Bitrate = CInt(cbLameVBRMin.List(cbLameVBRMin.ListIndex)) VideoCapture1.Audio_LAME_VBR_Max_Bitrate = CInt(cbLameVBRMax.List(cbLameVBRMax.ListIndex)) VideoCapture1.Audio_LAME_Sample_Rate = CInt(cbLameSampleRate.List(cbLameSampleRate.ListIndex)) VideoCapture1.Audio_LAME_VBR_Quality = tbLameVBRQuality.Value VideoCapture1.Audio_LAME_Encoding_Quality = tbLameEncodingQuality.Value ``` ### Setting Audio Channel Modes Channel configuration affects both sound quality and file size. 
The following code demonstrates how to set the channel mode: ```pascal // Delphi if rbLameStandardStereo.Checked then VideoCapture1.Audio_LAME_Channels_Mode := CH_Standard_Stereo else if rbLameJointStereo.Checked then VideoCapture1.Audio_LAME_Channels_Mode := CH_Joint_Stereo else if rbLameDualChannels.Checked then VideoCapture1.Audio_LAME_Channels_Mode := CH_Dual_Stereo else VideoCapture1.Audio_LAME_Channels_Mode := CH_Mono; ``` ```cpp // C++ MFC if (m_rbLameStandardStereo.GetCheck()) m_VideoCapture.Audio_LAME_Channels_Mode = VisioForge_Video_Capture::CH_Standard_Stereo; else if (m_rbLameJointStereo.GetCheck()) m_VideoCapture.Audio_LAME_Channels_Mode = VisioForge_Video_Capture::CH_Joint_Stereo; else if (m_rbLameDualChannels.GetCheck()) m_VideoCapture.Audio_LAME_Channels_Mode = VisioForge_Video_Capture::CH_Dual_Stereo; else m_VideoCapture.Audio_LAME_Channels_Mode = VisioForge_Video_Capture::CH_Mono; ``` ```vb ' VB6 If rbLameStandardStereo.Value Then VideoCapture1.Audio_LAME_Channels_Mode = CH_Standard_Stereo ElseIf rbLameJointStereo.Value Then VideoCapture1.Audio_LAME_Channels_Mode = CH_Joint_Stereo ElseIf rbLameDualChannels.Value Then VideoCapture1.Audio_LAME_Channels_Mode = CH_Dual_Stereo Else VideoCapture1.Audio_LAME_Channels_Mode = CH_Mono End If ``` ### Advanced LAME Configuration Options For more precise control over the encoding process, configure these advanced LAME options: ```pascal // Delphi VideoCapture1.Audio_LAME_VBR_Mode := rbLameVBR.Checked; VideoCapture1.Audio_LAME_Copyright := cbLameCopyright.Checked; VideoCapture1.Audio_LAME_Original := cbLameOriginalCopy.Checked; VideoCapture1.Audio_LAME_CRC_Protected := cbLameCRCProtected.Checked; VideoCapture1.Audio_LAME_Force_Mono := cbLameForceMono.Checked; VideoCapture1.Audio_LAME_Strictly_Enforce_VBR_Min_Bitrate := cbLameStrictlyEnforceVBRMinBitrate.Checked; VideoCapture1.Audio_LAME_Voice_Encoding_Mode := cbLameVoiceEncodingMode.Checked; VideoCapture1.Audio_LAME_Keep_All_Frequencies := 
cbLameKeepAllFrequencies.Checked; VideoCapture1.Audio_LAME_Strict_ISO_Compilance := cbLameStrictISOCompilance.Checked; VideoCapture1.Audio_LAME_Disable_Short_Blocks := cbLameDisableShortBlocks.Checked; VideoCapture1.Audio_LAME_Enable_Xing_VBR_Tag := cbLameEnableXingVBRTag.Checked; VideoCapture1.Audio_LAME_Mode_Fixed := cbLameModeFixed.Checked; ``` ```cpp // C++ MFC m_VideoCapture.Audio_LAME_VBR_Mode = m_rbLameVBR.GetCheck() ? true : false; m_VideoCapture.Audio_LAME_Copyright = m_cbLameCopyright.GetCheck() ? true : false; m_VideoCapture.Audio_LAME_Original = m_cbLameOriginalCopy.GetCheck() ? true : false; m_VideoCapture.Audio_LAME_CRC_Protected = m_cbLameCRCProtected.GetCheck() ? true : false; m_VideoCapture.Audio_LAME_Force_Mono = m_cbLameForceMono.GetCheck() ? true : false; m_VideoCapture.Audio_LAME_Strictly_Enforce_VBR_Min_Bitrate = m_cbLameStrictlyEnforceVBRMinBitrate.GetCheck() ? true : false; m_VideoCapture.Audio_LAME_Voice_Encoding_Mode = m_cbLameVoiceEncodingMode.GetCheck() ? true : false; m_VideoCapture.Audio_LAME_Keep_All_Frequencies = m_cbLameKeepAllFrequencies.GetCheck() ? true : false; m_VideoCapture.Audio_LAME_Strict_ISO_Compilance = m_cbLameStrictISOCompilance.GetCheck() ? true : false; m_VideoCapture.Audio_LAME_Disable_Short_Blocks = m_cbLameDisableShortBlocks.GetCheck() ? true : false; m_VideoCapture.Audio_LAME_Enable_Xing_VBR_Tag = m_cbLameEnableXingVBRTag.GetCheck() ? true : false; m_VideoCapture.Audio_LAME_Mode_Fixed = m_cbLameModeFixed.GetCheck() ? 
true : false; ``` ```vb ' VB6 VideoCapture1.Audio_LAME_VBR_Mode = rbLameVBR.Value VideoCapture1.Audio_LAME_Copyright = cbLameCopyright.Value VideoCapture1.Audio_LAME_Original = cbLameOriginalCopy.Value VideoCapture1.Audio_LAME_CRC_Protected = cbLameCRCProtected.Value VideoCapture1.Audio_LAME_Force_Mono = cbLameForceMono.Value VideoCapture1.Audio_LAME_Strictly_Enforce_VBR_Min_Bitrate = cbLameStrictlyEnforceVBRMinBitrate.Value VideoCapture1.Audio_LAME_Voice_Encoding_Mode = cbLameVoiceEncodingMode.Value VideoCapture1.Audio_LAME_Keep_All_Frequencies = cbLameKeepAllFrequencies.Value VideoCapture1.Audio_LAME_Strict_ISO_Compilance = cbLameStrictISOCompilance.Value VideoCapture1.Audio_LAME_Disable_Short_Blocks = cbLameDisableShortBlocks.Value VideoCapture1.Audio_LAME_Enable_Xing_VBR_Tag = cbLameEnableXingVBRTag.Value VideoCapture1.Audio_LAME_Mode_Fixed = cbLameModeFixed.Value ``` ## Understanding LAME Configuration Options ### Bitrate Settings - **CBR (Constant Bitrate)**: Maintains the same bitrate throughout the entire recording - **VBR (Variable Bitrate)**: Adjusts bitrate based on audio complexity - **Min/Max Bitrate**: Sets boundaries for VBR encoding - **VBR Quality**: Controls the quality/file size balance in VBR mode ### Channel Modes - **Standard Stereo**: Completely separate left and right channels - **Joint Stereo**: Combines redundant information between channels to save space - **Dual Stereo**: Two completely independent mono channels - **Mono**: Single audio channel ### Special Encoding Options - **Voice Encoding Mode**: Optimizes encoding for voice frequencies - **Force Mono**: Converts stereo input to mono output - **CRC Protection**: Adds error detection data - **Strict ISO Compliance**: Ensures maximum compatibility with all MP3 players ## Configuring Output Format After setting up LAME encoding parameters, specify MP3 as the output format: ```pascal // Delphi VideoCapture1.OutputFormat := Format_LAME; ``` ```cpp // C++ MFC m_VideoCapture.OutputFormat = 
VisioForge_Video_Capture::Format_LAME; ``` ```vb ' VB6 VideoCapture1.OutputFormat = Format_LAME ``` ## Setting Audio Capture Mode Set the VideoCapture component to audio-only capture mode: ```pascal // Delphi VideoCapture1.Mode := Mode_Audio_Capture; ``` ```cpp // C++ MFC m_VideoCapture.Mode = VisioForge_Video_Capture::Mode_Audio_Capture; ``` ```vb ' VB6 VideoCapture1.Mode = Mode_Audio_Capture ``` ## Starting the Audio Capture Once all parameters are configured, initiate the recording process: ```pascal // Delphi VideoCapture1.Start; ``` ```cpp // C++ MFC m_VideoCapture.Start(); ``` ```vb ' VB6 VideoCapture1.Start ``` ## Best Practices for MP3 Audio Capture - **Quality vs. Size**: For voice recordings, lower bitrates (64-128 kbps) are usually sufficient. For music, use 192 kbps or higher. - **Sample Rate Selection**: 44.1 kHz is standard for most audio. Lower rates can be used for voice-only recordings. - **VBR vs. CBR**: VBR generally provides better quality-to-size ratio but might have compatibility issues with some players. - **Error Handling**: Always implement proper error handling around the recording process. - **User Feedback**: Provide visual feedback during recording (level meters, time elapsed). ## Conclusion Implementing MP3 audio capture in your applications provides users with a widely compatible and efficient recording solution. By properly configuring LAME encoder settings, you can balance audio quality and file size based on your application's specific requirements. The VideoCapture component makes this implementation straightforward in Delphi, C++ MFC, and VB6 applications, allowing you to focus on creating a great user experience around the audio capture functionality. --- For additional code samples and advanced implementation techniques, visit our GitHub repository. If you encounter any issues during implementation, contact our technical support team for assistance. 
---END OF PAGE--- # Local File: .\delphi\videocapture\audio-capture-wav.md --- title: Delphi WAV Audio Capture Implementation Guide description: Master WAV file audio capture in Delphi applications with this developer tutorial. Learn codec selection, audio parameter configuration, and implementation techniques for creating high-quality audio recording functionality. Includes sample code for mono/stereo recording, compression options, and troubleshooting tips. sidebar_label: Audio capture to WAV file --- # Audio Capture to WAV Files: Developer Implementation Guide ## Introduction Capturing audio to WAV files is a fundamental requirement for many multimedia applications. This guide provides detailed instructions for implementing audio capture functionality with or without compression in your applications. Whether you're developing in Delphi, C++ MFC, or VB6 using our ActiveX controls, this guide will walk you through the entire process from initial setup to final implementation. ## Setting Up Your Development Environment Before you begin implementing audio capture, ensure you have: 1. Installed the SDK in your development environment 2. Added the VideoCapture component to your form/project 3. Set up basic error handling to manage capture exceptions 4. Configured your application to access audio hardware ## Audio Codec Management ### Retrieving Available Audio Codecs The first step in implementing audio capture is to retrieve a list of available audio codecs on the system. This allows you to present users with codec options or to programmatically select the most appropriate codec for your application's needs. 
#### Delphi Implementation ```pascal // Iterate through all available audio codecs for i := 0 to VideoCapture1.Audio_Codecs_GetCount - 1 do cbAudioCodec.Items.Add(VideoCapture1.Audio_Codecs_GetItem(i)); ``` #### C++ MFC Implementation ```cpp // Get all available audio codecs and populate combo box for (int i = 0; i < m_VideoCapture.Audio_Codecs_GetCount(); i++) { CString codec = m_VideoCapture.Audio_Codecs_GetItem(i); m_AudioCodecCombo.AddString(codec); } ``` #### VB6 Implementation ```vb ' Iterate through all available audio codecs For i = 0 To VideoCapture1.Audio_Codecs_GetCount - 1 cboAudioCodec.AddItem VideoCapture1.Audio_Codecs_GetItem(i) Next i ``` ### Selecting an Audio Codec Once you've populated the list of available codecs, you'll need to provide a way to select the desired codec for the audio capture operation. This can be done programmatically or via user selection. #### Delphi Implementation ```pascal // Set the codec based on user selection from combo box VideoCapture1.Audio_Codec := cbAudioCodec.Items[cbAudioCodec.ItemIndex]; ``` #### C++ MFC Implementation ```cpp // Get the selected codec from the combo box int selectedIndex = m_AudioCodecCombo.GetCurSel(); CString selectedCodec; m_AudioCodecCombo.GetLBText(selectedIndex, selectedCodec); // Set the codec m_VideoCapture.SetAudio_Codec(selectedCodec); ``` #### VB6 Implementation ```vb ' Set the codec based on user selection VideoCapture1.Audio_Codec = cboAudioCodec.Text ``` ## Configuring Audio Parameters Proper audio parameter configuration is crucial for achieving the desired quality and file size balance. The three primary parameters to configure are channels, bits per sample (BPS), and sample rate. ### Setting Audio Channels Audio channels determine whether the captured audio is mono (1 channel) or stereo (2 channels). Stereo provides better spatial audio representation but requires more storage space. 
#### Delphi Implementation ```pascal // Set the number of audio channels (1 for mono, 2 for stereo) VideoCapture1.Audio_Channels := StrToInt(cbChannels2.Items[cbChannels2.ItemIndex]); ``` #### C++ MFC Implementation ```cpp // Set audio channels (1 for mono, 2 for stereo) int channels = _ttoi(m_ChannelsCombo.GetSelectedItem()); m_VideoCapture.SetAudio_Channels(channels); ``` #### VB6 Implementation ```vb ' Set audio channels (1 for mono, 2 for stereo) VideoCapture1.Audio_Channels = CInt(cboChannels.Text) ``` ### Configuring Bits Per Sample (BPS) The bits per sample (BPS) setting affects the dynamic range and quality of the audio. Common values include 8, 16, and 24 bits, with higher values providing better quality at the cost of larger file sizes. #### Delphi Implementation ```pascal // Set bits per sample (typically 8, 16, or 24) VideoCapture1.Audio_BPS := StrToInt(cbBPS2.Items[cbBPS2.ItemIndex]); ``` #### C++ MFC Implementation ```cpp // Set bits per sample int bps = _ttoi(m_BPSCombo.GetSelectedItem()); m_VideoCapture.SetAudio_BPS(bps); ``` #### VB6 Implementation ```vb ' Set bits per sample VideoCapture1.Audio_BPS = CInt(cboBPS.Text) ``` ### Setting Sample Rate The sample rate determines how many audio samples are captured per second. Common values include 8000 Hz, 44100 Hz (CD quality), and 48000 Hz (professional audio). Higher sample rates capture more high-frequency detail but increase file size. 
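Sample rate, bit depth, and channel count together determine the uncompressed PCM data rate, which is what an uncompressed WAV file consumes on disk. A quick sketch of the arithmetic:

```python
def pcm_bytes_per_second(sample_rate: int, bits_per_sample: int, channels: int) -> int:
    """Uncompressed PCM data rate: samples/sec * channels * bytes per sample."""
    return sample_rate * channels * bits_per_sample // 8

# CD-quality stereo: 44100 Hz, 16-bit, 2 channels
rate = pcm_bytes_per_second(44100, 16, 2)
print(rate)       # 176400 bytes per second
print(rate * 60)  # about 10.6 MB per minute of recording
```

This is why dropping from CD-quality stereo to 8 kHz mono for voice recordings reduces file size so dramatically.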
#### Delphi Implementation ```pascal // Set audio sample rate in Hz (common values: 8000, 44100, 48000) VideoCapture1.Audio_SampleRate := StrToInt(cbSamplerate.Items[cbSamplerate.ItemIndex]); ``` #### C++ MFC Implementation ```cpp // Set sample rate int sampleRate = _ttoi(m_SampleRateCombo.GetSelectedItem()); m_VideoCapture.SetAudio_SampleRate(sampleRate); ``` #### VB6 Implementation ```vb ' Set sample rate VideoCapture1.Audio_SampleRate = CInt(cboSampleRate.Text) ``` ## Configuring Output Format ### Selecting PCM/ACM Format The Windows Audio Compression Manager (ACM) supports various audio formats including PCM (uncompressed) and compressed formats. Setting the output format to PCM/ACM enables codec-based compression when a codec other than PCM is selected. #### Delphi Implementation ```pascal // Set output to PCM/ACM format to enable codec-based compression VideoCapture1.OutputFormat := Format_PCM_ACM; ``` #### C++ MFC Implementation ```cpp // Set output format to PCM/ACM m_VideoCapture.SetOutputFormat(Format_PCM_ACM); ``` #### VB6 Implementation ```vb ' Set output format to PCM/ACM VideoCapture1.OutputFormat = Format_PCM_ACM ``` ## Setting the Audio Capture Mode Before starting the capture operation, you need to set the component to audio capture mode. This ensures that only audio is captured without any video streams. ### Delphi Implementation ```pascal // Set to audio-only capture mode VideoCapture1.Mode := Mode_Audio_Capture; ``` ### C++ MFC Implementation ```cpp // Set to audio-only capture mode m_VideoCapture.SetMode(Mode_Audio_Capture); ``` ### VB6 Implementation ```vb ' Set to audio-only capture mode VideoCapture1.Mode = Mode_Audio_Capture ``` ## Starting the Audio Capture With all parameters configured, you can now start the audio capture process. This initializes the audio hardware, applies the selected codec and settings, and begins capturing audio to the specified output file. 
### Delphi Implementation ```pascal // Begin audio capture process VideoCapture1.Start; ``` ### C++ MFC Implementation ```cpp // Begin audio capture process m_VideoCapture.Start(); ``` ### VB6 Implementation ```vb ' Begin audio capture process VideoCapture1.Start ``` ## Advanced Implementation Considerations ### User Interface Integration To provide a better user experience, consider implementing: 1. Real-time audio level metering 2. Elapsed time display 3. File size estimation 4. Pause/resume functionality ### Performance Optimization For optimal performance when capturing extended audio sessions: 1. Monitor system memory usage 2. Implement file splitting for long recordings 3. Consider buffering strategies for high-quality captures ## Troubleshooting Common Issues When implementing audio capture, you might encounter these common issues: 1. **No audio devices detected**: Ensure proper hardware connections and drivers 2. **Poor audio quality**: Verify sample rate and bits per sample settings 3. **Codec compatibility issues**: Test with standard codecs like PCM or MP3 4. **High CPU usage**: Consider reducing sample rate or using hardware acceleration ## Conclusion Implementing audio capture to WAV files in your applications requires careful configuration of codecs, audio parameters, and output settings. By following this guide, you can create robust audio capture functionality that balances quality and file size requirements. For complex implementations or specific technical challenges, our support team is available to assist with custom solutions tailored to your application requirements. ## Additional Resources Visit our GitHub page for more code samples and implementation examples that demonstrate advanced audio capture techniques and integration patterns. --- For technical assistance with this implementation, please contact our support team. Additional code samples are available on our GitHub page. 
---END OF PAGE--- # Local File: .\delphi\videocapture\audio-output.md --- title: Delphi Audio Output Device Selection | VideoCapture description: Implement audio output device selection in Delphi applications with this complete guide. Learn volume control, balance adjustment, and proper device configuration with practical code examples for Delphi, C++ MFC, and VB6 developers. sidebar_label: Selecting Audio Output Devices --- # Audio Output Device Selection Implementation Guide This guide provides detailed instructions and code examples for implementing audio output device selection in your video capture applications. Delphi, C++ MFC, and VB6 implementations are covered to help you integrate this functionality into your projects efficiently. ## Available Audio Output Device Enumeration The first step in implementing audio output device selection is retrieving the complete list of available audio output devices on the system. This allows users to choose their preferred audio output device. 
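Enumerating the devices first also makes it easy to fall back gracefully when a user's previously saved device is no longer present, as recommended in the best practices below. A language-agnostic sketch of that selection logic (device names here are hypothetical):

```python
def pick_output_device(available, preferred):
    """Return the user's saved device if it is still present, else the first
    available device, else None when no output devices exist."""
    if preferred in available:
        return preferred
    return available[0] if available else None

devices = ["Speakers (High Definition Audio)", "Headphones (USB)"]
print(pick_output_device(devices, "Headphones (USB)"))  # saved device still present
print(pick_output_device(devices, "HDMI Output"))       # falls back to first device
```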
### Delphi Implementation ```pascal // Iterate through all available audio output devices for i := 0 to VideoCapture1.Audio_OutputDevices_GetCount - 1 do // Add each device to the dropdown list cbAudioOutputDevice.Items.Add(VideoCapture1.Audio_OutputDevices_GetItem(i)); ``` ### C++ MFC Implementation ```cpp // Populate the combobox with all available audio output devices for (int i = 0; i < m_VideoCapture.Audio_OutputDevices_GetCount(); i++) { CString deviceName = m_VideoCapture.Audio_OutputDevices_GetItem(i); m_AudioOutputDeviceCombo.AddString(deviceName); } ``` ### VB6 Implementation ```vb ' Iterate through all available audio output devices For i = 0 To VideoCapture1.Audio_OutputDevices_GetCount - 1 ' Add each device to the dropdown list cboAudioOutputDevice.AddItem VideoCapture1.Audio_OutputDevices_GetItem(i) Next i ``` ## Setting the Active Audio Output Device After retrieving the available devices, the next step is to set the selected device as the active audio output device for your application. ### Delphi Implementation ```pascal // Set the selected device as the active audio output device VideoCapture1.Audio_OutputDevice := cbAudioOutputDevice.Items[cbAudioOutputDevice.ItemIndex]; ``` ### C++ MFC Implementation ```cpp // Get the selected index from the combobox int selectedIndex = m_AudioOutputDeviceCombo.GetCurSel(); CString selectedDevice; m_AudioOutputDeviceCombo.GetLBText(selectedIndex, selectedDevice); // Set the selected device as the active audio output device m_VideoCapture.Audio_OutputDevice = selectedDevice; ``` ### VB6 Implementation ```vb ' Set the selected device as the active audio output device VideoCapture1.Audio_OutputDevice = cboAudioOutputDevice.Text ``` ## Enabling Audio Playback Once the output device is selected, you need to enable audio playback to hear the audio through the selected device. 
### Delphi Implementation ```pascal // Enable audio playback through the selected device VideoCapture1.Audio_PlayAudio := true; ``` ### C++ MFC Implementation ```cpp // Enable audio playback through the selected device m_VideoCapture.Audio_PlayAudio = TRUE; ``` ### VB6 Implementation ```vb ' Enable audio playback through the selected device VideoCapture1.Audio_PlayAudio = True ``` ## Adjusting Audio Volume Levels Providing volume control gives users the ability to customize their audio experience. This section shows how to implement volume adjustment. ### Delphi Implementation ```pascal // Set the volume level based on trackbar position VideoCapture1.Audio_OutputDevice_SetVolume(tbAudioVolume.Position); ``` ### C++ MFC Implementation ```cpp // Get the current position of the volume slider int volumeLevel = m_VolumeSlider.GetPos(); // Set the volume level based on slider position m_VideoCapture.Audio_OutputDevice_SetVolume(volumeLevel); ``` ### VB6 Implementation ```vb ' Set the volume level based on slider position VideoCapture1.Audio_OutputDevice_SetVolume sldVolume.Value ``` ## Controlling Audio Balance For stereo output, balance control allows users to adjust the relative volume between left and right channels. 
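Conceptually, a single balance value maps to a pair of per-channel gains. A sketch assuming a -100 (full left) to +100 (full right) scale; the component's actual range is not documented here, so treat the scale as an assumption and verify it against the API reference:

```python
def balance_to_gains(balance: int):
    """Map balance in [-100, 100] to (left_gain, right_gain), each in [0.0, 1.0].
    The center position leaves both channels at full gain; moving toward one
    side attenuates only the opposite channel."""
    balance = max(-100, min(100, balance))
    left = min(1.0, 1.0 - balance / 100.0)
    right = min(1.0, 1.0 + balance / 100.0)
    return left, right

print(balance_to_gains(0))    # centered: both channels at full gain
print(balance_to_gains(100))  # full right: left channel silenced
```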
### Delphi Implementation ```pascal // Set the balance level based on trackbar position VideoCapture1.Audio_OutputDevice_SetBalance(tbAudioBalance.Position); ``` ### C++ MFC Implementation ```cpp // Get the current position of the balance slider int balanceLevel = m_BalanceSlider.GetPos(); // Set the balance level based on slider position m_VideoCapture.Audio_OutputDevice_SetBalance(balanceLevel); ``` ### VB6 Implementation ```vb ' Set the balance level based on slider position VideoCapture1.Audio_OutputDevice_SetBalance sldBalance.Value ``` ## Best Practices for Audio Device Implementation - Always check if the audio device is valid before attempting to use it - Provide fallback mechanisms when the selected device becomes unavailable - Consider saving user preferences for audio device selection between sessions - Implement visual feedback when volume or balance settings are changed --- Please contact our [support team](https://support.visioforge.com/) if you need assistance with this implementation. Visit our [GitHub repository](https://github.com/visioforge/) for additional code samples and resources. ---END OF PAGE--- # Local File: .\delphi\videocapture\changelog.md --- title: TVFVideoCapture Library Version History description: Explore the complete evolution of our Delphi/ActiveX video capture library, featuring GPU acceleration, streaming capabilities, and format support. Track all updates from version 4.1 to 11.0 with detailed release notes and technical improvements. 
sidebar_label: Changelog --- # TVFVideoCapture Version History ## Release 11.00 - Enhanced GPU Encoding & Modern Delphi Support - **Expanded Framework Compatibility**: Added support for Delphi 10.4 and 11.0 development environments - **Advanced AMD GPU Acceleration**: Implemented MP4 (H264/AAC) video encoding utilizing AMD graphics processing units - **Intel GPU Hardware Encoding**: Added MP4 (H264/AAC) video encoding through Intel integrated and discrete GPUs - **NVIDIA CUDA Acceleration**: Introduced MP4 (H264/AAC) video encoding powered by NVIDIA graphics hardware - **Container Format Improvements**: Enhanced MKV output with optimized performance and reliability - **New Output Format**: Added MOV container format support for Apple ecosystem compatibility ## Release 10.0 - Performance Optimizations & Multi-Platform Support - **MP4 Enhancement**: Thoroughly updated and improved MP4 output capabilities - **Streaming Improvements**: Updated VLC source filter with enhanced RTMP and HTTPS support - **Memory Management**: Fixed critical CUDA encoder memory leak for stable long-duration encoding - **Resource Optimization**: Resolved FFMPEG source memory leak for improved application stability - **Audio Capture**: Enhanced What You Hear filter for superior system audio recording - **64-bit Architecture**: Added x64 VLC source for TVFMediaPlayer and TVFVideoCapture (both Delphi and ActiveX) - **Extended Format Support**: Enhanced YUV2RGB filter with HDYC format support - **Audio Encoding**: Updated LAME encoder with fix for low bitrate mono audio issues - **Development Environment**: Added Delphi 10, 10.1 support for modern development workflows ## Release 8.7 - Core Engine Updates - **VLC Integration**: Updated VLC engine to libVLC 2.2.1.0 for improved streaming capabilities - **Decoder Enhancement**: Updated FFMPEG engine for better format compatibility and performance ## Release 8.6 - Reliability Improvements & Format Support - **Resource Management**: Fixed critical 
memory leak for improved application stability - **File Handling**: Resolved issues with improperly closed input and output files - **New Format Support**: Added custom WebM filters based on the WebM project specifications ## Release 8.4 - Architecture Expansion - **Modern Delphi**: Added Delphi XE8 support for latest development environments - **64-bit Architecture**: Introduced Delphi and ActiveX x64 versions for performance on modern systems ## Release 8.31 - Development Environment Update - **Framework Compatibility**: Added Delphi XE7 support for expanded development options ## Release 8.3 - API and Performance Improvements - **Interface Enhancement**: Updated ActiveX API for improved developer experience - **Decoder Optimization**: Enhanced FFMPEG decoder for better performance and format support - **Stability**: Implemented several critical bug fixes and performance improvements ## Release 8.0 - Streaming Capabilities - **Network Streaming**: Introduced VLC engine for IP video capture capabilities - **Reliability**: Fixed several bugs for improved stability across all components ## Release 7.15 - Advanced Output Options & Security - **Network Capture**: Improved IP capture engine for better connection stability and performance - **Modern Format Support**: Added MP4 with H264/AAC output for industry-standard compatibility - **Security Feature**: Implemented video encryption for protected content workflows - **System Integration**: Added Virtual Camera output for software integration scenarios - **Stability**: Multiple small bug fixes for improved reliability ## Release 7.0 - Capture Engine Improvements - **Network Performance**: Enhanced IP capture engine with improved throughput and reliability - **Desktop Capture**: Updated screen capture engine for better performance and quality - **Output Options**: Enhanced FFMPEG output for expanded format support - **Visual Effects**: Added Pan/zoom video effect for advanced video manipulation - **Reliability**: 
Implemented multiple small bug fixes for improved stability ## Release 6.0 - Multi-Source & Windows 8 Compatibility - **Advanced Compositing**: Improved Picture-In-Picture with support for any video source including screen capture and IP cameras - **Streaming Protocol**: Enhanced RTSP sources support for better network video integration - **Special Capture Mode**: Added layered windows screen capture support for complex UI recording - **Hardware Support**: Implemented iCube cameras support for specialized imaging applications - **OS Compatibility**: Added Windows 8 Developer Preview support for forward compatibility - **Visual Processing**: Enhanced video effects with new options and improved performance - **Audio Management**: Introduced multiple audio stream support for AVI and WMV outputs ## Release 5.5 - Stability & Feature Enhancements - **Visual Processing**: Enhanced video effects with improved quality and performance - **Network Video**: Improved IP cameras support for better connectivity and compatibility - **Reliability**: Fixed several bugs for improved overall stability ## Release 5.4 - Modern Delphi Support - **Development Environment**: Added Delphi XE2 support for modern application development - **Stability**: Implemented several bug fixes for improved reliability ## Release 5.3 - Video Processing Improvements - **Visual Effects**: Enhanced video effects with additional options and better performance - **Network Video**: Improved IP cameras support for wider device compatibility - **Reliability**: Fixed multiple bugs for more stable operation ## Release 5.2 - Frame Processing Enhancements - **Visual Effects**: Improved video effects and video frame grabber functionality - **Stability**: Fixed several bugs for enhanced reliability ## Release 5.1 - Network Video & Effects Improvements - **IP Camera Integration**: Enhanced IP camera support for improved connectivity - **Visual Processing**: Improved video effect quality and performance - 
**Reliability**: Fixed various issues for better stability ## Release 5.0 - Major Format Support Expansion - **Network Video**: Added RTSP/HTTP IP camera support (MJPEG/MPEG-4/H264 with or without audio) - **Modern Format**: Implemented WebM output for open web standards compatibility - **Format Flexibility**: Added MPEG-1/2/4 and FLV output using FFMPEG integration ## Release 4.22 - Screen Capture Improvements - **Desktop Recording**: Fixed bugs in screen capture filter for improved recording quality ## Release 4.21 - Screen Capture Enhancements - **Desktop Recording**: Implemented multiple bug fixes and improvements in screen capture filter ## Release 4.2 - Audio Processing Improvement - **Sound Effects**: Enhanced audio effects filter with improved quality and performance ## Release 4.1 - Modern Delphi Integration - **Development Environment**: Added Delphi 2010 support for the Delphi edition - **Stability**: Fixed several bugs for improved reliability ---END OF PAGE--- # Local File: .\delphi\videocapture\custom-output.md --- title: DirectShow Output Formats in Delphi - Complete Guide description: Master DirectShow output formats implementation in Delphi, C++ MFC and VB6 applications. Step-by-step guide with practical code examples for integrating third-party filters, codecs, and multiplexers for video capture and processing sidebar_label: Custom output formats --- # Code sample - Custom output formats Delphi, C++ MFC, and VB6 sample code. Currently, there are several options for connecting third-party DirectShow filters to produce the required output format. ## The first option - 3 different DirectShow filters The audio codec, the video codec, and the multiplexer are each a separate filter. Both DirectShow filters and regular codecs can be used as the codecs. ## The second option - an all-in-one DirectShow filter The multiplexer, the video codec, and the audio codec are combined in a single filter.
A further distinction is whether the chosen filter can write to a file by itself, whether it must be paired with the standard File Writer filter, or whether it requires a special file writer filter. In the first two cases, VisioForge Video Capture detects this automatically and sets the necessary parameters; in the third case, you must specify the file writer filter yourself. Now, let us see what the code for the different options looks like. ## First option Get lists of audio and video codecs ```pascal for I := 0 to VideoCapture1.Video_Codecs_GetCount - 1 do cbCustomVideoCodec.Items.Add(VideoCapture1.Video_Codecs_GetItem(i)); for I := 0 to VideoCapture1.Audio_Codecs_GetCount - 1 do cbCustomAudioCodec.Items.Add(VideoCapture1.Audio_Codecs_GetItem(i)); ``` ```cpp // C++ MFC for (int i = 0; i < m_VideoCapture.Video_Codecs_GetCount(); i++) m_CustomVideoCodecCombo.AddString(m_VideoCapture.Video_Codecs_GetItem(i)); for (int i = 0; i < m_VideoCapture.Audio_Codecs_GetCount(); i++) m_CustomAudioCodecCombo.AddString(m_VideoCapture.Audio_Codecs_GetItem(i)); ``` ```vb ' VB6 For i = 0 To VideoCapture1.Video_Codecs_GetCount - 1 cbCustomVideoCodec.AddItem VideoCapture1.Video_Codecs_GetItem(i) Next i For i = 0 To VideoCapture1.Audio_Codecs_GetCount - 1 cbCustomAudioCodec.AddItem VideoCapture1.Audio_Codecs_GetItem(i) Next i ``` Get the list of DirectShow filters ```pascal for I := 0 to VideoCapture1.DirectShow_Filters_GetCount - 1 do begin cbCustomDSFilterV.Items.Add(VideoCapture1.DirectShow_Filters_GetItem(i)); cbCustomDSFilterA.Items.Add(VideoCapture1.DirectShow_Filters_GetItem(i)); cbCustomMuxer.Items.Add(VideoCapture1.DirectShow_Filters_GetItem(i)); cbCustomFilewriter.Items.Add(VideoCapture1.DirectShow_Filters_GetItem(i)); end; ``` ```cpp // C++ MFC for (int i = 0; i < m_VideoCapture.DirectShow_Filters_GetCount(); i++) { m_CustomDSFilterVCombo.AddString(m_VideoCapture.DirectShow_Filters_GetItem(i)); m_CustomDSFilterACombo.AddString(m_VideoCapture.DirectShow_Filters_GetItem(i));
m_CustomMuxerCombo.AddString(m_VideoCapture.DirectShow_Filters_GetItem(i)); m_CustomFilewriterCombo.AddString(m_VideoCapture.DirectShow_Filters_GetItem(i)); } ``` ```vb ' VB6 For i = 0 To VideoCapture1.DirectShow_Filters_GetCount - 1 cbCustomDSFilterV.AddItem VideoCapture1.DirectShow_Filters_GetItem(i) cbCustomDSFilterA.AddItem VideoCapture1.DirectShow_Filters_GetItem(i) cbCustomMuxer.AddItem VideoCapture1.DirectShow_Filters_GetItem(i) cbCustomFilewriter.AddItem VideoCapture1.DirectShow_Filters_GetItem(i) Next i ``` Select filters and codecs ```pascal if rbCustomUseVideoCodecsCat.Checked then begin VideoCapture1.Custom_Output_Video_Codec := cbCustomVideoCodec.Items[cbCustomVideoCodec.ItemIndex]; VideoCapture1.Custom_Output_Video_Codec_Use_Filters_Category := false; end else begin VideoCapture1.Custom_Output_Video_Codec := cbCustomDSFilterV.Items[cbCustomDSFilterV.ItemIndex]; VideoCapture1.Custom_Output_Video_Codec_Use_Filters_Category := true; end; if rbCustomUseAudioCodecsCat.Checked then begin VideoCapture1.Custom_Output_Audio_Codec := cbCustomAudioCodec.Items[cbCustomAudioCodec.ItemIndex]; VideoCapture1.Custom_Output_Audio_Codec_Use_Filters_Category := false; end else begin VideoCapture1.Custom_Output_Audio_Codec := cbCustomDSFilterA.Items[cbCustomDSFilterA.ItemIndex]; VideoCapture1.Custom_Output_Audio_Codec_Use_Filters_Category := true; end; VideoCapture1. 
Custom_Output_Mux_Filter_Name := cbCustomMuxer.Items[cbCustomMuxer.ItemIndex]; ``` ```cpp // C++ MFC if (m_CustomUseVideoCodecsCat.GetCheck()) { CString videoCodec; m_CustomVideoCodecCombo.GetLBText(m_CustomVideoCodecCombo.GetCurSel(), videoCodec); m_VideoCapture.Custom_Output_Video_Codec = videoCodec; m_VideoCapture.Custom_Output_Video_Codec_Use_Filters_Category = false; } else { CString videoCodec; m_CustomDSFilterVCombo.GetLBText(m_CustomDSFilterVCombo.GetCurSel(), videoCodec); m_VideoCapture.Custom_Output_Video_Codec = videoCodec; m_VideoCapture.Custom_Output_Video_Codec_Use_Filters_Category = true; } if (m_CustomUseAudioCodecsCat.GetCheck()) { CString audioCodec; m_CustomAudioCodecCombo.GetLBText(m_CustomAudioCodecCombo.GetCurSel(), audioCodec); m_VideoCapture.Custom_Output_Audio_Codec = audioCodec; m_VideoCapture.Custom_Output_Audio_Codec_Use_Filters_Category = false; } else { CString audioCodec; m_CustomDSFilterACombo.GetLBText(m_CustomDSFilterACombo.GetCurSel(), audioCodec); m_VideoCapture.Custom_Output_Audio_Codec = audioCodec; m_VideoCapture.Custom_Output_Audio_Codec_Use_Filters_Category = true; } CString muxerName; m_CustomMuxerCombo.GetLBText(m_CustomMuxerCombo.GetCurSel(), muxerName); m_VideoCapture.Custom_Output_Mux_Filter_Name = muxerName; ``` ```vb ' VB6 If rbCustomUseVideoCodecsCat.Value Then VideoCapture1.Custom_Output_Video_Codec = cbCustomVideoCodec.List(cbCustomVideoCodec.ListIndex) VideoCapture1.Custom_Output_Video_Codec_Use_Filters_Category = False Else VideoCapture1.Custom_Output_Video_Codec = cbCustomDSFilterV.List(cbCustomDSFilterV.ListIndex) VideoCapture1.Custom_Output_Video_Codec_Use_Filters_Category = True End If If rbCustomUseAudioCodecsCat.Value Then VideoCapture1.Custom_Output_Audio_Codec = cbCustomAudioCodec.List(cbCustomAudioCodec.ListIndex) VideoCapture1.Custom_Output_Audio_Codec_Use_Filters_Category = False Else VideoCapture1.Custom_Output_Audio_Codec = cbCustomDSFilterA.List(cbCustomDSFilterA.ListIndex) 
VideoCapture1.Custom_Output_Audio_Codec_Use_Filters_Category = True End If VideoCapture1.Custom_Output_Mux_Filter_Name = cbCustomMuxer.List(cbCustomMuxer.ListIndex) ``` ## Second option Get lists of DirectShow filters. ```pascal for I := 0 to VideoCapture1.DirectShow_Filters_GetCount - 1 do begin cbCustomDSFilterV.Items.Add(VideoCapture1.DirectShow_Filters_GetItem(i)); cbCustomDSFilterA.Items.Add(VideoCapture1.DirectShow_Filters_GetItem(i)); cbCustomMuxer.Items.Add(VideoCapture1.DirectShow_Filters_GetItem(i)); cbCustomFilewriter.Items.Add(VideoCapture1.DirectShow_Filters_GetItem(i)); end; ``` ```cpp // C++ MFC for (int i = 0; i < m_VideoCapture.DirectShow_Filters_GetCount(); i++) { m_CustomDSFilterVCombo.AddString(m_VideoCapture.DirectShow_Filters_GetItem(i)); m_CustomDSFilterACombo.AddString(m_VideoCapture.DirectShow_Filters_GetItem(i)); m_CustomMuxerCombo.AddString(m_VideoCapture.DirectShow_Filters_GetItem(i)); m_CustomFilewriterCombo.AddString(m_VideoCapture.DirectShow_Filters_GetItem(i)); } ``` ```vb ' VB6 For i = 0 To VideoCapture1.DirectShow_Filters_GetCount - 1 cbCustomDSFilterV.AddItem VideoCapture1.DirectShow_Filters_GetItem(i) cbCustomDSFilterA.AddItem VideoCapture1.DirectShow_Filters_GetItem(i) cbCustomMuxer.AddItem VideoCapture1.DirectShow_Filters_GetItem(i) cbCustomFilewriter.AddItem VideoCapture1.DirectShow_Filters_GetItem(i) Next i ``` Select multiplexer (mux) filter ```pascal VideoCapture1.Custom_Output_Mux_Filter_Name := cbCustomMuxer.Items[cbCustomMuxer.ItemIndex]; VideoCapture1.Custom_Output_Mux_Filter_Is_Encoder := cbCustomMuxFilterIsEncoder.Checked; ``` ```cpp // C++ MFC CString muxerName; m_CustomMuxerCombo.GetLBText(m_CustomMuxerCombo.GetCurSel(), muxerName); m_VideoCapture.Custom_Output_Mux_Filter_Name = muxerName; m_VideoCapture.Custom_Output_Mux_Filter_Is_Encoder = m_CustomMuxFilterIsEncoder.GetCheck(); ``` ```vb ' VB6 VideoCapture1.Custom_Output_Mux_Filter_Name = cbCustomMuxer.List(cbCustomMuxer.ListIndex) 
VideoCapture1.Custom_Output_Mux_Filter_Is_Encoder = cbCustomMuxFilterIsEncoder.Value ``` If you need a special File Writer filter, you should specify it. This is true for both options described above. ```pascal VideoCapture1.Custom_Output_Special_FileWriter_Needed := cbUseSpecialFilewriter.Checked; VideoCapture1.Custom_Output_Special_FileWriter_Filter_Name := cbCustomFilewriter.Items[cbCustomFilewriter.ItemIndex]; ``` ```cpp // C++ MFC m_VideoCapture.Custom_Output_Special_FileWriter_Needed = m_UseSpecialFilewriter.GetCheck(); CString fileWriterName; m_CustomFilewriterCombo.GetLBText(m_CustomFilewriterCombo.GetCurSel(), fileWriterName); m_VideoCapture.Custom_Output_Special_FileWriter_Filter_Name = fileWriterName; ``` ```vb ' VB6 VideoCapture1.Custom_Output_Special_FileWriter_Needed = cbUseSpecialFilewriter.Value VideoCapture1.Custom_Output_Special_FileWriter_Filter_Name = cbCustomFilewriter.List(cbCustomFilewriter.ListIndex) ``` Start capture ```pascal VideoCapture1.Start; ``` ```cpp // C++ MFC m_VideoCapture.Start(); ``` ```vb ' VB6 VideoCapture1.Start ``` --- Please get in touch with [support](https://support.visioforge.com/) to get help with this tutorial. Visit our [GitHub](https://github.com/visioforge/) page to get more code samples. ---END OF PAGE--- # Local File: .\delphi\videocapture\deployment.md --- title: TVFVideoCapture Deployment Guide for Delphi description: Complete step-by-step deployment instructions for the TVFVideoCapture library in Delphi projects. Learn how to properly install necessary components, register DirectShow filters, and configure your development environment for successful application deployment. sidebar_label: Deployment --- # Complete TVFVideoCapture Library Deployment Guide When distributing applications built with the TVFVideoCapture library, you'll need to deploy several framework components to ensure proper functionality on end-user systems. This guide covers all deployment scenarios to help you create reliable installations. 
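Most of the manual deployment steps in this guide reduce to copying redistributable files next to your application and registering the `.ax` filters with `regsvr32.exe`. As a rough sketch of how an installer script might generate those registration commands — the `FILTER_DIR` path and the exact filter list are illustrative placeholders, not fixed requirements — consider:

```shell
#!/bin/sh
# Sketch of a deployment helper: emits the regsvr32 commands for a set of
# DirectShow filters. FILTER_DIR is an illustrative install path; substitute
# your application's copy of the Redist\Filters folder.
FILTER_DIR='C:\Program Files\MyApp\Filters'

# Build one registration command; /s runs regsvr32 silently, which suits
# unattended installers.
reg_cmd() {
    printf 'regsvr32.exe /s "%s\\%s"\n' "$FILTER_DIR" "$1"
}

# A subset of the base-package filters; extend with the full list your
# build requires.
for f in VisioForge_Audio_Effects_4.ax VisioForge_Dump.ax \
         VisioForge_RGB2YUV.ax VisioForge_Screen_Capture.ax \
         VisioForge_Video_Effects_Pro.ax; do
    reg_cmd "$f"
done
```

The generated commands must still be executed on the target machine with administrator privileges, as noted throughout this guide.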
## Deployment Options Overview You have two primary approaches for deploying the necessary components: automatic installers for simpler deployment or manual installation for more customized setups. ## Automatic Silent Installers (Requires Admin Rights) These pre-configured installers handle dependencies automatically and can be integrated into your application's installation process: ### Essential Components - **Base Package** (mandatory for all deployments) - [Delphi Version](http://files.visioforge.com/redists_delphi/redist_video_capture_base_delphi.exe) - [ActiveX Version](http://files.visioforge.com/redists_delphi/redist_video_capture_base_ax.exe) ### Optional Feature Components - **FFMPEG Package** (required for file or IP camera sources) - [x86 Architecture](http://files.visioforge.com/redists_delphi/redist_video_capture_ffmpeg.exe) - **MP4 Output Support** - [x86 Architecture](https://files.visioforge.com/redists_delphi/redist_video_capture_mp4.exe) - **VLC Source Package** (alternative option for file or IP camera sources) - [x86 Architecture](http://files.visioforge.com/redists_delphi/redist_video_capture_vlc.exe) ## Manual Installation Process (Requires Admin Rights) For more control over the deployment process, follow these detailed steps: ### Step 1: Install Required Dependencies 1. Deploy Visual C++ 2010 SP1 redistributables: - [x86 Architecture](http://files.visioforge.com/shared/vcredist_2010_x86.exe) - [x64 Architecture](http://files.visioforge.com/shared/vcredist_2010_x64.exe) ### Step 2: Deploy Core Components 1. Copy all Media Foundation Platform (MFP) DLLs from the `Redist\Filters` directory to your application folder 2. 
For ActiveX implementations: copy and register the OCX file using [regsvr32.exe](https://support.microsoft.com/en-us/help/249873/how-to-use-the-regsvr32-tool-and-troubleshoot-regsvr32-error-messages) ### Step 3: Register DirectShow Filters Using [regsvr32.exe](https://support.microsoft.com/en-us/help/249873/how-to-use-the-regsvr32-tool-and-troubleshoot-regsvr32-error-messages), register these essential DirectShow filters: - `VisioForge_Audio_Effects_4.ax` - `VisioForge_Dump.ax` - `VisioForge_RGB2YUV.ax` - `VisioForge_Screen_Capture.ax` - `VisioForge_Video_Effects_Pro.ax` - `VisioForge_Video_Mixer.ax` - `VisioForge_Video_Resize.ax` - `VisioForge_WavDest.ax` - `VisioForge_YUV2RGB.ax` - `VisioForge_FFMPEG_Source.ax` > **Important:** Add the filter directory to the system PATH environment variable if your application executable resides in a different folder. ## Advanced Component Installation ### FFMPEG Integration 1. Copy all files from the `Redist\FFMPEG` folder to your deployment 2. Add the FFMPEG folder to the Windows system PATH variable 3. Register all .ax files from the FFMPEG folder ### VLC Integration 1. Copy all files from the `Redist\VLC` folder 2. Register the included .ax file using regsvr32.exe 3. Create an environment variable named `VLC_PLUGIN_PATH` pointing to the `VLC\plugins` directory ### Audio Output Support (LAME) 1. Copy `lame.ax` from the `Redist\Formats` folder 2. Register the `lame.ax` file using regsvr32.exe ### Container Format Support - **WebM Support:** Install free codecs from [xiph.org](https://www.xiph.org) - **Matroska Support:** Deploy `Haali Matroska Splitter` ### MP4 Output Configuration #### Modern Encoder Setup 1. Copy the appropriate library files: - `libmfxsw32.dll` (for 32-bit deployments) - `libmfxsw64.dll` (for 64-bit deployments) 2. Register required components: - `VisioForge_H264_Encoder.ax` - `VisioForge_MP4_Muxer.ax` - `VisioForge_AAC_Encoder.ax` - `VisioForge_Video_Resize.ax` #### Legacy Encoder Setup (for older systems) 1.
Copy appropriate library files: - `libmfxxp32.dll` (for 32-bit deployments) - `libmfxxp64.dll` (for 64-bit deployments) 2. Register required components: - `VisioForge_H264_Encoder_XP.ax` - `VisioForge_MP4_Muxer_XP.ax` - `VisioForge_AAC_Encoder_XP.ax` - `VisioForge_Video_Resize.ax` ## Bulk Registration Utility To simplify DirectShow filter registration, you can use the `reg_special.exe` utility from the framework setup. Place this executable in your filter directory and run it with administrator privileges to register all filters at once. --- For additional code samples and implementation examples, visit our [GitHub repository](https://github.com/visioforge/). If you encounter any difficulties with deployment, please contact [technical support](https://support.visioforge.com/) for personalized assistance. ---END OF PAGE--- # Local File: .\delphi\videocapture\dv-camcorder.md --- title: DV Camcorder Integration in Delphi Applications description: Master DV camcorder control in your Delphi applications with the TVFVideoCapture component. This guide provides detailed code examples for playback, navigation, and transport controls with practical implementations for real-world development scenarios. sidebar_label: DV camcorder control --- # Complete Guide to DV Camcorder Control This developer guide demonstrates how to effectively integrate and control Digital Video (DV) camcorders in your applications using the TVFVideoCapture component. The examples below include implementations for Delphi, C++ MFC, and Visual Basic 6, allowing you to choose the development environment that best suits your project requirements. ## Prerequisites for Implementation Before using any of the DV control commands, you must initialize your video capture system by starting either the video preview or capture process. This establishes the necessary connection between your application and the DV device. 
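The prerequisite above — transport commands are only valid once preview or capture is running — can be enforced with a small guard wrapper so that misordered calls fail loudly instead of silently doing nothing. This is an illustrative sketch only; the `DvController` class and its member names are not part of the SDK:

```cpp
#include <cassert>
#include <stdexcept>
#include <string>

// Hypothetical guard around DV transport commands: refuses to send a
// command until preview or capture has been started, mirroring the
// prerequisite described above.
enum class DvCommand { Play, Pause, Stop, FastForward, Rewind,
                       StepForward, StepBackward };

class DvController {
public:
    void startPreview() { running_ = true; }   // stand-in for starting preview/capture
    void shutdown()     { running_ = false; }

    // Returns the name of the command that would be sent; throws if the
    // capture pipeline has not been started yet.
    std::string send(DvCommand cmd) const {
        if (!running_)
            throw std::runtime_error(
                "start preview or capture before sending DV commands");
        switch (cmd) {
            case DvCommand::Play:         return "DV_PLAY";
            case DvCommand::Pause:        return "DV_PAUSE";
            case DvCommand::Stop:         return "DV_STOP";
            case DvCommand::FastForward:  return "DV_FF";
            case DvCommand::Rewind:       return "DV_REW";
            case DvCommand::StepForward:  return "DV_STEP_FW";
            case DvCommand::StepBackward: return "DV_STEP_REV";
        }
        return "";
    }

private:
    bool running_ = false;
};
```

In a real application, each branch would forward the corresponding constant to `VideoCapture1.DV_SendCommand` instead of returning its name.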
## DV Transport Control Commands The following sections provide detailed implementation examples for each of the essential DV transport control functions, allowing you to create professional video manipulation applications. ### Starting Playback Initiate standard playback of your DV content with the `DV_PLAY` command. This command starts playback at normal speed and is essential for basic video viewing functionality. ```pascal VideoCapture1.DV_SendCommand(DV_PLAY); ``` ```cpp // C++ MFC m_VideoCapture.DV_SendCommand(DV_PLAY); ``` ```vb ' VB6 VideoCapture1.DV_SendCommand DV_PLAY ``` ### Pausing Video Playback Temporarily suspend video playback while maintaining the current position with the `DV_PAUSE` command. This is useful for implementing frame analysis or allowing users to examine specific content. ```pascal VideoCapture1.DV_SendCommand(DV_PAUSE); ``` ```cpp // C++ MFC m_VideoCapture.DV_SendCommand(DV_PAUSE); ``` ```vb ' VB6 VideoCapture1.DV_SendCommand DV_PAUSE ``` ### Stopping Playback Completely halt playback and reset the DV device to a ready state using the `DV_STOP` command. This typically returns the playback position to the beginning of the current section. ```pascal VideoCapture1.DV_SendCommand(DV_STOP); ``` ```cpp // C++ MFC m_VideoCapture.DV_SendCommand(DV_STOP); ``` ```vb ' VB6 VideoCapture1.DV_SendCommand DV_STOP ``` ### Advanced Navigation Controls #### Fast Forward Operation Rapidly advance through content with the `DV_FF` command. This allows users to quickly navigate to specific sections of the video. ```pascal VideoCapture1.DV_SendCommand(DV_FF); ``` ```cpp // C++ MFC m_VideoCapture.DV_SendCommand(DV_FF); ``` ```vb ' VB6 VideoCapture1.DV_SendCommand DV_FF ``` #### Rewind Operation Move backward through content at high speed with the `DV_REW` command. This function enables efficient navigation to previous sections of video. 
```pascal VideoCapture1.DV_SendCommand(DV_REW); ``` ```cpp // C++ MFC m_VideoCapture.DV_SendCommand(DV_REW); ``` ```vb ' VB6 VideoCapture1.DV_SendCommand DV_REW ``` ## Frame-by-Frame Navigation For precision video analysis and editing applications, these commands enable frame-accurate navigation. ### Forward Frame Step Advance exactly one frame forward with the `DV_STEP_FW` command. This enables precise frame analysis and is essential for detailed video editing applications. ```pascal VideoCapture1.DV_SendCommand(DV_STEP_FW); ``` ```cpp // C++ MFC m_VideoCapture.DV_SendCommand(DV_STEP_FW); ``` ```vb ' VB6 VideoCapture1.DV_SendCommand DV_STEP_FW ``` ### Backward Frame Step Move exactly one frame backward with the `DV_STEP_REV` command. This complements the forward step function and allows for bidirectional frame-accurate navigation. ```pascal VideoCapture1.DV_SendCommand(DV_STEP_REV); ``` ```cpp // C++ MFC m_VideoCapture.DV_SendCommand(DV_STEP_REV); ``` ```vb ' VB6 VideoCapture1.DV_SendCommand DV_STEP_REV ``` ## Implementation Best Practices When integrating DV control functionality into your applications, consider the following practices: 1. Always verify device connectivity before sending commands 2. Implement proper error handling for cases when commands fail 3. Provide visual feedback to users when transport control states change 4. Consider implementing keyboard shortcuts for common DV control operations ## Additional Resources For more detailed information and advanced implementation techniques, explore our additional documentation and code repositories. Please contact our support team if you need assistance with implementation. Visit our GitHub repository for additional code samples and example projects. ---END OF PAGE--- # Local File: .\delphi\videocapture\fm-radio-tv-tuning.md --- title: Delphi FM Radio & TV Tuner Implementation Guide description: Master FM radio and TV tuning in Delphi applications with this detailed developer guide. 
Learn to implement channel scanning, frequency management, signal detection, and cross-platform code examples for both beginners and experienced programmers. sidebar_label: FM radio and TV tuning --- # Implementing FM Radio and TV Tuning in Delphi Applications ## Introduction to TV and Radio Tuning This guide provides detailed implementation examples for Delphi developers working with FM radio and TV tuning functionality. We've included equivalent code samples for C++ MFC and VB6 to support cross-platform development needs. ## Device Management ### Retrieving Available TV Tuners The first step in implementing tuner functionality is identifying available hardware devices: ```pascal // Iterate through all connected TV Tuner devices and populate dropdown for I := 0 to VideoCapture1.TVTuner_Devices_GetCount - 1 do cbTVTuner.Items.Add(VideoCapture1.TVTuner_Devices_GetItem(i)); ``` ```cpp // C++ MFC implementation for retrieving TV Tuner devices for (int i = 0; i < m_VideoCapture.TVTuner_Devices_GetCount(); i++) m_cbTVTuner.AddString(m_VideoCapture.TVTuner_Devices_GetItem(i)); ``` ```vb ' VB6 implementation for device enumeration For i = 0 To VideoCapture1.TVTuner_Devices_GetCount - 1 cbTVTuner.AddItem VideoCapture1.TVTuner_Devices_GetItem(i) Next i ``` ### Enumerating TV Format Support Different regions use different broadcast standards. Your application should detect and handle these formats: ```pascal // Load available TV formats (PAL, NTSC, SECAM, etc.) 
for I := 0 to VideoCapture1.TVTuner_TVFormats_GetCount - 1 do cbTVSystem.Items.Add(VideoCapture1.TVTuner_TVFormats_GetItem(i)); ``` ```cpp // C++ MFC - Populate TV format dropdown with available standards for (int i = 0; i < m_VideoCapture.TVTuner_TVFormats_GetCount(); i++) m_cbTVSystem.AddString(m_VideoCapture.TVTuner_TVFormats_GetItem(i)); ``` ```vb ' VB6 - Get supported TV formats for the selected tuner For i = 0 To VideoCapture1.TVTuner_TVFormats_GetCount - 1 cbTVSystem.AddItem VideoCapture1.TVTuner_TVFormats_GetItem(i) Next i ``` ### Country-Specific Configuration Broadcasting standards vary by country, so your application should provide appropriate region selection: ```pascal // Load country/region list for localized tuning parameters for I := 0 to VideoCapture1.TVTuner_Countries_GetCount - 1 do cbTVCountry.Items.Add(VideoCapture1.TVTuner_Countries_GetItem(i)); ``` ```cpp // C++ MFC - Build country selection list for regional settings for (int i = 0; i < m_VideoCapture.TVTuner_Countries_GetCount(); i++) m_cbTVCountry.AddString(m_VideoCapture.TVTuner_Countries_GetItem(i)); ``` ```vb ' VB6 - Populate country dropdown for regional broadcast settings For i = 0 To VideoCapture1.TVTuner_Countries_GetCount - 1 cbTVCountry.AddItem VideoCapture1.TVTuner_Countries_GetItem(i) Next i ``` ## Device Configuration ### Selecting a TV Tuner Device Once you've enumerated the available devices, users can select their preferred tuner: ```pascal // Set the active tuner device based on user selection VideoCapture1.TVTuner_Name := cbTVTuner.Items[cbTVTuner.ItemIndex]; ``` ```cpp // C++ MFC - Apply user's tuner device selection CString strText; m_cbTVTuner.GetLBText(m_cbTVTuner.GetCurSel(), strText); m_VideoCapture.put_TVTuner_Name(strText); ``` ```vb ' VB6 - Set selected tuner as active device VideoCapture1.TVTuner_Name = cbTVTuner.Text ``` ### Reading Current Tuner Configuration After selecting a device, you'll need to read its current settings: ```pascal // Initialize tuner and 
read current configuration VideoCapture1.TVTuner_Read; ``` ```cpp // C++ MFC - Load current tuner settings into application m_VideoCapture.TVTuner_Read(); ``` ```vb ' VB6 - Read tuner configuration after device selection VideoCapture1.TVTuner_Read ``` ### Available Operation Modes Tuners support different modes like TV, FM Radio, etc: ```pascal // Populate operation mode dropdown with available options for I := 0 to VideoCapture1.TVTuner_Modes_GetCount - 1 do cbTVMode.Items.Add(VideoCapture1.TVTuner_Modes_GetItem(i)); ``` ```cpp // C++ MFC - Get supported operational modes for this device for (int i = 0; i < m_VideoCapture.TVTuner_Modes_GetCount(); i++) m_cbTVMode.AddString(m_VideoCapture.TVTuner_Modes_GetItem(i)); ``` ```vb ' VB6 - List available tuner modes (TV, FM Radio, etc) For i = 0 To VideoCapture1.TVTuner_Modes_GetCount - 1 cbTVMode.AddItem VideoCapture1.TVTuner_Modes_GetItem(i) Next i ``` ## Frequency Management ### Reading Current Frequencies Display the current audio and video frequencies to provide user feedback: ```pascal // Display current video and audio frequencies in Hz edVideoFreq.Text := IntToStr(VideoCapture1.TVTuner_VideoFrequency); edAudiofreq.Text := IntToStr(VideoCapture1.TVTuner_AudioFrequency); ``` ```cpp // C++ MFC - Show current frequency values in the interface CString strFreq; strFreq.Format(_T("%d"), m_VideoCapture.get_TVTuner_VideoFrequency()); m_edVideoFreq.SetWindowText(strFreq); strFreq.Format(_T("%d"), m_VideoCapture.get_TVTuner_AudioFrequency()); m_edAudioFreq.SetWindowText(strFreq); ``` ```vb ' VB6 - Update frequency display fields with current values edVideoFreq.Text = CStr(VideoCapture1.TVTuner_VideoFrequency) edAudioFreq.Text = CStr(VideoCapture1.TVTuner_AudioFrequency) ``` ## Input and Mode Configuration ### Setting Signal Input Source Tuners may support multiple input sources that should be configurable: ```pascal // Select the appropriate input source based on current configuration cbTVInput.ItemIndex := 
cbTVInput.Items.IndexOf(VideoCapture1.TVTuner_InputType); ``` ```cpp // C++ MFC - Update input source selection in UI CString strInputType = m_VideoCapture.get_TVTuner_InputType(); m_cbTVInput.SetCurSel(m_cbTVInput.FindStringExact(-1, strInputType)); ``` ```vb ' VB6 - Set input source dropdown to match current configuration ' The intrinsic ComboBox has no find method, so search the list Dim i As Integer For i = 0 To cbTVInput.ListCount - 1 If cbTVInput.List(i) = VideoCapture1.TVTuner_InputType Then cbTVInput.ListIndex = i Next i ``` ### Configuring Operation Mode Different tuner modes require specific UI and parameter adjustments: ```pascal // Set operation mode dropdown to current mode (TV, FM Radio, etc.) cbTVMode.ItemIndex := cbTVMode.Items.IndexOf(VideoCapture1.TVTuner_Mode); ``` ```cpp // C++ MFC - Update mode selector to match current tuner configuration CString strMode = m_VideoCapture.get_TVTuner_Mode(); m_cbTVMode.SetCurSel(m_cbTVMode.FindStringExact(-1, strMode)); ``` ```vb ' VB6 - Select current operating mode by searching the dropdown list Dim i As Integer For i = 0 To cbTVMode.ListCount - 1 If cbTVMode.List(i) = VideoCapture1.TVTuner_Mode Then cbTVMode.ListIndex = i Next i ``` ### TV Format Configuration Set the appropriate broadcast standard for the region: ```pascal // Configure the appropriate TV standard (PAL, NTSC, SECAM, etc.) cbTVSystem.ItemIndex := cbTVSystem.Items.IndexOf(VideoCapture1.TVTuner_TVFormat); ``` ```cpp // C++ MFC - Set TV format dropdown to current broadcast standard CString strTVFormat = m_VideoCapture.get_TVTuner_TVFormat(); m_cbTVSystem.SetCurSel(m_cbTVSystem.FindStringExact(-1, strTVFormat)); ``` ```vb ' VB6 - Update TV system format selection by searching the list Dim i As Integer For i = 0 To cbTVSystem.ListCount - 1 If cbTVSystem.List(i) = VideoCapture1.TVTuner_TVFormat Then cbTVSystem.ListIndex = i Next i ``` ### Regional Settings Configure region-specific broadcast parameters: ```pascal // Set country/region for appropriate frequency tables and standards cbTVCountry.ItemIndex := cbTVCountry.Items.IndexOf(VideoCapture1.TVTuner_Country); ``` ```cpp // C++ MFC - Update country selection to match current setting CString strCountry = m_VideoCapture.get_TVTuner_Country(); m_cbTVCountry.SetCurSel(m_cbTVCountry.FindStringExact(-1,
strCountry)); ``` ```vb ' VB6 - Set country dropdown to current regional setting by searching the list Dim i As Integer For i = 0 To cbTVCountry.ListCount - 1 If cbTVCountry.List(i) = VideoCapture1.TVTuner_Country Then cbTVCountry.ListIndex = i Next i ``` ## Channel Scanning ### Handling Channel Scan Events Implement the event handler for the channel scanning process: ```pascal // Event handler for channel scanning process // Tracks progress and collects found channels procedure TForm1.VideoCapture1TVTunerTuneChannels(SignalPresent: Boolean; Channel, Frequency, Progress: Integer); begin // Update progress bar with current scan progress pbChannels.Position := Progress; // Add channel to list if signal is detected if SignalPresent then cbTVChannel.Items.Add(IntToStr(Channel)); // Scan complete when Channel = -1 if Channel = -1 then begin pbChannels.Position := 0; ShowMessage('AutoTune complete'); end; end; ``` ```cpp // C++ MFC - Channel scan event handler implementation // Event sink map - goes in the implementation file (.cpp); // declare DECLARE_EVENTSINK_MAP() in the class header (.h) BEGIN_EVENTSINK_MAP(CMainDlg, CDialog) ON_EVENT(CMainDlg, IDC_VIDEOCAPTURE, 1, OnTVTunerTuneChannels, VTS_BOOL VTS_I4 VTS_I4 VTS_I4) END_EVENTSINK_MAP() // Event handler implementation void CMainDlg::OnTVTunerTuneChannels(BOOL SignalPresent, long Channel, long Frequency, long Progress) { // Update scan progress indicator m_pbChannels.SetPos(Progress); // Add found channels to the selection list if (SignalPresent) { CString strChannel; strChannel.Format(_T("%d"), Channel); m_cbTVChannel.AddString(strChannel); } // Handle scan completion if (Channel == -1) { m_pbChannels.SetPos(0); MessageBox(_T("AutoTune complete"), _T("Information"), MB_OK | MB_ICONINFORMATION); } } ``` ```vb ' VB6 - Channel scan event handler Private Sub VideoCapture1_TVTunerTuneChannels(ByVal SignalPresent As Boolean, ByVal Channel As Long, ByVal Frequency As Long, ByVal Progress As Long) ' Update scan progress display pbChannels.Value = Progress ' Add channel to list when signal is found If SignalPresent Then cbTVChannel.AddItem CStr(Channel) End If ' Handle scan completion If Channel = -1 Then
pbChannels.Value = 0 MsgBox "AutoTune complete", vbInformation End If End Sub ``` ### Initiating Channel Scan Start the automatic channel scanning process: ```pascal // Define frequency constants for clarity const KHz = 1000; const MHz = 1000000; // Initialize tuner with current settings VideoCapture1.TVTuner_Read; // Clear previous channel list cbTVChannel.Items.Clear; // Configure special parameters for FM Radio scanning if ( (cbTVMode.ItemIndex <> -1) and (cbTVMode.Items[cbTVMode.ItemIndex] = 'FM Radio') ) then begin // Set frequency range for FM scanning (100-110MHz) VideoCapture1.TVTuner_FM_Tuning_StartFrequency := 100 * Mhz; VideoCapture1.TVTuner_FM_Tuning_StopFrequency := 110 * MHz; // Set 100KHz increments for FM scanning VideoCapture1.TVTuner_FM_Tuning_Step := 100 * KHz; end; // Begin automatic channel scan VideoCapture1.TVTuner_TuneChannels_Start; ``` ```cpp // C++ MFC - Initiate channel scan with appropriate parameters const int KHz = 1000; const int MHz = 1000000; // Update tuner configuration m_VideoCapture.TVTuner_Read(); // Reset channel list before scanning m_cbTVChannel.ResetContent(); // Configure FM-specific parameters if in radio mode CString strMode; m_cbTVMode.GetLBText(m_cbTVMode.GetCurSel(), strMode); if (strMode == _T("FM Radio")) { // Set FM scan range (100-110MHz) m_VideoCapture.put_TVTuner_FM_Tuning_StartFrequency(100 * MHz); m_VideoCapture.put_TVTuner_FM_Tuning_StopFrequency(110 * MHz); // Use 100KHz steps for FM scanning m_VideoCapture.put_TVTuner_FM_Tuning_Step(100 * KHz); } // Start the channel scanning process m_VideoCapture.TVTuner_TuneChannels_Start(); ``` ```vb ' VB6 - Begin channel scanning process Const KHz = 1000 Const MHz = 1000000 ' Read current tuner configuration VideoCapture1.TVTuner_Read ' Clear existing channel list cbTVChannel.Clear ' Special configuration for FM Radio scanning If (cbTVMode.ListIndex <> -1) And (cbTVMode.Text = "FM Radio") Then ' Set FM band scan parameters (100-110MHz) 
VideoCapture1.TVTuner_FM_Tuning_StartFrequency = 100 * MHz VideoCapture1.TVTuner_FM_Tuning_StopFrequency = 110 * MHz ' Use 100KHz step size for FM scanning VideoCapture1.TVTuner_FM_Tuning_Step = 100 * KHz End If ' Initiate automatic channel scan VideoCapture1.TVTuner_TuneChannels_Start ``` ## Manual Tuning Operations ### Setting Channel by Number Allow direct channel selection by number: ```pascal // Change to specified channel number VideoCapture1.TVTuner_Channel := StrToInt(edChannel.Text); // Apply tuning changes VideoCapture1.TVTuner_Apply; ``` ```cpp // C++ MFC - Set tuner to specified channel number CString strChannel; m_edChannel.GetWindowText(strChannel); m_VideoCapture.put_TVTuner_Channel(_ttoi(strChannel)); m_VideoCapture.TVTuner_Apply(); ``` ```vb ' VB6 - Tune to specific channel number VideoCapture1.TVTuner_Channel = CInt(edChannel.Text) VideoCapture1.TVTuner_Apply ``` ### Setting Radio Frequency Directly For FM radio, direct frequency tuning is often required: ```pascal // Set channel to -1 for frequency-based tuning VideoCapture1.TVTuner_Channel := -1; // must be -1 to use frequency // Set specific frequency from input field VideoCapture1.TVTuner_Frequency := StrToInt(edChannel.Text); // Apply frequency change VideoCapture1.TVTuner_Apply; ``` ```cpp // C++ MFC - Direct frequency tuning implementation CString strFrequency; m_edChannel.GetWindowText(strFrequency); // Set channel to -1 to enable frequency-based tuning m_VideoCapture.put_TVTuner_Channel(-1); // must be -1 to use frequency // Apply the specified frequency m_VideoCapture.put_TVTuner_Frequency(_ttoi(strFrequency)); m_VideoCapture.TVTuner_Apply(); ``` ```vb ' VB6 - Manual frequency tuning for radio VideoCapture1.TVTuner_Channel = -1 ' must be -1 to use frequency VideoCapture1.TVTuner_Frequency = CLng(edChannel.Text) ' CLng - Hz values exceed the Integer range VideoCapture1.TVTuner_Apply ``` ## Conclusion This guide covers the essential aspects of implementing FM radio and TV tuning functionality in your Delphi applications.
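As a quick sanity check on the FM scan parameters used in the examples above (100–110 MHz with a 100 kHz step), the number of increments the auto-tune pass makes is easy to compute. This is an illustrative helper, not an SDK call:

```cpp
// Illustrative helper (not part of the SDK): number of step
// increments an FM auto-tune pass covers for a range, all in Hz.
long FmScanSteps(long startHz, long stopHz, long stepHz)
{
    if (stepHz <= 0 || stopHz <= startHz)
        return 0; // degenerate configuration: nothing to scan
    return (stopHz - startHz) / stepHz;
}
```

For the 100–110 MHz range in 100 kHz steps this is 100 increments, which gives a rough sense of how long a scan takes and how the progress value advances.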
By following these examples, you can create robust tuning interfaces with proper channel scanning, frequency management, and signal detection. For optimal integration into your projects, remember to handle error conditions and provide appropriate user feedback during lengthy operations such as channel scanning. --- Please visit our [GitHub](https://github.com/visioforge/) page for additional code samples and implementation examples. ---END OF PAGE--- # Local File: .\delphi\videocapture\hardware-adjustments.md --- title: Camera Video Adjustments for Delphi Applications description: Master hardware video adjustments in Delphi applications - control brightness, contrast, saturation and more with TVFVideoCapture. This developer guide includes practical code examples and implementation strategies for building professional video capture solutions. sidebar_label: Hardware Video Adjustments --- # Implementing Hardware Video Adjustments in Delphi Applications ## Overview Modern video capture devices offer powerful hardware-level adjustments that can significantly enhance the quality of your video applications. By leveraging these capabilities in your Delphi applications, you can provide users with professional-grade video control features without complex software-based image processing. ## Supported Adjustment Types Most webcams and video capture devices support several adjustment parameters: - Brightness - Contrast - Saturation - Hue - Sharpness - Gamma - White balance - Gain ## Retrieving Available Adjustment Ranges Before setting adjustments, you'll need to determine what ranges are supported by the connected device. The `Video_CaptureDevice_VideoAdjust_GetRanges` method provides this information. 
### Delphi Implementation ```pascal // Retrieve the available range for brightness adjustment // Returns minimum, maximum, step size, default value, and auto-adjustment capability VideoCapture1.Video_CaptureDevice_VideoAdjust_GetRanges(adj_Brightness, minVal, maxVal, stepVal, defaultVal, autoVal); ``` ### C++ MFC Implementation ```cpp // C++ MFC implementation for getting brightness adjustment ranges // Store results in integer variables for UI configuration int min, max, step, default_value; BOOL auto_value; m_VideoCapture.Video_CaptureDevice_VideoAdjust_GetRanges( VF_VIDEOCAP_ADJ_BRIGHTNESS, &min, &max, &step, &default_value, &auto_value); ``` ### VB6 Implementation ```vb ' VB6 implementation for retrieving brightness adjustment parameters ' Use these values to configure slider controls and checkboxes ' Note: "Step" is a VB6 reserved word and cannot be used as a variable name Dim minVal As Integer, maxVal As Integer, stepVal As Integer, defaultVal As Integer Dim autoVal As Boolean VideoCapture1.Video_CaptureDevice_VideoAdjust_GetRanges adj_Brightness, minVal, maxVal, stepVal, defaultVal, autoVal ``` ## Setting Adjustment Values Once you've determined the available ranges, you can use the `Video_CaptureDevice_VideoAdjust_SetValue` method to apply specific settings to the video stream.
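Whatever UI you build on top of the reported ranges, the value you pass to `SetValue` should stay within the device's minimum/maximum and land on a multiple of its step. A plain C++ sketch of that clamping and snapping, independent of the SDK (the min/max/step figures here are hypothetical; real ones come from `GetRanges` at run time):

```cpp
#include <algorithm>

// Clamp a requested value to [minVal, maxVal] and snap it to the
// nearest multiple of `step` counted from minVal. Illustrative
// helper, not an SDK call; real bounds come from GetRanges.
int QuantizeAdjustment(int value, int minVal, int maxVal, int step)
{
    value = std::max(minVal, std::min(maxVal, value));
    const int snapped = minVal + ((value - minVal + step / 2) / step) * step;
    return std::max(minVal, std::min(maxVal, snapped));
}
```

Feeding pre-quantized values to the device avoids drivers silently rounding (or rejecting) out-of-step requests, so the UI slider and the actual hardware setting stay in agreement.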
### Delphi Implementation ```pascal // Set the brightness level based on trackbar position // The third parameter enables/disables automatic brightness adjustment VideoCapture1.Video_CaptureDevice_VideoAdjust_SetValue( adj_Brightness, tbAdjBrightness.Position, cbAdjBrightnessAuto.Checked); ``` ### C++ MFC Implementation ```cpp // C++ MFC implementation for setting brightness value // Uses slider position for manual adjustment value // Checkbox state determines if auto-adjustment is enabled m_VideoCapture.Video_CaptureDevice_VideoAdjust_SetValue( VF_VIDEOCAP_ADJ_BRIGHTNESS, m_sliderBrightness.GetPos(), m_checkBrightnessAuto.GetCheck() == BST_CHECKED); ``` ### VB6 Implementation ```vb ' VB6 implementation for applying brightness settings ' Uses trackbar value for adjustment level ' Checkbox value determines automatic adjustment mode VideoCapture1.Video_CaptureDevice_VideoAdjust_SetValue _ adj_Brightness, _ tbAdjBrightness.Value, _ cbAdjBrightnessAuto.Value = vbChecked ``` ## Best Practices for Video Adjustment Implementation When implementing video adjustments in your applications: 1. Always check device capabilities first, as not all devices support all adjustment types 2. Provide intuitive UI controls like sliders with appropriate min/max values 3. Include auto-adjustment options when available 4. Consider saving user preferences for future sessions 5. Implement real-time preview so users can see the effects of their adjustments ## Additional Resources Please contact our support team for assistance with implementing these features in your application. Visit our GitHub repository for additional code samples and implementation examples. ---END OF PAGE--- # Local File: .\delphi\videocapture\index.md --- title: TVFVideoCapture for Delphi Developers description: Professional video capture and processing library for Delphi developers. Create powerful applications with support for multiple devices, formats, and advanced video processing capabilities. 
Full ActiveX integration. sidebar_label: TVFVideoCapture --- # TVFVideoCapture Library for Delphi and ActiveX Development ## Introduction to Video Capture Technology The TVFVideoCapture library provides Delphi and ActiveX developers with a robust framework for implementing video and audio capture functionality in their applications. This powerful SDK enables seamless integration with a wide range of capture devices. Read the full information on the [product page](https://www.visioforge.com/all-in-one-media-framework). ## Development Resources For detailed implementation guidance, explore these essential resources: - [Complete Changelog and Version History](changelog.md) - [Installation and Configuration Guide](install/index.md) - [Deployment Best Practices](deployment.md) - [Licensing Information and EULA](../../eula.md) - [Comprehensive API Documentation](https://api.visioforge.com/delphi/video_capture_sdk/index.html) ## Implementation Tutorials ### Audio Recording and Processing Master audio capture with these step-by-step guides: - [MP3 Audio Capture Implementation](audio-capture-mp3.md) - Learn how to capture audio streams and encode them directly to MP3 format with configurable bitrates and quality settings. - [WAV Audio Recording with Compression Options](audio-capture-wav.md) - Implement high-quality WAV audio recording with optional compression codecs and format configurations. - [Configuring Audio Output Devices](audio-output.md) - Guide to selecting and configuring audio output devices for monitoring and playback in your applications. ### Video Capture and Device Control Learn essential video handling techniques: - [AVI Video Capture Implementation](video-capture-avi.md) - Develop applications that capture video streams to AVI format with customizable codecs and container settings.
- [DV Camcorder Control and Integration](dv-camcorder.md) - Connect and control DV camcorders through FireWire/IEEE-1394 with transport controls and metadata handling. - [Device Selection for Video and Audio Sources](video-audio-sources.md) - Techniques for enumerating, selecting, and managing multiple capture devices in your applications. - [Hardware Video Adjustment Parameters](hardware-adjustments.md) - Access and modify device-level parameters including brightness, contrast, saturation, and white balance. - [Video Input Configuration via Crossbar](video-input-crossbar.md) - Learn to configure video input routing through crossbar interfaces for multi-input capture devices. - [Video Renderer Selection and Configuration](video-renderer.md) - Choose and configure the optimal video rendering engine for your capture application. ### Advanced Media Techniques Explore sophisticated implementation scenarios: - [Custom Output Format Configuration](custom-output.md) - Create specialized output formats with custom compression settings and container configurations. - [FM Radio and TV Tuner Integration](fm-radio-tv-tuning.md) - Implement FM radio reception and TV channel tuning in applications with supported hardware. - [Network Streaming with WMV Format](network-streaming-wmv.md) - Stream captured video over networks using Windows Media Video format with bandwidth optimization. - [Resolution Management with Resize and Crop](resize-crop.md) - Process video frames with dynamic resizing and cropping to achieve custom output dimensions. - [Screen Capture Implementation](screen-capture.md) - Capture on-screen content with configurable frame rates and region selection capabilities. - [DV File Capture with Compression Options](video-capture-dv.md) - Save video directly to DV format or with recompression for optimized storage requirements. 
- [MPEG-2 Capture with TV Tuner Integration](mpeg2-capture.md) - Utilize hardware MPEG-2 encoders in TV tuners for efficient high-quality broadcast capture. - [Windows Media Video Capture with External Profiles](video-capture-wmv.md) - Implement Windows Media Video encoding with external profile configurations for optimized quality and size. ## Licensing Benefits Developers using this library receive significant advantages: - Royalty-free distribution for compiled applications - Regular updates with new features and optimizations - Priority technical support from development experts - Flexible licensing options for different project needs ---END OF PAGE--- # Local File: .\delphi\videocapture\mpeg2-capture.md --- title: Delphi MPEG-2 Video Capture with TV Tuner Hardware description: Master MPEG-2 video capture in Delphi applications using TV tuners with built-in hardware encoding. Our detailed guide covers device enumeration, format configuration, capture process implementation, and provides optimized code examples for professional media applications. Learn essential DirectShow techniques for high-quality video capture solutions. sidebar_label: MPEG-2 Capture with TV Tuner Hardware --- # MPEG-2 Video Capture in Delphi Using TV Tuner Hardware Encoders This comprehensive tutorial demonstrates how to implement high-quality MPEG-2 video capture functionality in your Delphi applications by leveraging TV tuners with built-in hardware encoding capabilities. Hardware encoding significantly reduces CPU usage while maintaining excellent video quality. ## Overview of MPEG-2 Hardware Encoding MPEG-2 hardware encoders provide superior performance compared to software-based encoding solutions. They're particularly useful for developing professional video capture applications that require efficient processing and high-quality output. ## Enumerating Available MPEG-2 Hardware Encoders The first step is to identify all available MPEG-2 hardware encoders in the system. 
This code demonstrates how to populate a dropdown with detected devices: ```pascal // List all available MPEG-2 hardware encoders in the system // This helps users select the appropriate encoding device VideoCapture1.Special_Filters_Fill; for I := 0 to VideoCapture1.Special_Filters_GetCount(SF_Hardware_Video_Encoder) - 1 do cbMPEGEncoder.Items.Add(VideoCapture1.Special_Filters_GetItem(SF_Hardware_Video_Encoder, i)); ``` ```cpp // C++ MFC implementation for MPEG-2 encoder enumeration // Populates a combobox with all detected hardware encoders m_VideoCapture.Special_Filters_Fill(); for (int i = 0; i < m_VideoCapture.Special_Filters_GetCount(SF_Hardware_Video_Encoder); i++) { CString encoderName = m_VideoCapture.Special_Filters_GetItem(SF_Hardware_Video_Encoder, i); m_cbMPEGEncoder.AddString(encoderName); } ``` ```vb ' VB6 implementation for finding hardware MPEG-2 encoders ' Lists all available encoders in a combobox control VideoCapture1.Special_Filters_Fill For i = 0 To VideoCapture1.Special_Filters_GetCount(SF_Hardware_Video_Encoder) - 1 cbMPEGEncoder.AddItem VideoCapture1.Special_Filters_GetItem(SF_Hardware_Video_Encoder, i) Next i ``` ## Selecting a Specific MPEG-2 Encoder After enumerating the available encoders, the next step is to select a specific encoder for use: ```pascal // Configure the component to use the selected MPEG-2 hardware encoder // This must be done before starting the capture process VideoCapture1.Video_CaptureDevice_InternalMPEGEncoder_Name := cbMPEGEncoder.Items[cbMPEGEncoder.ItemIndex]; ``` ```cpp // C++ MFC: Select and configure the chosen MPEG-2 hardware encoder // Retrieves the selected encoder name from the combobox int nIndex = m_cbMPEGEncoder.GetCurSel(); CString encoderName; m_cbMPEGEncoder.GetLBText(nIndex, encoderName); m_VideoCapture.Video_CaptureDevice_InternalMPEGEncoder_Name = encoderName; ``` ```vb ' VB6: Set the selected encoder as the active MPEG-2 hardware encoder ' Must be called before initializing the capture graph 
VideoCapture1.Video_CaptureDevice_InternalMPEGEncoder_Name = cbMPEGEncoder.List(cbMPEGEncoder.ListIndex) ``` ## Configuring DirectStream MPEG Format for Output To properly capture MPEG-2 encoded video, you need to set the appropriate output format: ```pascal // Set the output format to DirectStream MPEG // This enables proper handling of hardware-encoded MPEG-2 streams VideoCapture1.OutputFormat := Format_DirectStream_MPEG; ``` ```cpp // C++ MFC: Configure the output format for MPEG-2 encoded content // DirectStream MPEG format preserves the hardware-encoded stream m_VideoCapture.OutputFormat = Format_DirectStream_MPEG; ``` ```vb ' VB6: Set the proper output format for MPEG-2 hardware encoding ' DirectStream format ensures the encoded data is properly handled VideoCapture1.OutputFormat = Format_DirectStream_MPEG ``` ## Establishing Video Capture Mode Before starting the capture process, set the component to video capture mode: ```pascal // Configure the component for video capture operation // This prepares the internal DirectShow graph for recording VideoCapture1.Mode := Mode_Video_Capture; ``` ```cpp // C++ MFC: Set the component to video capture mode // Required before starting the MPEG-2 capture process m_VideoCapture.Mode = Mode_Video_Capture; ``` ```vb ' VB6: Set video capture mode before starting recording ' This initializes the appropriate DirectShow filters VideoCapture1.Mode = Mode_Video_Capture ``` ## Initiating the MPEG-2 Capture Process Finally, start the capture process to begin recording MPEG-2 video: ```pascal // Begin the video capture process with the configured settings // The component will now start recording to the specified output VideoCapture1.Start; ``` ```cpp // C++ MFC: Start the MPEG-2 video capture process // Recording begins with the previously configured settings m_VideoCapture.Start(); ``` ```vb ' VB6: Start the video capture with the current configuration ' The hardware encoder will now begin processing video data 
VideoCapture1.Start ``` ## Advanced MPEG-2 Capture Considerations When implementing MPEG-2 capture with hardware encoders, consider these additional factors: 1. Hardware encoders typically offer better performance than software-based solutions 2. Some TV tuners provide additional encoding parameters that can be customized 3. Buffer sizes may need adjustment for higher quality captures 4. Hardware encoders often handle video scaling and frame rate conversion internally ## Troubleshooting Common Issues If you encounter problems with MPEG-2 hardware encoding: 1. Verify that your TV tuner device supports hardware MPEG-2 encoding 2. Ensure proper driver installation for the capture device 3. Check that DirectX is properly installed and updated 4. Consider system resource availability, as some encoders require specific resources Please contact our dedicated support team for assistance with implementing this tutorial in your specific application. Visit our GitHub repository for additional code samples and implementation examples. ---END OF PAGE--- # Local File: .\delphi\videocapture\network-streaming-wmv.md --- title: Network WMV Streaming in Delphi Applications description: Master Windows Media Video network streaming in Delphi applications. Learn how to configure WMV profiles, manage client connections, set network ports, and implement robust video broadcasting functionality with our detailed implementation guide and code examples. sidebar_label: Network streaming using WMV format --- # WMV Network Streaming Implementation Guide ## Overview This guide demonstrates how to implement network-based video broadcasting using Windows Media Video (WMV) format in your Delphi applications. The techniques shown here allow you to stream video content over networks while simultaneously capturing and saving the video to a file for archival purposes. 
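Before choosing values for the maximum client count and the WMV profile bitrate, it helps to estimate the upstream bandwidth the streaming machine must sustain: each connected client receives its own copy of the encoded stream, so the budget is simply clients × bitrate. A plain C++ sketch (illustrative figures, not SDK calls):

```cpp
// Rough server-side bandwidth budget for a unicast WMV stream:
// every connected client receives an independent encoded copy.
long long RequiredUpstreamBps(int maxClients, long long streamBitrateBps)
{
    if (maxClients <= 0 || streamBitrateBps <= 0)
        return 0; // nothing to serve
    return static_cast<long long>(maxClients) * streamBitrateBps;
}
```

For example, twenty clients of a 500 kbit/s profile already require a sustained 10 Mbit/s of upload capacity, which is why bitrate and client limits should be chosen together.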
## Requirements Before implementing WMV network streaming, ensure that you have: - A supported video capture device connected to your system - Proper network access and permissions - A valid WMV profile file with encoder settings ## Implementation Steps ### Basic Configuration To enable WMV network streaming in your application, you'll need to configure several essential parameters: 1. Enable network streaming functionality 2. Specify a WMV profile file containing video encoding parameters 3. Set the maximum number of concurrent client connections 4. Define the network port for client connections ### Delphi Implementation Code ```pascal // Delphi code for configuring WMV network streaming // Enable the network streaming functionality VideoCapture1.Network_Streaming_Enabled := true; // Set the path to the WMV profile file containing encoder settings // This file defines video quality, bitrate, resolution, etc. VideoCapture1.Network_Streaming_WMV_Profile_FileName := edNetworkStreamingWMVProfile.Text; // Define maximum number of concurrent clients that can connect VideoCapture1.Network_Streaming_Maximum_Clients := StrToInt(edMaximumClients.Text); // Specify the network port that clients will use to connect VideoCapture1.Network_Streaming_Network_Port := StrToInt(edNetworkPort.Text); ``` ### C++ MFC Implementation ```cpp // C++ MFC implementation for WMV network streaming // Enable streaming functionality m_VideoCapture.SetNetwork_Streaming_Enabled(TRUE); // CWnd::GetWindowText fills a CString passed by reference CString strProfile, strMaxClients, strPort; // Set WMV profile path - contains encoding parameters edNetworkStreamingWMVProfile.GetWindowText(strProfile); m_VideoCapture.SetNetwork_Streaming_WMV_Profile_FileName(strProfile); // Define maximum concurrent client connections edMaximumClients.GetWindowText(strMaxClients); m_VideoCapture.SetNetwork_Streaming_Maximum_Clients(_ttoi(strMaxClients)); // Set the network port for client connections edNetworkPort.GetWindowText(strPort); m_VideoCapture.SetNetwork_Streaming_Network_Port(_ttoi(strPort)); ``` ### VB6 Implementation ```vb ' VB6 (ActiveX) implementation for WMV network
streaming ' Enable network streaming capabilities VideoCapture1.Network_Streaming_Enabled = True ' Set the profile file containing video encoder settings VideoCapture1.Network_Streaming_WMV_Profile_FileName = txtNetworkStreamingWMVProfile.Text ' Define maximum number of clients allowed to connect simultaneously VideoCapture1.Network_Streaming_Maximum_Clients = CInt(txtMaximumClients.Text) ' Specify the network port for client connections ' CLng - ports above 32767 overflow the VB6 Integer type VideoCapture1.Network_Streaming_Network_Port = CLng(txtNetworkPort.Text) ``` ## Client Connection Information After configuring the streaming parameters, your application can obtain the connection URL that clients will use to access the video stream: ```pascal // Get the URL that clients will use to connect to the stream // This URL can be shared with users who need to view the stream strStreamURL := VideoCapture1.Network_Streaming_URL; ``` This URL can be used with Windows Media Player or any other application that supports Windows Media streaming protocols. ## Best Practices For optimal streaming performance, consider the following recommendations: - Use appropriate bitrates based on your network capabilities - Monitor client connections to ensure system stability - Test your streaming configuration with various client applications - Consider network bandwidth limitations when setting quality parameters ## Troubleshooting If you encounter issues with your streaming implementation: - Verify network firewall settings allow traffic on your selected port - Ensure the WMV profile file exists and contains valid settings - Check that the maximum client count is appropriate for your server resources - Validate network connectivity between the server and potential clients --- Please get in touch with [support](https://support.visioforge.com/) if you have questions about this implementation. Visit our [GitHub](https://github.com/visioforge/) page for additional code samples and resources.
---END OF PAGE--- # Local File: .\delphi\videocapture\resize-crop.md --- title: Delphi Video Processing - Resize & Crop Tutorial description: Step-by-step guide for implementing video resizing and cropping in Delphi applications. Includes code examples for real-time video processing, aspect ratio handling, and performance optimization techniques for developers. sidebar_label: Resize and crop --- # Video Resizing and Cropping in Delphi TVFVideoCapture Video manipulation is a critical component of many modern applications. This guide provides detailed instructions for implementing real-time video resizing and cropping in your Delphi applications with minimal performance impact. ## Why Resize or Crop Video? Video resizing and cropping serve multiple purposes in development: - Optimize video for different display sizes - Reduce bandwidth requirements for streaming - Focus on specific regions of interest - Create uniform video dimensions across your application - Improve performance on resource-constrained devices ## Enabling Resize and Crop Functionality Before applying any transformations, you must enable the resize/crop functionality in the TVFVideoCapture component. ### Step 1: Enable the Feature ```pascal // Enable video resizing or cropping functionality VideoCapture1.Video_ResizeOrCrop_Enabled := true; ``` ```cpp // C++ MFC - Enable video transformation capabilities m_VideoCapture.SetVideo_ResizeOrCrop_Enabled(TRUE); ``` ```vb ' VB6 - Activate resize/crop features VideoCapture1.Video_ResizeOrCrop_Enabled = True ``` ## Video Resizing Implementation Resizing allows you to change the dimensions of your video stream while maintaining visual quality. 
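Resizing with aspect-ratio preservation (the letterbox option covered later in this page) reduces to simple geometry: scale the source to the largest size that fits the target box while keeping its proportions, then pad the remainder with black bars. A plain C++ sketch of that computation, independent of the SDK:

```cpp
struct FrameSize { int width; int height; };

// Largest size with the source aspect ratio that fits inside the
// target box; the leftover area is what letterboxing pads black.
FrameSize FitLetterbox(int srcW, int srcH, int boxW, int boxH)
{
    // Integer cross-multiplication avoids floating-point drift.
    if (static_cast<long long>(srcW) * boxH >= static_cast<long long>(boxW) * srcH) {
        // Source is relatively wider: width fills the box.
        return { boxW, boxW * srcH / srcW };
    }
    // Source is relatively taller: height fills the box.
    return { boxH * srcW / srcH, boxH };
}
```

For instance, a 1920×1080 source fitted into a 640×480 box scales to 640×360, leaving 60-pixel bars above and below; this is useful when sizing preview areas in your UI around the letterboxed output.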
### Setting New Dimensions ```pascal // Set the desired width and height for the resized video output VideoCapture1.Video_Resize_NewWidth := StrToInt(edResizeWidth.Text); VideoCapture1.Video_Resize_NewHeight := StrToInt(edResizeHeight.Text); ``` ```cpp // C++ MFC - Configure target dimensions for video resize m_VideoCapture.SetVideo_Resize_NewWidth(_ttoi(m_strResizeWidth)); m_VideoCapture.SetVideo_Resize_NewHeight(_ttoi(m_strResizeHeight)); ``` ```vb ' VB6 - Define new video dimensions VideoCapture1.Video_Resize_NewWidth = CInt(txtResizeWidth.Text) VideoCapture1.Video_Resize_NewHeight = CInt(txtResizeHeight.Text) ``` ### Handling Aspect Ratio Changes When resizing video, you can choose between preserving the original aspect ratio (letterbox) or stretching the content to fit the new dimensions. ```pascal // Letterbox mode adds black borders to preserve aspect ratio // When false, the video will stretch to fit the new dimensions VideoCapture1.Video_Resize_LetterBox := cbResizeLetterbox.Checked; ``` ```cpp // C++ MFC - Configure aspect ratio handling method m_VideoCapture.SetVideo_Resize_LetterBox(m_bResizeLetterbox); ``` ```vb ' VB6 - Set letterbox mode for aspect ratio preservation VideoCapture1.Video_Resize_LetterBox = chkResizeLetterbox.Value ``` ### Selecting Resize Algorithms Choose from multiple resize algorithms based on your quality requirements and performance constraints: ```pascal // Select the appropriate resize algorithm: // - NearestNeighbor: Fastest but lowest quality // - Bilinear: Good balance between speed and quality // - Bilinear_HQ: Enhanced bilinear with improved quality // - Bicubic: Better quality with moderate performance impact // - Bicubic_HQ: Highest quality with highest CPU usage case cbResizeMode.ItemIndex of 0: VideoCapture1.Video_Resize_Mode := rm_NearestNeighbor; 1: VideoCapture1.Video_Resize_Mode := rm_Bilinear; 2: VideoCapture1.Video_Resize_Mode := rm_Bilinear_HQ; 3: VideoCapture1.Video_Resize_Mode := rm_Bicubic; 4: 
VideoCapture1.Video_Resize_Mode := rm_Bicubic_HQ; end; ``` ```cpp // C++ MFC - Set the resize algorithm based on quality/performance needs switch(m_nResizeMode) { case 0: m_VideoCapture.SetVideo_Resize_Mode(rm_NearestNeighbor); break; // Fastest case 1: m_VideoCapture.SetVideo_Resize_Mode(rm_Bilinear); break; // Standard case 2: m_VideoCapture.SetVideo_Resize_Mode(rm_Bilinear_HQ); break; // Enhanced case 3: m_VideoCapture.SetVideo_Resize_Mode(rm_Bicubic); break; // High quality case 4: m_VideoCapture.SetVideo_Resize_Mode(rm_Bicubic_HQ); break; // Maximum quality } ``` ```vb ' VB6 - Choose resize algorithm based on quality and performance needs Select Case cboResizeMode.ListIndex Case 0: VideoCapture1.Video_Resize_Mode = rm_NearestNeighbor ' Fastest, lower quality Case 1: VideoCapture1.Video_Resize_Mode = rm_Bilinear ' Balanced option Case 2: VideoCapture1.Video_Resize_Mode = rm_Bilinear_HQ ' Enhanced bilinear Case 3: VideoCapture1.Video_Resize_Mode = rm_Bicubic ' Better quality Case 4: VideoCapture1.Video_Resize_Mode = rm_Bicubic_HQ ' Highest quality End Select ``` ## Video Cropping Implementation Cropping allows you to select a specific region of interest from your video stream. 
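Since the four crop values each measure pixels removed from one edge, the output dimensions follow directly from the source size. A plain C++ sketch (illustrative, not an SDK call) that also guards against the over-cropping problem flagged in the troubleshooting notes later in this page:

```cpp
struct FrameSize { int width; int height; };

// Output size after removing left/top/right/bottom pixel margins;
// returns {0, 0} when the margins would consume the whole frame.
FrameSize CroppedSize(int srcW, int srcH,
                      int left, int top, int right, int bottom)
{
    if (left + right >= srcW || top + bottom >= srcH)
        return { 0, 0 }; // invalid: crop exceeds the frame
    return { srcW - left - right, srcH - top - bottom };
}
```

Validating user-entered crop values this way before assigning them to the component prevents degenerate zero-size or negative-size output regions.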
### Step 1: Enable Cropping As with resizing, you must first enable the feature: ```pascal // Enable video transformation capabilities before applying crop VideoCapture1.Video_ResizeOrCrop_Enabled := true; ``` ```cpp // C++ MFC - Activate video manipulation features m_VideoCapture.SetVideo_ResizeOrCrop_Enabled(TRUE); ``` ```vb ' VB6 - Enable video transformation functionality VideoCapture1.Video_ResizeOrCrop_Enabled = True ``` ### Step 2: Define Crop Region Specify the boundaries of your crop region by defining the left, top, right, and bottom coordinates: ```pascal // Define the crop region coordinates in pixels // These values represent the distance from each edge of the original video VideoCapture1.Video_Crop_Left := StrToInt(edCropLeft.Text); VideoCapture1.Video_Crop_Top := StrToInt(edCropTop.Text); VideoCapture1.Video_Crop_Right := StrToInt(edCropRight.Text); VideoCapture1.Video_Crop_Bottom := StrToInt(edCropBottom.Text); ``` ```cpp // C++ MFC - Set the crop boundaries in pixels // Each value defines how many pixels to crop from the respective edge m_VideoCapture.SetVideo_Crop_Left(_ttoi(m_strCropLeft)); m_VideoCapture.SetVideo_Crop_Top(_ttoi(m_strCropTop)); m_VideoCapture.SetVideo_Crop_Right(_ttoi(m_strCropRight)); m_VideoCapture.SetVideo_Crop_Bottom(_ttoi(m_strCropBottom)); ``` ```vb ' VB6 - Configure crop region boundaries ' Values represent pixel counts from each edge to exclude VideoCapture1.Video_Crop_Left = CInt(txtCropLeft.Text) VideoCapture1.Video_Crop_Top = CInt(txtCropTop.Text) VideoCapture1.Video_Crop_Right = CInt(txtCropRight.Text) VideoCapture1.Video_Crop_Bottom = CInt(txtCropBottom.Text) ``` ## Best Practices for Video Manipulation For optimal results when implementing video resizing and cropping: 1. **Test on target hardware** - Different resize algorithms have varying CPU requirements 2. **Consider your use case** - For real-time applications, favor performance over quality 3. 
**Maintain aspect ratios** - Unless specifically needed, preserve original proportions 4. **Combine operations judiciously** - Applying both resize and crop increases processing overhead 5. **Cache settings** - Avoid changing parameters frequently during capture ## Troubleshooting Common Issues - If performance is poor, try a faster resize algorithm - Ensure crop values don't exceed the dimensions of your video stream - When using letterbox mode, account for the black borders in your UI design - For best results, resize to dimensions that are multiples of 8 or 16 --- For additional code samples and implementation examples, visit our [GitHub](https://github.com/visioforge/) repository. Need technical assistance? Contact our support team for personalized guidance. ---END OF PAGE--- # Local File: .\delphi\videocapture\screen-capture.md --- title: Screen Recording in Delphi Applications description: Master screen recording functionality in your Delphi applications with TVFVideoCapture. Learn to capture selected screen regions, record full screen content, customize frame rates, track cursor movements, and implement high-quality screen capture features with our detailed guide and code examples. sidebar_label: Screen Capture --- # Screen Recording Implementation in Delphi ## Introduction to Screen Capture Functionality TVFVideoCapture provides powerful screen recording capabilities for Delphi developers. This guide walks through the implementation of screen capture features in your applications, allowing you to record specific regions or the entire screen with customizable settings. ## Configuring Screen Capture Area You can precisely control which portion of the screen to record by setting coordinate parameters. This is particularly useful when you want to focus on specific application windows or screen regions. 
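As a defensive measure, user-entered coordinates can be clamped to the desktop bounds before they are assigned to the `Screen_Capture_*` properties covered in the next section. The helper below is an illustrative sketch with hypothetical names, not part of the SDK:

```cpp
#include <algorithm>

// Illustrative sketch (not an SDK API): normalize a user-entered capture
// rectangle so it always lies within the desktop and has non-negative size.
struct CaptureRect {
    int left, top, right, bottom;
};

CaptureRect ClampToScreen(CaptureRect r, int screenW, int screenH)
{
    r.left   = std::max(0, std::min(r.left, screenW));
    r.top    = std::max(0, std::min(r.top, screenH));
    // Keep right/bottom within the screen and never before left/top.
    r.right  = std::max(r.left, std::min(r.right, screenW));
    r.bottom = std::max(r.top, std::min(r.bottom, screenH));
    return r;
}
```

Clamping once, just before assigning the capture properties, keeps out-of-range values typed into the edit boxes from ever reaching the capture engine.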
### Setting Specific Screen Coordinates

Use these parameters to define the exact boundaries of your capture area:

```pascal
// Define the top edge position of the capture rectangle (in pixels)
VideoCapture1.Screen_Capture_Top := StrToInt(edScreenTop.Text);

// Define the bottom edge position of the capture rectangle (in pixels)
VideoCapture1.Screen_Capture_Bottom := StrToInt(edScreenBottom.Text);

// Define the left edge position of the capture rectangle (in pixels)
VideoCapture1.Screen_Capture_Left := StrToInt(edScreenLeft.Text);

// Define the right edge position of the capture rectangle (in pixels)
VideoCapture1.Screen_Capture_Right := StrToInt(edScreenRight.Text);
```

```cpp
// CWnd::GetWindowText fills a CString passed by reference,
// so read the control text into a buffer before converting it
CString strValue;

// Define the top edge position of the capture rectangle (in pixels)
m_edScreenTop.GetWindowText(strValue);
m_VideoCapture.SetScreen_Capture_Top(_ttoi(strValue));

// Define the bottom edge position of the capture rectangle (in pixels)
m_edScreenBottom.GetWindowText(strValue);
m_VideoCapture.SetScreen_Capture_Bottom(_ttoi(strValue));

// Define the left edge position of the capture rectangle (in pixels)
m_edScreenLeft.GetWindowText(strValue);
m_VideoCapture.SetScreen_Capture_Left(_ttoi(strValue));

// Define the right edge position of the capture rectangle (in pixels)
m_edScreenRight.GetWindowText(strValue);
m_VideoCapture.SetScreen_Capture_Right(_ttoi(strValue));
```

```vb
' Define the top edge position of the capture rectangle (in pixels)
VideoCapture1.Screen_Capture_Top = CInt(edScreenTop.Text)

' Define the bottom edge position of the capture rectangle (in pixels)
VideoCapture1.Screen_Capture_Bottom = CInt(edScreenBottom.Text)

' Define the left edge position of the capture rectangle (in pixels)
VideoCapture1.Screen_Capture_Left = CInt(edScreenLeft.Text)

' Define the right edge position of the capture rectangle (in pixels)
VideoCapture1.Screen_Capture_Right = CInt(edScreenRight.Text)
```

### Capturing the Full Screen

For complete screen recording, simply enable the full screen capture option:

```pascal
// Enable full screen capture mode - will record the entire display
VideoCapture1.Screen_Capture_FullScreen := true; ``` ```cpp // Enable full screen capture mode - will record the entire display m_VideoCapture.SetScreen_Capture_FullScreen(true); ``` ```vb ' Enable full screen capture mode - will record the entire display VideoCapture1.Screen_Capture_FullScreen = True ``` ## Optimizing Frame Rate Settings The frame rate directly impacts both the quality and file size of your screen recordings. Higher frame rates produce smoother video but generate larger files. ```pascal // Set capture frame rate to 10 frames per second // Adjust this value based on your performance requirements VideoCapture1.Screen_Capture_FrameRate := 10; ``` ```cpp // Set capture frame rate to 10 frames per second // Adjust this value based on your performance requirements m_VideoCapture.SetScreen_Capture_FrameRate(10); ``` ```vb ' Set capture frame rate to 10 frames per second ' Adjust this value based on your performance requirements VideoCapture1.Screen_Capture_FrameRate = 10 ``` ## Cursor Tracking Configuration For instructional videos or demonstrations, capturing the mouse cursor movement is essential: ```pascal // Enable mouse cursor capture in the recording // Set to false to hide cursor in the output video VideoCapture1.Screen_Capture_Grab_Mouse_Cursor := true; ``` ```cpp // Enable mouse cursor capture in the recording // Set to false to hide cursor in the output video m_VideoCapture.SetScreen_Capture_Grab_Mouse_Cursor(true); ``` ```vb ' Enable mouse cursor capture in the recording ' Set to false to hide cursor in the output video VideoCapture1.Screen_Capture_Grab_Mouse_Cursor = True ``` ## Activating Screen Capture Mode After configuring all settings, set the component to screen capture mode to begin recording: ```pascal // Set component to screen capture operational mode // This activates all screen recording functionality VideoCapture1.Mode := Mode_Screen_Capture; ``` ```cpp // Set component to screen capture operational mode // This activates all 
screen recording functionality m_VideoCapture.SetMode(Mode_Screen_Capture); ``` ```vb ' Set component to screen capture operational mode ' This activates all screen recording functionality VideoCapture1.Mode = Mode_Screen_Capture ``` ## Advanced Implementation Tips For optimal screen recording performance: - Consider system resources when selecting frame rates - Use region capture when possible to minimize processing load - Test different quality settings to balance file size and visual quality - Remember that cursor capture adds slight processing overhead --- For additional code samples and implementation examples, visit our [GitHub](https://github.com/visioforge/) repository. For technical assistance with implementation, please contact our [support team](https://support.visioforge.com/). ---END OF PAGE--- # Local File: .\delphi\videocapture\video-audio-sources.md --- title: Delphi Video Capture - Device Selection Guide description: Master video and audio device selection in Delphi applications with practical code examples. Learn to implement device listing, format configuration, frame rate settings, and audio input selection with step-by-step Delphi, C++ MFC, and VB6 code samples. sidebar_label: How to select video and audio capture devices? --- # Code sample - How to select video and audio capture devices? 
This page provides Delphi, C++ MFC, and VB6 sample code for enumerating and selecting video and audio capture devices.

## Select video source

### Get a list of available video capture devices

```pascal
for i := 0 to VideoCapture1.Video_CaptureDevices_GetCount - 1 do
  cbVideoInputDevice.Items.Add(VideoCapture1.Video_CaptureDevices_GetItem(i));
```

```cpp
// C++ MFC
for (int i = 0; i < m_VideoCapture.Video_CaptureDevices_GetCount(); i++)
  m_cbVideoInputDevice.AddString(m_VideoCapture.Video_CaptureDevices_GetItem(i));
```

```vb
' VB6
For i = 0 To VideoCapture1.Video_CaptureDevices_GetCount - 1
  cbVideoInputDevice.AddItem VideoCapture1.Video_CaptureDevices_GetItem(i)
Next i
```

### Select the video input device

```pascal
VideoCapture1.Video_CaptureDevice := cbVideoInputDevice.Items[cbVideoInputDevice.ItemIndex];
```

```cpp
// C++ MFC
CString strDevice;
m_cbVideoInputDevice.GetLBText(m_cbVideoInputDevice.GetCurSel(), strDevice);
m_VideoCapture.put_Video_CaptureDevice(strDevice);
```

```vb
' VB6
VideoCapture1.Video_CaptureDevice = cbVideoInputDevice.Text
```

### Get a list of available video formats

```pascal
VideoCapture1.Video_CaptureDevice_Formats_Fill;
for i := 0 to VideoCapture1.Video_CaptureDevice_Formats_GetCount - 1 do
  cbVideoInputFormat.Items.Add(VideoCapture1.Video_CaptureDevice_Formats_GetItem(i));
```

```cpp
// C++ MFC
m_VideoCapture.Video_CaptureDevice_Formats_Fill();
for (int i = 0; i < m_VideoCapture.Video_CaptureDevice_Formats_GetCount(); i++)
  m_cbVideoInputFormat.AddString(m_VideoCapture.Video_CaptureDevice_Formats_GetItem(i));
```

```vb
' VB6
VideoCapture1.Video_CaptureDevice_Formats_Fill
For i = 0 To VideoCapture1.Video_CaptureDevice_Formats_GetCount - 1
  cbVideoInputFormat.AddItem VideoCapture1.Video_CaptureDevice_Formats_GetItem(i)
Next i
```

### Select video format

```pascal
VideoCapture1.Video_CaptureFormat := cbVideoInputFormat.Items[cbVideoInputFormat.ItemIndex];
```

```cpp
// C++ MFC
CString strFormat;
m_cbVideoInputFormat.GetLBText(m_cbVideoInputFormat.GetCurSel(), strFormat);
m_VideoCapture.put_Video_CaptureFormat(strFormat);
```

```vb
' VB6
VideoCapture1.Video_CaptureFormat = cbVideoInputFormat.Text
```

or

### Automatically choose the best video format

```pascal
VideoCapture1.Video_CaptureFormat_UseBest := cbUseBestVideoInputFormat.Checked;
```

```cpp
// C++ MFC
m_VideoCapture.put_Video_CaptureFormat_UseBest(m_cbUseBestVideoInputFormat.GetCheck() == BST_CHECKED);
```

```vb
' VB6
VideoCapture1.Video_CaptureFormat_UseBest = cbUseBestVideoInputFormat.Value
```

### Get a list of available frame rates

```pascal
VideoCapture1.Video_CaptureDevice_FrameRates_Fill;
for i := 0 to VideoCapture1.Video_CaptureDevice_FrameRates_GetCount - 1 do
  cbFrameRate.Items.Add(VideoCapture1.Video_CaptureDevice_FrameRates_GetItem(i));
```

```cpp
// C++ MFC
m_VideoCapture.Video_CaptureDevice_FrameRates_Fill();
for (int i = 0; i < m_VideoCapture.Video_CaptureDevice_FrameRates_GetCount(); i++)
  m_cbFrameRate.AddString(m_VideoCapture.Video_CaptureDevice_FrameRates_GetItem(i));
```

```vb
' VB6
VideoCapture1.Video_CaptureDevice_FrameRates_Fill
For i = 0 To VideoCapture1.Video_CaptureDevice_FrameRates_GetCount - 1
  cbFrameRate.AddItem VideoCapture1.Video_CaptureDevice_FrameRates_GetItem(i)
Next i
```

### Select frame rate

```pascal
VideoCapture1.Video_FrameRate := StrToFloat(cbFrameRate.Items[cbFrameRate.ItemIndex]);
```

```cpp
// C++ MFC
CString strFrameRate;
m_cbFrameRate.GetLBText(m_cbFrameRate.GetCurSel(), strFrameRate);
// _tcstod works in both ANSI and Unicode builds
m_VideoCapture.put_Video_FrameRate(_tcstod(strFrameRate, NULL));
```

```vb
' VB6
VideoCapture1.Video_FrameRate = CDbl(cbFrameRate.Text)
```

If needed, select the required video input (configure the crossbar).
## Select audio source ### Use video capture device as audio source ```pascal VideoCapture1.Video_CaptureDevice_IsAudioSource := true; ``` ```cpp // C++ MFC m_VideoCapture.put_Video_CaptureDevice_IsAudioSource(true); ``` ```vb ' VB6 VideoCapture1.Video_CaptureDevice_IsAudioSource = True ``` or ### Get a list of available audio capture devices ```pascal for I := 0 to VideoCapture1.Audio_CaptureDevices_GetCount - 1 do cbAudioInputDevice.Items.Add(VideoCapture1.Audio_CaptureDevices_GetItem(i)); ``` ```cpp // C++ MFC for (int i = 0; i < m_VideoCapture.Audio_CaptureDevices_GetCount(); i++) m_cbAudioInputDevice.AddString(m_VideoCapture.Audio_CaptureDevices_GetItem(i)); ``` ```vb ' VB6 For i = 0 To VideoCapture1.Audio_CaptureDevices_GetCount - 1 cbAudioInputDevice.AddItem VideoCapture1.Audio_CaptureDevices_GetItem(i) Next i ``` ### Select the audio input device ```pascal VideoCapture1.Audio_CaptureDevice := cbAudioInputDevice.Items[cbAudioInputDevice.ItemIndex]; ``` ```cpp // C++ MFC CString strAudioDevice; m_cbAudioInputDevice.GetLBText(m_cbAudioInputDevice.GetCurSel(), strAudioDevice); m_VideoCapture.put_Audio_CaptureDevice(strAudioDevice); ``` ```vb ' VB6 VideoCapture1.Audio_CaptureDevice = cbAudioInputDevice.Text ``` ### Get a list of available audio formats ```pascal VideoCapture1.Audio_CaptureDevice_Formats_Fill; for I := 0 to VideoCapture1.Audio_CaptureDevice_Formats_GetCount - 1 do cbAudioInputFormat.Items.Add(VideoCapture1.Audio_CaptureDevice_Formats_GetItem(i)); ``` ```cpp // C++ MFC m_VideoCapture.Audio_CaptureDevice_Formats_Fill(); for (int i = 0; i < m_VideoCapture.Audio_CaptureDevice_Formats_GetCount(); i++) m_cbAudioInputFormat.AddString(m_VideoCapture.Audio_CaptureDevice_Formats_GetItem(i)); ``` ```vb ' VB6 VideoCapture1.Audio_CaptureDevice_Formats_Fill For i = 0 To VideoCapture1.Audio_CaptureDevice_Formats_GetCount - 1 cbAudioInputFormat.AddItem VideoCapture1.Audio_CaptureDevice_Formats_GetItem(i) Next i ``` ### Select the format ```pascal 
VideoCapture1.Audio_CaptureFormat := cbAudioInputFormat.Items[cbAudioInputFormat.ItemIndex]; ``` ```cpp // C++ MFC CString strAudioFormat; m_cbAudioInputFormat.GetLBText(m_cbAudioInputFormat.GetCurSel(), strAudioFormat); m_VideoCapture.put_Audio_CaptureFormat(strAudioFormat); ``` ```vb ' VB6 VideoCapture1.Audio_CaptureFormat = cbAudioInputFormat.Text ``` or ### Automatically choose the best audio format ```pascal VideoCapture1.Audio_CaptureFormat_UseBest := cbUseBestAudioInputFormat.Checked; ``` ```cpp // C++ MFC m_VideoCapture.put_Audio_CaptureFormat_UseBest(m_cbUseBestAudioInputFormat.GetCheck() == BST_CHECKED); ``` ```vb ' VB6 VideoCapture1.Audio_CaptureFormat_UseBest = cbUseBestAudioInputFormat.Value ``` ### Get a list of available audio inputs (lines) ```pascal VideoCapture1.Audio_CaptureDevice_Lines_Fill; for I := 0 to VideoCapture1.Audio_CaptureDevice_Lines_GetCount - 1 do cbAudioInputLine.Items.Add(VideoCapture1.Audio_CaptureDevice_Lines_GetItem(i)); ``` ```cpp // C++ MFC m_VideoCapture.Audio_CaptureDevice_Lines_Fill(); for (int i = 0; i < m_VideoCapture.Audio_CaptureDevice_Lines_GetCount(); i++) m_cbAudioInputLine.AddString(m_VideoCapture.Audio_CaptureDevice_Lines_GetItem(i)); ``` ```vb ' VB6 VideoCapture1.Audio_CaptureDevice_Lines_Fill For i = 0 To VideoCapture1.Audio_CaptureDevice_Lines_GetCount - 1 cbAudioInputLine.AddItem VideoCapture1.Audio_CaptureDevice_Lines_GetItem(i) Next i ``` ### Select audio input ```pascal VideoCapture1.Audio_CaptureLine := cbAudioInputLine.Items[cbAudioInputLine.ItemIndex]; ``` ```cpp // C++ MFC CString strAudioLine; m_cbAudioInputLine.GetLBText(m_cbAudioInputLine.GetCurSel(), strAudioLine); m_VideoCapture.put_Audio_CaptureLine(strAudioLine); ``` ```vb ' VB6 VideoCapture1.Audio_CaptureLine = cbAudioInputLine.Text ``` --- Please get in touch with [support](https://support.visioforge.com/) to get help with this tutorial. Visit our [GitHub](https://github.com/visioforge/) page to get more code samples. 
---END OF PAGE--- # Local File: .\delphi\videocapture\video-capture-avi.md --- title: Video Capture to AVI Files in Delphi Applications description: Learn how to implement video capture functionality to AVI files in your Delphi applications using TVFVideoCapture component. This guide covers codec selection, audio configuration, and complete implementation steps with code examples. sidebar_label: Video capture to AVI file --- # Complete Guide to Video Capture to AVI Files in Delphi When developing multimedia applications in Delphi, video capture functionality is often a critical requirement. This guide explores how to implement high-quality video capture to AVI files using the TVFVideoCapture component in Delphi applications. We'll cover everything from setting up codecs to configuring audio parameters and starting the capture process. ## Understanding AVI Video Capture in Delphi The TVFVideoCapture component provides a powerful and flexible way to capture video directly to AVI format in Delphi applications. AVI (Audio Video Interleave) remains a popular video container format due to its broad compatibility and reliability for recording purposes. When implementing video capture in your Delphi application, you'll need to consider several key aspects: 1. Selecting appropriate video and audio codecs 2. Configuring audio parameters 3. Setting the output format and capture mode 4. Managing the capture process This guide provides detailed explanations and code samples for each of these steps. ## Working with Video and Audio Codecs ### Retrieving Available Codecs Before capturing video, you'll need to populate your application with the available video and audio codecs. 
The TVFVideoCapture component makes this straightforward: ```pascal procedure TMyForm.PopulateCodecLists; var I: Integer; begin // Clear existing items cbVideoCodecs.Items.Clear; cbAudioCodecs.Items.Clear; // Populate video codecs for I := 0 to VideoCapture1.Video_Codecs_GetCount - 1 do cbVideoCodecs.Items.Add(VideoCapture1.Video_Codecs_GetItem(i)); // Populate audio codecs for I := 0 to VideoCapture1.Audio_Codecs_GetCount - 1 do cbAudioCodecs.Items.Add(VideoCapture1.Audio_Codecs_GetItem(i)); end; ``` For developers using C++ MFC, the equivalent code would be: ```cpp void CMyDialog::PopulateCodecLists() { // Clear existing items m_VideoCodecsCombo.ResetContent(); m_AudioCodecsCombo.ResetContent(); // Populate video codecs for (int i = 0; i < m_VideoCapture.Video_Codecs_GetCount(); i++) { CString codecName = m_VideoCapture.Video_Codecs_GetItem(i); m_VideoCodecsCombo.AddString(codecName); } // Populate audio codecs for (int i = 0; i < m_VideoCapture.Audio_Codecs_GetCount(); i++) { CString codecName = m_VideoCapture.Audio_Codecs_GetItem(i); m_AudioCodecsCombo.AddString(codecName); } } ``` For VB6 developers, here's how to implement the same functionality: ```vb Private Sub PopulateCodecLists() ' Clear existing items cboVideoCodecs.Clear cboAudioCodecs.Clear ' Populate video codecs Dim i As Integer For i = 0 To VideoCapture1.Video_Codecs_GetCount - 1 cboVideoCodecs.AddItem VideoCapture1.Video_Codecs_GetItem(i) Next i ' Populate audio codecs For i = 0 To VideoCapture1.Audio_Codecs_GetCount - 1 cboAudioCodecs.AddItem VideoCapture1.Audio_Codecs_GetItem(i) Next i End Sub ``` ### Selecting Codecs for Capture Once you've populated the lists, you'll need to let users select their preferred codecs and apply those selections to the capture component: ```pascal procedure TMyForm.ApplyCodecSelections; begin if cbVideoCodecs.ItemIndex >= 0 then VideoCapture1.Video_Codec := cbVideoCodecs.Items[cbVideoCodecs.ItemIndex]; if cbAudioCodecs.ItemIndex >= 0 then VideoCapture1.Audio_Codec 
:= cbAudioCodecs.Items[cbAudioCodecs.ItemIndex]; end; ``` C++ MFC implementation: ```cpp void CMyDialog::ApplyCodecSelections() { int videoIndex = m_VideoCodecsCombo.GetCurSel(); if (videoIndex >= 0) { CString videoCodec; m_VideoCodecsCombo.GetLBText(videoIndex, videoCodec); m_VideoCapture.Video_Codec = videoCodec; } int audioIndex = m_AudioCodecsCombo.GetCurSel(); if (audioIndex >= 0) { CString audioCodec; m_AudioCodecsCombo.GetLBText(audioIndex, audioCodec); m_VideoCapture.Audio_Codec = audioCodec; } } ``` VB6 implementation: ```vb Private Sub ApplyCodecSelections() If cboVideoCodecs.ListIndex >= 0 Then VideoCapture1.Video_Codec = cboVideoCodecs.Text End If If cboAudioCodecs.ListIndex >= 0 Then VideoCapture1.Audio_Codec = cboAudioCodecs.Text End If End Sub ``` ## Configuring Audio Parameters Quality audio capture requires proper configuration of three key parameters: 1. **Audio Channels**: Typically 1 (mono) or 2 (stereo) 2. **Bits Per Sample (BPS)**: Common values include 8, 16, or 24 bits 3. 
**Sample Rate**: Standard rates include 44100 Hz (CD quality) or 48000 Hz Here's how to apply these settings in Delphi: ```pascal procedure TMyForm.ConfigureAudioSettings; begin // Apply audio channel configuration (mono/stereo) VideoCapture1.Audio_Channels := StrToInt(cbChannels.Items[cbChannels.ItemIndex]); // Set bits per sample for audio quality VideoCapture1.Audio_BPS := StrToInt(cbBPS.Items[cbBPS.ItemIndex]); // Configure sample rate VideoCapture1.Audio_SampleRate := StrToInt(cbSampleRate.Items[cbSampleRate.ItemIndex]); end; ``` C++ MFC implementation: ```cpp void CMyDialog::ConfigureAudioSettings() { CString channelStr, bpsStr, sampleRateStr; // Get selected values from combo boxes m_ChannelsCombo.GetLBText(m_ChannelsCombo.GetCurSel(), channelStr); m_BpsCombo.GetLBText(m_BpsCombo.GetCurSel(), bpsStr); m_SampleRateCombo.GetLBText(m_SampleRateCombo.GetCurSel(), sampleRateStr); // Apply audio channel configuration m_VideoCapture.Audio_Channels = _ttoi(channelStr); // Set bits per sample m_VideoCapture.Audio_BPS = _ttoi(bpsStr); // Configure sample rate m_VideoCapture.Audio_SampleRate = _ttoi(sampleRateStr); } ``` VB6 implementation: ```vb Private Sub ConfigureAudioSettings() ' Apply audio channel configuration VideoCapture1.Audio_Channels = CInt(cboChannels.Text) ' Set bits per sample VideoCapture1.Audio_BPS = CInt(cboBPS.Text) ' Configure sample rate VideoCapture1.Audio_SampleRate = CInt(cboSampleRate.Text) End Sub ``` ## Configuring Output Format and Capture Mode The next step is to configure the output format as AVI and set the appropriate capture mode: ```pascal procedure TMyForm.PrepareForCapture; begin // Set AVI as the output format VideoCapture1.OutputFormat := Format_AVI; // Configure video capture mode VideoCapture1.Mode := Mode_Video_Capture; end; ``` C++ MFC implementation: ```cpp void CMyDialog::PrepareForCapture() { // Set AVI as the output format m_VideoCapture.OutputFormat = Format_AVI; // Configure video capture mode m_VideoCapture.Mode = 
Mode_Video_Capture; } ``` VB6 implementation: ```vb Private Sub PrepareForCapture() ' Set AVI as the output format VideoCapture1.OutputFormat = Format_AVI ' Configure video capture mode VideoCapture1.Mode = Mode_Video_Capture End Sub ``` ## Starting and Managing the Capture Process Once everything is configured, you can start the capture process: ```pascal procedure TMyForm.StartCapture; begin try // Set output filename VideoCapture1.Output := ExtractFilePath(Application.ExeName) + 'CapturedVideo.avi'; // Begin capture process VideoCapture1.Start; // Update UI to show capture in progress btnStart.Enabled := False; btnStop.Enabled := True; lblStatus.Caption := 'Recording...'; except on E: Exception do ShowMessage('Error starting capture: ' + E.Message); end; end; ``` C++ MFC implementation: ```cpp void CMyDialog::StartCapture() { try { TCHAR appPath[MAX_PATH]; GetModuleFileName(NULL, appPath, MAX_PATH); CString appDir = appPath; int pos = appDir.ReverseFind('\\'); if (pos != -1) { appDir = appDir.Left(pos + 1); } // Set output filename m_VideoCapture.Output = appDir + _T("CapturedVideo.avi"); // Begin capture process m_VideoCapture.Start(); // Update UI GetDlgItem(IDC_START_BUTTON)->EnableWindow(FALSE); GetDlgItem(IDC_STOP_BUTTON)->EnableWindow(TRUE); SetDlgItemText(IDC_STATUS_STATIC, _T("Recording...")); } catch (COleDispatchException* e) { CString errorMsg = _T("Error starting capture: "); errorMsg += e->m_strDescription; MessageBox(errorMsg, _T("Error"), MB_ICONERROR); e->Delete(); } } ``` VB6 implementation: ```vb Private Sub StartCapture() On Error GoTo ErrorHandler ' Set output filename VideoCapture1.Output = App.Path & "\CapturedVideo.avi" ' Begin capture process VideoCapture1.Start ' Update UI btnStart.Enabled = False btnStop.Enabled = True lblStatus.Caption = "Recording..." 
Exit Sub ErrorHandler: MsgBox "Error starting capture: " & Err.Description, vbExclamation End Sub ``` ## Handling Capture Completion It's important to provide functionality to stop the capture process: ```pascal procedure TMyForm.StopCapture; begin try // Stop the capture process VideoCapture1.Stop; // Update UI btnStart.Enabled := True; btnStop.Enabled := False; lblStatus.Caption := 'Capture completed'; // Optionally open the captured file if FileExists(VideoCapture1.Output) and (MessageDlg('Open captured video?', mtConfirmation, [mbYes, mbNo], 0) = mrYes) then ShellExecute(0, 'open', PChar(VideoCapture1.Output), nil, nil, SW_SHOW); except on E: Exception do ShowMessage('Error stopping capture: ' + E.Message); end; end; ``` C++ MFC implementation: ```cpp void CMyDialog::StopCapture() { try { // Stop the capture process m_VideoCapture.Stop(); // Update UI GetDlgItem(IDC_START_BUTTON)->EnableWindow(TRUE); GetDlgItem(IDC_STOP_BUTTON)->EnableWindow(FALSE); SetDlgItemText(IDC_STATUS_STATIC, _T("Capture completed")); // Optionally open the captured file CString outputFile = m_VideoCapture.Output; if (PathFileExists(outputFile) && MessageBox(_T("Open captured video?"), _T("Confirmation"), MB_YESNO | MB_ICONQUESTION) == IDYES) { ShellExecute(NULL, _T("open"), outputFile, NULL, NULL, SW_SHOW); } } catch (COleDispatchException* e) { CString errorMsg = _T("Error stopping capture: "); errorMsg += e->m_strDescription; MessageBox(errorMsg, _T("Error"), MB_ICONERROR); e->Delete(); } } ``` VB6 implementation: ```vb Private Sub StopCapture() On Error GoTo ErrorHandler ' Stop the capture process VideoCapture1.Stop ' Update UI btnStart.Enabled = True btnStop.Enabled = False lblStatus.Caption = "Capture completed" ' Optionally open the captured file If Dir(VideoCapture1.Output) <> "" Then If MsgBox("Open captured video?", vbQuestion + vbYesNo) = vbYes Then Shell "explorer.exe """ & VideoCapture1.Output & """", vbNormalFocus End If End If Exit Sub ErrorHandler: MsgBox "Error stopping 
capture: " & Err.Description, vbExclamation End Sub ``` ## Conclusion Implementing video capture to AVI files in Delphi applications using the TVFVideoCapture component is a straightforward process when you understand the key concepts. By following this guide, you can create robust multimedia applications with professional video capture functionality. The TVFVideoCapture component provides a wide range of additional features and customization options beyond what's covered in this guide, including video effects, overlays, and device property configuration. Remember to test your video capture implementation thoroughly with different codecs and audio configurations to ensure the best quality for your specific use case. --- For additional code samples and implementation guidance, visit our GitHub repository. If you need further assistance with this tutorial, our support team is available to help. ---END OF PAGE--- # Local File: .\delphi\videocapture\video-capture-dv.md --- title: Delphi Video Capture to DV File Format Guide description: Complete guide for Delphi developers on implementing video capture functionality to DV file format with or without compression. Learn step-by-step implementation techniques with working code examples for professional video applications. sidebar_label: Video capture to DV file --- # Video Capture to DV File Format: Implementation Guide Digital Video (DV) remains a reliable format for video capture applications, particularly when working with legacy systems or specific professional requirements. This guide explores how to implement DV video capture functionality in your Delphi applications, with additional C++ MFC and VB6 examples for cross-platform reference. 
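Because DV has a fixed data rate (roughly 13 GB per hour of footage), disk requirements can be estimated before capture begins. The helper below is a back-of-the-envelope sketch; the constant is an approximation, not an SDK value:

```cpp
// Rough disk-space estimate for DV capture. DV's fixed data rate works out
// to roughly 13 GB per hour of footage; this constant is an approximation.
double EstimateDvGigabytes(double captureMinutes)
{
    const double kGbPerHour = 13.0;
    return captureMinutes / 60.0 * kGbPerHour;
}
```

Checking the estimate against free disk space before calling `Start` is a simple way to avoid mid-capture disk exhaustion.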
## Understanding DV Format Options DV format offers several advantages for video capture applications: - Consistent quality with minimal generation loss - Efficient storage for professional video content - Support for both PAL and NTSC standards - Compatibility with professional video editing software - Reliable audio synchronization When implementing DV video capture, developers have two primary approaches: 1. **Direct Stream Capture** - Raw DV data without recompression 2. **Recompressed DV** - Processed video with customizable settings Each approach serves different use cases depending on your application requirements. ## Direct Stream Capture Implementation Direct stream capture provides the highest quality by avoiding any recompression of the video signal. This method is ideal for archival purposes and professional video production where maintaining the original signal integrity is crucial. ### Configuring DV Type Settings The first step in implementing direct stream capture is setting the appropriate DV type configuration: #### Delphi ```pascal VideoCapture1.DV_Capture_Type2 := rbDVType2.Checked; ``` #### C++ MFC ```cpp m_videoCapture.SetDVCaptureType2(m_rbDVType2.GetCheck() == BST_CHECKED); ``` #### VB6 ```vb VideoCapture1.DV_Capture_Type2 = rbDVType2.Value ``` The DV Type setting determines the specific format variation used for capture. Most modern applications use Type 2, which offers better compatibility with editing software. ### Setting Output Format for Direct Stream For direct stream capture, you must specify the DirectStream_DV format: #### Delphi ```pascal VideoCapture1.OutputFormat := Format_DirectStream_DV; ``` #### C++ MFC ```cpp m_videoCapture.SetOutputFormat(FORMAT_DIRECTSTREAM_DV); ``` #### VB6 ```vb VideoCapture1.OutputFormat = FORMAT_DIRECTSTREAM_DV ``` This ensures the video data is stored without additional processing or compression. 
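A practical consequence of the fixed DV data rate: on volumes with a 4 GB file-size cap (such as FAT32), a single direct-stream DV file fills up in well under twenty minutes. The quick calculation below is illustrative; the 13 GB/hour figure is an approximation of the DV data rate, not an SDK constant:

```cpp
// Illustrative calculation: how many minutes of DV footage fit under a
// given file-size cap, assuming roughly 13 GB of data per hour.
double MaxMinutesUnderCap(double capGigabytes, double gbPerHour = 13.0)
{
    return capGigabytes / gbPerHour * 60.0;
}
```

This is why enforcing a capture-time limit (or splitting output files) matters for long direct-stream recordings on such file systems.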
### Configuring Capture Mode Next, set the component to video capture mode: #### Delphi ```pascal VideoCapture1.Mode := Mode_Video_Capture; ``` #### C++ MFC ```cpp m_videoCapture.SetMode(MODE_VIDEO_CAPTURE); ``` #### VB6 ```vb VideoCapture1.Mode = MODE_VIDEO_CAPTURE ``` This prepares the component for continuous video acquisition rather than single-frame capture. ### Initiating Direct Stream Capture With all settings in place, you can begin the capture process: #### Delphi ```pascal VideoCapture1.Start; ``` #### C++ MFC ```cpp m_videoCapture.Start(); ``` #### VB6 ```vb VideoCapture1.Start ``` The component will now capture the video stream directly to the specified output location in DV format. ## Implementing DV Capture with Recompression In some scenarios, you may need to modify the DV stream during capture. This approach allows for customization of audio parameters and video format standards. ### Configuring Audio Parameters DV format supports multiple audio configurations. Set the channels and sample rate to match your requirements: #### Delphi ```pascal VideoCapture1.DV_Capture_Audio_Channels := StrToInt(cbDVChannels.Items[cbDVChannels.ItemIndex]); VideoCapture1.DV_Capture_Audio_SampleRate := StrToInt(cbDVSampleRate.Items[cbDVSampleRate.ItemIndex]); ``` #### C++ MFC ```cpp CString channelStr, sampleRateStr; m_cbDVChannels.GetLBText(m_cbDVChannels.GetCurSel(), channelStr); m_cbDVSampleRate.GetLBText(m_cbDVSampleRate.GetCurSel(), sampleRateStr); m_videoCapture.SetDVCaptureAudioChannels(_ttoi(channelStr)); m_videoCapture.SetDVCaptureAudioSampleRate(_ttoi(sampleRateStr)); ``` #### VB6 ```vb VideoCapture1.DV_Capture_Audio_Channels = CInt(cbDVChannels.List(cbDVChannels.ListIndex)) VideoCapture1.DV_Capture_Audio_SampleRate = CInt(cbDVSampleRate.List(cbDVSampleRate.ListIndex)) ``` Standard DV audio options include: - Channels: 1 (mono) or 2 (stereo) - Sample rates: 32000 Hz, 44100 Hz, or 48000 Hz ### Setting Video Format Standard DV supports both PAL and NTSC 
standards. Select the appropriate standard for your target region:

#### Delphi

```pascal
if rbDVPAL.Checked then
  VideoCapture1.DV_Capture_Video_Format := DVF_PAL
else
  VideoCapture1.DV_Capture_Video_Format := DVF_NTSC;
```

#### C++ MFC

```cpp
if (m_rbDVPAL.GetCheck() == BST_CHECKED)
    m_videoCapture.SetDVCaptureVideoFormat(DVF_PAL);
else
    m_videoCapture.SetDVCaptureVideoFormat(DVF_NTSC);
```

#### VB6

```vb
If rbDVPAL.Value Then
    VideoCapture1.DV_Capture_Video_Format = DVF_PAL
Else
    VideoCapture1.DV_Capture_Video_Format = DVF_NTSC
End If
```

Remember that:

- PAL: 720×576 resolution at 25 fps (used in Europe, Australia, parts of Asia)
- NTSC: 720×480 resolution at 29.97 fps (used in North America, Japan, parts of South America)

### DV Type Selection

As with direct streaming, specify the DV type for recompressed capture:

#### Delphi

```pascal
VideoCapture1.DV_Capture_Type2 := rbDVType2.Checked;
```

#### C++ MFC

```cpp
m_videoCapture.SetDVCaptureType2(m_rbDVType2.GetCheck() == BST_CHECKED);
```

#### VB6

```vb
VideoCapture1.DV_Capture_Type2 = rbDVType2.Value
```

### Setting Output Format for Recompression

For recompressed DV capture, specify the DV format rather than DirectStream_DV:

#### Delphi

```pascal
VideoCapture1.OutputFormat := Format_DV;
VideoCapture1.Mode := Mode_Video_Capture;
```

#### C++ MFC

```cpp
m_videoCapture.SetOutputFormat(FORMAT_DV);
m_videoCapture.SetMode(MODE_VIDEO_CAPTURE);
```

#### VB6

```vb
VideoCapture1.OutputFormat = FORMAT_DV
VideoCapture1.Mode = MODE_VIDEO_CAPTURE
```

This tells the component to process the stream through the DV codec during capture.

### Starting Recompressed Capture

With all parameters configured, begin the capture process:

#### Delphi

```pascal
VideoCapture1.Start;
```

#### C++ MFC

```cpp
m_videoCapture.Start();
```

#### VB6

```vb
VideoCapture1.Start
```

## Best Practices for DV Capture Implementation

When implementing DV capture in your applications, consider these recommendations:

1. **Pre-allocate sufficient disk space** - DV format requires approximately 13 GB per hour of footage
2. **Implement capture time limits** - DV files have a 4 GB size limit on some file systems
3. **Monitor system resources** - DV capture requires consistent CPU and disk performance
4. **Provide format selection UI** - Let users choose between direct stream and recompressed options
5. **Test with various camera models** - DV implementation can vary between manufacturers

## Error Handling Considerations

Robust DV capture implementations should include error handling for these common scenarios:

- Device disconnection during capture
- Disk space exhaustion
- Buffer overrun conditions
- Invalid format settings
- Codec compatibility issues

## Conclusion

Implementing DV video capture in your Delphi, C++ MFC, or VB6 applications provides a solid foundation for professional video acquisition workflows. Whether you choose direct stream capture for maximum quality or recompressed capture for additional flexibility, the DV format offers reliable performance for specialized video applications.

By following the implementation examples in this guide, you can integrate professional-grade video capture capabilities into your custom software solutions.

---

Need additional assistance with your video capture implementation? Visit our [GitHub](https://github.com/visioforge/) page for more code samples or contact our [support team](https://support.visioforge.com/) for personalized guidance.

---END OF PAGE---

# Local File: .\delphi\videocapture\video-capture-wmv.md

---
title: Video Capture to WMV - Implementation Guide
description: Learn how to implement video capture functionality to Windows Media Video (WMV) files in your applications. This step-by-step guide covers external profile selection, output format configuration, and capture execution for Delphi, C++ MFC, and VB6 platforms.
sidebar_label: Video capture to WMV file
---

# Video Capture to Windows Media Video (WMV) Using External Profiles

## Introduction

Capturing video to Windows Media Video (WMV) format is a common requirement in many software applications. This guide provides a detailed walkthrough of implementing video capture functionality using external WMV profiles in Delphi, C++ MFC, and VB6 applications. The WMV format remains popular due to its compatibility with Windows platforms and efficient compression algorithms that balance quality and file size.

## Understanding WMV and External Profiles

Windows Media Video (WMV) is a compressed video file format developed by Microsoft as part of the Windows Media framework. When capturing video to WMV format, using external profiles allows for greater flexibility and customization of the output.

External profiles contain pre-configured settings that define:

- Video resolution
- Bitrate
- Frame rate
- Compression quality
- Audio settings
- Other encoding parameters

By leveraging external profiles, developers can quickly implement different quality presets without having to manually configure each parameter in code.

## Implementation Steps

### Step 1: Setting Up Your Environment

Before implementing video capture functionality, ensure your development environment is properly configured:

1. Install the necessary video capture component
2. Add the component reference to your project
3. Design your user interface to include:
   - A file selector for choosing the WMV profile
   - Output file location selector
   - Video capture preview window
   - Start/Stop capture controls

### Step 2: Selecting a WMV Profile

The first step in the implementation is to specify which WMV profile to use for encoding. This profile contains all the encoding parameters that will be applied to the captured video.
#### Delphi

```pascal
VideoCapture1.WMV_Profile_Filename := 'profile.prx';
```

#### C++ MFC

```cpp
m_videoCapture.SetWMVProfileFilename(_T("profile.prx"));
```

#### VB6

```vb
VideoCapture1.WMV_Profile_Filename = "profile.prx"
```

### Step 3: Configuring the Output Format

Once the profile is selected, you need to configure the component to use WMV as the output format. This tells the capture component which encoder to use for processing the video stream.

#### Delphi

```pascal
VideoCapture1.OutputFormat := Format_WMV;
```

#### C++ MFC

```cpp
m_videoCapture.SetOutputFormat(FORMAT_WMV);
```

#### VB6

```vb
VideoCapture1.OutputFormat = FORMAT_WMV
```

### Step 4: Setting the Capture Mode

The capture component can operate in various modes, so it's important to explicitly set it to video capture mode.

#### Delphi

```pascal
VideoCapture1.Mode := Mode_Video_Capture;
```

#### C++ MFC

```cpp
m_videoCapture.SetMode(MODE_VIDEO_CAPTURE);
```

#### VB6

```vb
VideoCapture1.Mode = MODE_VIDEO_CAPTURE
```

This ensures that the component is configured for continuous video recording rather than other modes like snapshot capture or streaming.

### Step 5: Starting the Video Capture

With all the configuration in place, the final step is to start the actual capture process.

#### Delphi

```pascal
VideoCapture1.Start;
```

#### C++ MFC

```cpp
m_videoCapture.Start();
```

#### VB6

```vb
VideoCapture1.Start
```

This command begins the capture process using all the previously configured settings.
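Steps 2 through 5 can be combined into a single Delphi handler. The sketch below uses only the properties shown in this guide; the form class, the `edProfile`/`edOutput` edit controls, and the button name are hypothetical placeholders for your own UI:

```pascal
// Hypothetical "Start" button handler combining steps 2-5.
procedure TMainForm.btnStartClick(Sender: TObject);
begin
  // Step 2: external profile chosen by the user (e.g. via a file dialog)
  VideoCapture1.WMV_Profile_Filename := edProfile.Text;
  // Destination file for the recording
  VideoCapture1.Output_Filename := edOutput.Text;
  // Step 3: encode through the WMV encoder
  VideoCapture1.OutputFormat := Format_WMV;
  // Step 4: continuous video recording mode
  VideoCapture1.Mode := Mode_Video_Capture;
  // Step 5: begin capturing
  VideoCapture1.Start;
end;
```

Keeping all the configuration in one place makes it easier to validate user input (profile path, output folder) before the capture begins.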
## Advanced Configuration Options

### Custom Output File Naming

You can implement custom file naming for your captured video files:

#### Delphi

```pascal
VideoCapture1.Output_Filename := 'C:\Captures\Video_' + FormatDateTime('yyyymmdd_hhnnss', Now) + '.wmv';
```

#### C++ MFC

```cpp
CTime currentTime = CTime::GetCurrentTime();
CString fileName;
fileName.Format(_T("C:\\Captures\\Video_%04d%02d%02d_%02d%02d%02d.wmv"),
    currentTime.GetYear(), currentTime.GetMonth(), currentTime.GetDay(),
    currentTime.GetHour(), currentTime.GetMinute(), currentTime.GetSecond());
m_videoCapture.SetOutputFilename(fileName);
```

#### VB6

```vb
VideoCapture1.Output_Filename = "C:\Captures\Video_" & Format(Now, "yyyymmdd_hhnnss") & ".wmv"
```

These examples create a timestamped filename to ensure each captured file has a unique name.

When designing your application, consider these best practices:

1. Always verify device availability before attempting capture
2. Provide feedback during long encoding operations
3. Include a preview window so users can see what's being captured
4. Implement a file size monitor for long recordings
5. Test with various WMV profiles to ensure compatibility

## Conclusion

Implementing video capture to WMV format using external profiles provides flexibility and control over the capture process. The approach outlined in this guide works effectively in Delphi, C++ MFC, and VB6 development environments, allowing you to integrate professional-grade video capture capabilities into your applications.

By using external profiles, you can quickly switch between different quality settings without changing your code, which is ideal for applications that need to adapt to different use cases or hardware capabilities.

---

For additional code samples, visit our GitHub repository. If you need technical assistance with implementation, our support team is available to help.
---END OF PAGE---

# Local File: .\delphi\videocapture\video-input-crossbar.md

---
title: Delphi Video Input Source Selection with Crossbar
description: Master video input source selection in Delphi applications using crossbar technology. Learn to programmatically configure composite, S-Video, HDMI inputs with step-by-step code examples. Implement robust camera input switching for professional Delphi video capture applications.
sidebar_label: Video Input Selection (Crossbar)
---

# Selecting Video Input Sources with Crossbar Technology

## Introduction to Video Input Selection

When developing applications that capture video from external devices, you'll often need to handle multiple input sources. The crossbar is a crucial component in video capture systems that allows you to route different physical inputs (like composite, S-Video, HDMI) to your application. This guide walks you through the process of detecting, configuring, and selecting video inputs using the crossbar interface in Delphi, C++ MFC, and Visual Basic 6 applications.

## Understanding Crossbar Technology

Crossbar technology functions as a routing matrix in video capture devices, enabling the connection between various inputs and outputs. Modern capture cards and TV tuners frequently incorporate crossbar functionality to facilitate switching between different video sources such as:

- Composite video inputs
- S-Video connections
- Component video
- HDMI inputs
- TV tuner inputs
- Digital video interfaces

Properly configuring these connections programmatically is essential for applications that need to dynamically switch between different video sources.

## Implementation Steps Overview

The implementation process for configuring crossbar connections in your application involves three main steps:

1. Initializing the crossbar interface and verifying its availability
2. Enumerating available video inputs for selection
3. Connecting the selected input to the video decoder output

Let's examine each step in detail with sample code for Delphi, C++ MFC, and VB6 environments.

## Detailed Implementation Guide

### Step 1: Initialize the Crossbar Interface

Before you can work with input selection, you need to initialize the crossbar interface and verify it's available on the current capture device.

#### Delphi Implementation

```pascal
// Initialize the crossbar interface
CrossBarFound := VideoCapture1.Video_CaptureDevice_CrossBar_Init;

// Check if crossbar functionality is available
if CrossBarFound then
  ShowMessage('Crossbar functionality detected and initialized')
else
  ShowMessage('No crossbar available on this capture device');
```

#### C++ MFC Implementation

```cpp
// Initialize the crossbar interface
BOOL bCrossBarFound = m_videoCapture.Video_CaptureDevice_CrossBar_Init();

// Check if crossbar functionality is available
if (bCrossBarFound)
{
    AfxMessageBox(_T("Crossbar functionality detected and initialized"));
}
else
{
    AfxMessageBox(_T("No crossbar available on this capture device"));
}
```

#### VB6 Implementation

```vb
' Initialize the crossbar interface
Dim CrossBarFound As Boolean
CrossBarFound = VideoCapture1.Video_CaptureDevice_CrossBar_Init()

' Check if crossbar functionality is available
If CrossBarFound Then
    MsgBox "Crossbar functionality detected and initialized"
Else
    MsgBox "No crossbar available on this capture device"
End If
```

The initialization function returns a boolean value indicating whether the crossbar functionality is available on the current capture device. Not all capture devices support crossbar functionality, so this check is crucial.

### Step 2: Enumerate Available Video Inputs

Once you've confirmed that the crossbar is available, the next step is to retrieve a list of available inputs for the "Video Decoder" output. This allows users to select from available physical connections.
#### Delphi Implementation

```pascal
// Clear any existing connections and UI elements
VideoCapture1.Video_CaptureDevice_CrossBar_ClearConnections;
cbCrossbarVideoInput.Clear;

// Get count of available inputs for the "Video Decoder" output
var inputCount: Integer := VideoCapture1.Video_CaptureDevice_CrossBar_GetInputsForOutput_GetCount('Video Decoder');

// Populate UI with available inputs
for i := 0 to inputCount - 1 do
begin
  var inputName: String := VideoCapture1.Video_CaptureDevice_CrossBar_GetInputsForOutput_GetItem('Video Decoder', i);
  cbCrossbarVideoInput.Items.Add(inputName);
end;

// Select the first item by default if available
if cbCrossbarVideoInput.Items.Count > 0 then
  cbCrossbarVideoInput.ItemIndex := 0;
```

#### C++ MFC Implementation

```cpp
// Clear any existing connections and UI elements
m_videoCapture.Video_CaptureDevice_CrossBar_ClearConnections();
m_comboVideoInputs.ResetContent();

// Get count of available inputs for the "Video Decoder" output
int inputCount = m_videoCapture.Video_CaptureDevice_CrossBar_GetInputsForOutput_GetCount(_T("Video Decoder"));

// Populate UI with available inputs
for (int i = 0; i < inputCount; i++)
{
    CString inputName = m_videoCapture.Video_CaptureDevice_CrossBar_GetInputsForOutput_GetItem(_T("Video Decoder"), i);
    m_comboVideoInputs.AddString(inputName);
}

// Select the first item by default if available
if (m_comboVideoInputs.GetCount() > 0)
{
    m_comboVideoInputs.SetCurSel(0);
}
```

#### VB6 Implementation

```vb
' Clear any existing connections and UI elements
VideoCapture1.Video_CaptureDevice_CrossBar_ClearConnections
cboVideoInputs.Clear

' Get count of available inputs for the "Video Decoder" output
Dim inputCount As Integer
inputCount = VideoCapture1.Video_CaptureDevice_CrossBar_GetInputsForOutput_GetCount("Video Decoder")

' Populate UI with available inputs
Dim i As Integer
Dim inputName As String
For i = 0 To inputCount - 1
    inputName = VideoCapture1.Video_CaptureDevice_CrossBar_GetInputsForOutput_GetItem("Video Decoder", i)
    cboVideoInputs.AddItem inputName
Next i

' Select the first item by default if available
If cboVideoInputs.ListCount > 0 Then
    cboVideoInputs.ListIndex = 0
End If
```

Common input types you might encounter include:

- Composite
- S-Video
- HDMI
- Component
- TV Tuner

The exact list depends on your specific capture hardware capabilities.

### Step 3: Apply the Selected Input

After the user selects their desired input source, you need to apply this selection by establishing a connection between the selected input and the video decoder output.

#### Delphi Implementation

```pascal
// First clear any existing connections
VideoCapture1.Video_CaptureDevice_CrossBar_ClearConnections;

// Connect the selected input to the "Video Decoder" output
// Parameters: input name, output name, automatic signal routing
if cbCrossbarVideoInput.ItemIndex >= 0 then
begin
  var selectedInput: String := cbCrossbarVideoInput.Items[cbCrossbarVideoInput.ItemIndex];
  var success: Boolean := VideoCapture1.Video_CaptureDevice_CrossBar_Connect(selectedInput, 'Video Decoder', true);

  if success then
    ShowMessage('Successfully connected ' + selectedInput + ' to Video Decoder')
  else
    ShowMessage('Failed to establish connection');
end;
```

#### C++ MFC Implementation

```cpp
// First clear any existing connections
m_videoCapture.Video_CaptureDevice_CrossBar_ClearConnections();

// Connect the selected input to the "Video Decoder" output
// Parameters: input name, output name, automatic signal routing
int selectedIndex = m_comboVideoInputs.GetCurSel();
if (selectedIndex >= 0)
{
    CString selectedInput;
    m_comboVideoInputs.GetLBText(selectedIndex, selectedInput);

    BOOL success = m_videoCapture.Video_CaptureDevice_CrossBar_Connect(
        selectedInput, _T("Video Decoder"), TRUE);

    if (success)
    {
        CString msg;
        msg.Format(_T("Successfully connected %s to Video Decoder"), selectedInput);
        AfxMessageBox(msg);
    }
    else
    {
        AfxMessageBox(_T("Failed to establish connection"));
    }
}
```

#### VB6 Implementation

```vb
' First clear any existing connections
VideoCapture1.Video_CaptureDevice_CrossBar_ClearConnections

' Connect the selected input to the "Video Decoder" output
' Parameters: input name, output name, automatic signal routing
If cboVideoInputs.ListIndex >= 0 Then
    Dim selectedInput As String
    selectedInput = cboVideoInputs.Text

    Dim success As Boolean
    success = VideoCapture1.Video_CaptureDevice_CrossBar_Connect(selectedInput, "Video Decoder", True)

    If success Then
        MsgBox "Successfully connected " & selectedInput & " to Video Decoder"
    Else
        MsgBox "Failed to establish connection"
    End If
End If
```

The third parameter (`true`) enables automatic signal routing, which helps handle complex connection scenarios where intermediate routing might be required.

## Best Practices for Crossbar Implementation

For robust video input selection in your applications:

1. **Always initialize the crossbar first**: Check for availability before attempting operations
2. **Clear existing connections**: Before setting a new connection, clear any existing ones
3. **Handle missing crossbar gracefully**: Provide fallback options when crossbar functionality isn't available
4. **Validate selections**: Ensure a valid input is selected before attempting to establish connections
5. **Provide user feedback**: Inform users about successful or failed connection attempts

## Troubleshooting Common Issues

If you encounter problems with crossbar connections:

- Verify your capture device supports crossbar functionality
- Check that input and output names match exactly what the device reports
- Ensure proper device driver installation
- Use debug logging to track connection attempts
- Test with different input sources to isolate hardware-specific issues

## Conclusion

Proper implementation of crossbar technology in your video capture applications gives users the flexibility to work with multiple input sources.
By following the steps outlined in this guide, you can create a robust and user-friendly video input selection system for your applications regardless of whether you're developing in Delphi, C++ MFC, or Visual Basic 6.

The code samples provided demonstrate how to initialize the crossbar, enumerate available inputs, and connect selected inputs to the video decoder output. With these fundamentals in place, you can build sophisticated video capture applications that support a wide range of input devices and connection types.

---

For additional assistance with implementing this functionality, explore our other documentation pages and code samples repository for more advanced techniques and solutions.

---END OF PAGE---

# Local File: .\delphi\videocapture\video-renderer.md

---
title: Video Renderer Options for Delphi Video Capture
description: Implement optimal video renderers in your Delphi applications with this developer guide. Learn how to use Video Renderer, VMR9, and EVR with detailed code examples for better performance, hardware acceleration, and compatibility across different Windows environments.
sidebar_label: Select video renderer
---

# Video Renderer Selection Guide for TVFVideoCapture

## Overview of Available Renderers

When developing video capture applications with TVFVideoCapture, selecting the appropriate video renderer significantly impacts performance and compatibility. This guide provides detailed implementation examples for the three available renderer options in Delphi, C++, and VB6 environments.

## Standard Video Renderer

The standard Video Renderer utilizes GDI for drawing operations.
This renderer option is primarily recommended for:

- Legacy systems
- Environments where Direct3D acceleration is unavailable
- Maximum compatibility with older hardware

```pascal
// Delphi
VideoCapture1.Video_Renderer := VR_VideoRenderer;
```

```cpp
// C++ MFC
m_VideoCapture.SetVideo_Renderer(VR_VideoRenderer);
```

```vb
' VB6
VideoCapture1.Video_Renderer = VR_VideoRenderer
```

## Video Mixing Renderer 9 (VMR9)

VMR9 represents a modern filtering solution capable of leveraging GPU capabilities for enhanced rendering. Key advantages include:

- Hardware-accelerated video processing
- Advanced deinterlacing options
- Improved performance for high-resolution content

```pascal
// Delphi
VideoCapture1.Video_Renderer := VR_VMR9;
```

```cpp
// C++ MFC
m_VideoCapture.SetVideo_Renderer(VR_VMR9);
```

```vb
' VB6
VideoCapture1.Video_Renderer = VR_VMR9
```

### Accessing Deinterlacing Modes

VMR9 supports multiple deinterlacing techniques. The following code demonstrates how to retrieve available deinterlacing options:

```pascal
// Delphi
VideoCapture1.Video_Renderer_Deinterlace_Modes_Fill;
for i := 0 to VideoCapture1.Video_Renderer_Deinterlace_Modes_GetCount - 1 do
  cbDeinterlaceModes.Items.Add(VideoCapture1.Video_Renderer_Deinterlace_Modes_GetItem(i));
```

```cpp
// C++ MFC
m_VideoCapture.Video_Renderer_Deinterlace_Modes_Fill();
for (int i = 0; i < m_VideoCapture.GetVideo_Renderer_Deinterlace_Modes_GetCount(); i++)
{
    m_DeinterlaceCombo.AddString(m_VideoCapture.GetVideo_Renderer_Deinterlace_Modes_GetItem(i));
}
```

```vb
' VB6
VideoCapture1.Video_Renderer_Deinterlace_Modes_Fill
For i = 0 To VideoCapture1.Video_Renderer_Deinterlace_Modes_GetCount - 1
    cboDeinterlaceModes.AddItem VideoCapture1.Video_Renderer_Deinterlace_Modes_GetItem(i)
Next i
```

## Enhanced Video Renderer (EVR)

EVR is the recommended renderer for modern Windows environments (Vista and later).
This advanced renderer provides:

- Superior video acceleration capabilities
- Optimal performance on Windows 7/10/11
- Better resource utilization

```pascal
// Delphi
VideoCapture1.Video_Renderer := VR_EVR;
```

```cpp
// C++ MFC
m_VideoCapture.SetVideo_Renderer(VR_EVR);
```

```vb
' VB6
VideoCapture1.Video_Renderer = VR_EVR
```

## Managing Aspect Ratio and Display Options

When displaying video content, you'll often need to handle aspect ratio differences between the source video and the display area.

### Stretching the Video Image

To stretch the video to fill the entire display area:

```pascal
// Delphi
VideoCapture1.Screen_Stretch := true;
VideoCapture1.Screen_Update;
```

```cpp
// C++ MFC
m_VideoCapture.SetScreen_Stretch(true);
m_VideoCapture.Screen_Update();
```

```vb
' VB6
VideoCapture1.Screen_Stretch = True
VideoCapture1.Screen_Update
```

### Using Letterbox Mode (Black Borders)

For preserving the original aspect ratio with black borders:

```pascal
// Delphi
VideoCapture1.Screen_Stretch := false;
VideoCapture1.Screen_Update;
```

```cpp
// C++ MFC
m_VideoCapture.SetScreen_Stretch(false);
m_VideoCapture.Screen_Update();
```

```vb
' VB6
VideoCapture1.Screen_Stretch = False
VideoCapture1.Screen_Update
```

## Performance Considerations

When selecting a renderer for your application, consider these factors:

1. Target operating system version
2. Hardware capabilities of end-user systems
3. Video resolution and processing requirements
4. Compatibility needs for your deployment environment

---

Please get in touch with [support](https://support.visioforge.com/) if you need technical assistance with this implementation. Visit our [GitHub](https://github.com/visioforge/) repository for additional code samples and resources.
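Since the best renderer depends on the target operating system, the choice can be made at runtime. A minimal Delphi sketch, assuming `CheckWin32Version` from `SysUtils` is available in your Delphi version, using the renderer constants from this guide:

```pascal
// Prefer EVR on Windows Vista (version 6.0) and later, where it is the
// recommended renderer; fall back to VMR9 on older systems.
// CheckWin32Version is declared in SysUtils.
if CheckWin32Version(6, 0) then
  VideoCapture1.Video_Renderer := VR_EVR
else
  VideoCapture1.Video_Renderer := VR_VMR9;
```

The GDI-based `VR_VideoRenderer` remains a last-resort fallback for machines without working Direct3D acceleration.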
---END OF PAGE---

# Local File: .\delphi\videocapture\install\builder.md

---
title: TVFVideoCapture Integration for C++ Builder
description: Complete step-by-step guide for Delphi developers on installing and configuring TVFVideoCapture ActiveX control in C++ Builder environments. Learn implementation techniques across Builder versions 5/6, 2006, and newer releases.
sidebar_label: C++ Builder
---

# TVFVideoCapture Integration Guide for C++ Builder

This detailed installation guide walks you through the process of integrating the powerful TVFVideoCapture ActiveX control with your C++ Builder projects. We've provided separate instructions for different C++ Builder versions to ensure seamless implementation regardless of your development environment.

> Related products: [All-in-One Media Framework (Delphi / ActiveX)](https://www.visioforge.com/all-in-one-media-framework)

## Installation in Borland C++ Builder 5/6

Follow these detailed steps to properly install the TVFVideoCapture control in Borland C++ Builder 5/6:

1. Navigate to the main menu and select **Component → Import ActiveX Controls**

   ![Screenshot showing the Component menu with Import ActiveX Controls option](vcbcb5_1.webp)

2. From the available controls list, locate and select the **VisioForge Video Capture** item

3. Click the **Install** button to begin importing the ActiveX control

   ![Screenshot showing the ActiveX control selection dialog](vcbcb5_2.webp)

4. When prompted for confirmation, click the **Yes** button to proceed

   ![Screenshot showing the confirmation dialog](vcbcb5_3.webp)

5. Once the installation process completes successfully, you'll see a confirmation message

6. Click the **OK** button to finalize the installation

   ![Screenshot showing the successful installation message](vcbcb5_4.webp)

## Installation in C++ Builder 2006 and Later Versions

For more recent versions of C++ Builder (2006 and newer), follow this expanded installation process:

### Step 1: Create a New Package

Begin by creating a new package that will contain the TVFVideoCapture control

![Screenshot showing the new package creation dialog](vcbcb2006_4.webp)

### Step 2: Import the ActiveX Component

1. From the main menu, select **Component → Import Component**

   ![Screenshot showing the Component menu with Import Component option](vcbcb2006_2.webp)

2. In the dialog that appears, select the **Import ActiveX Control** radio button

3. Click the **Next** button to continue

   ![Screenshot showing the import type selection dialog](vcbcb2006_3.webp)

### Step 3: Select the TVFVideoCapture Control

1. Browse through the available ActiveX controls

2. Locate and select the **VisioForge Video Capture** item from the list

3. Click the **Next** button to proceed

   ![Screenshot showing the ActiveX control selection dialog](vcbcb2006_5.webp)

### Step 4: Configure Output Settings

1. Specify the desired package output folder for the component files

2. Click the **Next** button after selecting an appropriate location

   ![Screenshot showing the output folder selection dialog](vcbcb2006_5-1.webp)

### Step 5: Add Component to Package

1. Ensure the **Add unit to…** radio button is selected

2. Click the **Finish** button to complete the import process

   ![Screenshot showing the final import configuration dialog](vcbcb2006_6.webp)

### Step 6: Save and Install the Package

1. Save your project when prompted

   ![Screenshot showing the save project dialog](vcbcb2006_7.webp)

2. Install the package to make the component available in your development environment

   ![Screenshot showing the package installation dialog](vcbcb2006_8.webp)

3. Verify that the TVFVideoCapture ActiveX control has been successfully installed

   ![Screenshot showing the successful installation confirmation](vcbcb2006_9.webp)

## Additional Resources and Support

After completing the installation, you can begin using the TVFVideoCapture control in your applications. The component provides extensive functionality for video capture and processing operations.

For developers looking to explore additional implementation examples and techniques:

- Access our [GitHub repository](https://github.com/visioforge/) for code samples and example projects
- Contact our [technical support team](https://support.visioforge.com/) for personalized assistance with integration challenges
- Review our documentation for detailed API references and advanced usage scenarios

By following this installation guide, you'll have successfully integrated the TVFVideoCapture ActiveX control into your C++ Builder development environment, enabling powerful video capture capabilities in your applications.

---END OF PAGE---

# Local File: .\delphi\videocapture\install\delphi.md

---
title: TVFVideoCapture Installation Guide for Delphi
description: Complete step-by-step instructions for installing the TVFVideoCapture library in various Delphi versions. This developer guide covers installation processes for Delphi 6/7 through Delphi 11+, including troubleshooting common issues and configuration requirements.
sidebar_label: Delphi
---

# Comprehensive TVFVideoCapture Installation Guide for Delphi Developers

> Related products: [All-in-One Media Framework (Delphi / ActiveX)](https://www.visioforge.com/all-in-one-media-framework)

## Installation in Borland Delphi 6/7

The installation process for legacy Delphi 6/7 environments requires several specific steps to ensure proper integration of the TVFVideoCapture library.

### Step 1: Create a New Package

Begin by creating a new package in your Delphi 6/7 development environment.
![Creating a new package in Delphi 6/7](vcd6_1.webp)

### Step 2: Configure Library Paths

Add the TVFVideoCapture source directory to both the library and browser path settings. This allows Delphi to locate the necessary component files.

![Adding source directory to library paths](vcd6_2.webp)

### Step 3: Open the Library Package

Navigate to and open the library package file to prepare for installation.

![Opening the library package](vcd6_3.webp)

### Step 4: Install the Component Package

Complete the installation by selecting the install option within the package interface.

![Installing the package](vcd6_4.webp)

![Confirmation of successful installation](vcd6_5.webp)

### Architecture Limitations

While TVFVideoCapture offers both x86 and x64 architecture support, Delphi 6/7 only supports x86 due to platform limitations. Developers using these versions will need to use the 32-bit implementation exclusively.

## Installation Process for Delphi 2005 and Later Versions

Modern Delphi versions offer an improved installation workflow with enhanced capabilities.

### Step 1: Launch Delphi with Administrative Privileges

Ensure you run your Delphi IDE with administrative rights to prevent permission-related installation issues.

![Opening Delphi with admin rights](vcd2005_1.webp)

### Step 2: Access Options Dialog

Navigate to the Options menu to configure essential library settings.

![Accessing the Options window](vcd2005_11.webp)

### Step 3: Configure Source Directory Paths

Add the TVFVideoCapture source directory to both the library and browser path settings to ensure proper component discovery.

![Configuring source directory paths](vcd2005_2.webp)

### Step 4: Open the Component Library Package

Locate and open the library package file included with TVFVideoCapture.

![Opening the component library package](vcd2005_3.webp)

### Step 5: Complete Package Installation

Install the package through the IDE's package installation interface.
![Installing the component package](vcd2005_4.webp)

![Verification of successful installation](vcd2005_41.webp)

## Advanced Installation for Delphi 11 and Newer Releases

The latest Delphi versions require a slightly different approach that leverages modern project structures.

### Step 1: Locate and Open the Package Project

After installing the framework, navigate to the installation folder and open the `.dproj` package file.

### Step 2: Select the Appropriate Build Configuration

Choose the Release build configuration to ensure optimal component performance.

![Selecting Release build configuration](delphi11-1.png)

### Step 3: Install the Component Package

Complete the installation process through the IDE's package installation interface.

![Installing the component package](delphi11-2.png)

### Step 4: Verify Installation Success

Confirm that the installation completed successfully before proceeding with development.

![Verification of successful installation](delphi11-3.png)

## Project Configuration Requirements and Best Practices

### Multi-Architecture Support

TVFVideoCapture supports both x86 and x64 architectures, allowing you to develop applications for different platform targets. You can install both package versions simultaneously to support flexible deployment scenarios.

### Library Path Configuration

For proper component functionality, ensure that you've configured the correct library folder path in your application project settings. This path should point to the location containing the `.dcu` files for your target architecture.

To set this up:

1. Open your project options dialog
2. Navigate to the Library path section
3. Add the appropriate TVFVideoCapture library path
4. Save your project settings

This configuration ensures that your application can locate all required component resources during both development and runtime.

## Troubleshooting Common Installation Issues

When installing TVFVideoCapture, developers might encounter several known issues.
Here are solutions to the most frequent problems: ### 64-bit Package Installation Problems If you're having difficulties installing the 64-bit package version, refer to our [detailed guide for resolving Delphi 64-bit package installation issues](../../general/install-64bit.md). ### Resource File (.otares) Installation Issues Some developers encounter problems related to `.otares` files during package installation. For a step-by-step resolution process, see our [troubleshooting guide for .otares installation problems](../../general/install-otares.md). ## Technical Support and Additional Resources For developers requiring additional assistance with the installation process or component implementation: - Contact our [technical support team](https://support.visioforge.com/) for personalized installation assistance - Visit our [GitHub repository](https://github.com/visioforge/) for additional code samples and implementation examples - Check our documentation for advanced usage scenarios and integration patterns Following this installation guide will ensure that you have a properly configured development environment for creating powerful multimedia applications with TVFVideoCapture in your Delphi projects. ---END OF PAGE--- # Local File: .\delphi\videocapture\install\index.md --- title: TVFVideoCapture Installation Guide for IDEs description: Step-by-step guide for installing TVFVideoCapture library in Delphi, Visual Studio, C++ Builder, and VB6. Learn how to set up the framework, configure packages, and integrate ActiveX components in your development environment. sidebar_label: Installation --- # TVFVideoCapture Installation Guide ## Installation 1. **Download the latest version of the VisioForge All-in-One Media Framework**: Navigate to the [product page](https://www.visioforge.com/all-in-one-media-framework) on our official website and download the most up-to-date version of the TVFVideoCapture library. 
Ensure that you select the appropriate version that matches your development environment requirements.

2. **Run the setup file**: After the download completes, locate the setup file in your download directory and execute it. This will launch the installation process.

3. **Follow the installation wizard instructions**: The setup wizard will guide you through the installation steps. Carefully read each prompt, accept the license agreement, choose the installation directory, and proceed by clicking "Next".

4. **Completion**: Upon successful installation, go to the installation folder. Here, you will find a variety of framework samples and detailed documentation designed to assist you in integrating and utilizing the library effectively within your projects.

### Delphi packages installation

For detailed instructions on installing the TVFVideoCapture packages in your Delphi IDE, please refer to the following [Delphi installation guide](delphi.md).

### ActiveX installation

#### C++ Builder

For [C++ Builder](builder.md), the installation process involves importing the ActiveX control into your project. This straightforward process ensures that you can quickly start using the TVFVideoCapture library in your C++ Builder projects.

#### Visual Basic 6

In [Visual Basic 6](visual-basic-6.md), open your project and go to the "Project" menu. Select "Components" and click "Browse". Find the ActiveX .ocx file in the installation folder and add it to your project. The components will now be available in the toolbox for your VB6 applications.

#### Visual Studio 2010 and later

For [Visual Studio 2010 and newer](visual-studio.md) versions, open your project in the IDE, right-click on the toolbox, and select "Choose Items". Navigate to the COM components tab, click "Browse", and select the ActiveX .ocx file from the framework installation directory. This will add the components to your toolbox, allowing you to use them in your Visual Studio projects.
## Conclusion

By following this comprehensive guide, you should be able to smoothly install and integrate the TVFVideoCapture library into your chosen development environment. Should you encounter any issues or require further assistance, please refer to the detailed documentation included in the framework or contact our support team.

---END OF PAGE---

# Local File: .\delphi\videocapture\install\visual-basic-6.md

---
title: Installing TVFVideoCapture in VB6 for Delphi Users
description: A detailed step-by-step guide for Delphi developers on integrating the TVFVideoCapture ActiveX control within Visual Basic 6 environments. Learn how to enhance your cross-platform development capabilities with this powerful video capture solution.
sidebar_label: Visual Basic 6
---

# Integrating TVFVideoCapture with Visual Basic 6

## Overview and Compatibility

Microsoft Visual Basic 6 offers excellent compatibility with our TVFVideoCapture library through its ActiveX control interface. This integration empowers developers to significantly enhance their applications with advanced video capture capabilities while maintaining optimal performance characteristics.

Visual Basic 6 predates 64-bit Windows development and supports only 32-bit applications. Consequently, only the x86 version of our TVFVideoCapture library is compatible with VB6 development environments.

Despite this architectural limitation, our framework delivers exceptional performance within the 32-bit environment. The library provides full access to our comprehensive feature set, ensuring developers can implement sophisticated video capture solutions regardless of the 32-bit constraint.

## Detailed Installation Process

The following step-by-step guide will walk you through the complete process of installing and configuring the TVFVideoCapture ActiveX control in your Visual Basic 6 development environment.
### Step 1: Create a New Project Environment Begin by launching Visual Basic 6 and creating a new standard project that will serve as the foundation for your video capture implementation. ![Creating a new VB6 project](vcvb6_1.webp) ### Step 2: Access the Components Dialog Navigate to the Project menu and select the "Components" option to open the component selection dialog. This interface allows you to browse and select from available ActiveX controls. ![Opening the Components dialog](vcvb6_2.webp) ### Step 3: Select the TVFVideoCapture Component In the Components dialog, scroll through the available controls and locate the "VisioForge Video Capture" item. Check the box next to it to include this component in your toolbox. ![Selecting the VisioForge Video Capture component](vcvb6_3.webp) ### Step 4: Verify Successful Integration After adding the component, you should notice the new TVFVideoCapture control appearing in your VB6 toolbox. This confirms that the ActiveX control has been successfully integrated into your development environment. ![Verification of the added control](vcvb6_4.webp) ## Implementation Considerations When implementing the TVFVideoCapture control in your VB6 application, consider the following best practices: - Initialize the control early in your application lifecycle - Configure capture parameters before starting the capture process - Implement proper error handling for device connectivity issues - Release resources when they are no longer needed ## Technical Support and Additional Resources --- For technical questions or implementation challenges, please contact our [support team](https://support.visioforge.com/) who specialize in assisting developers with integration requirements. For additional code examples and implementation patterns, visit our [GitHub repository](https://github.com/visioforge/) which contains numerous samples demonstrating optimal usage patterns. 
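The implementation considerations above can be sketched as a minimal VB6 outline. Note that `SelectDevice`, `OutputFilename`, `StartCapture`, and `StopCapture` are hypothetical placeholder members used only to illustrate the lifecycle — consult the control's type library (View -> Object Browser in VB6) for the actual property and method names:

```vb
Private Sub Form_Load()
    ' Initialize the control early in the application lifecycle
    ' and handle device connectivity issues with an error trap.
    On Error GoTo InitError

    ' Configure capture parameters BEFORE starting the capture
    ' process (member names below are illustrative placeholders).
    VideoCapture1.SelectDevice 0
    VideoCapture1.OutputFilename = "C:\capture\output.avi"
    VideoCapture1.StartCapture
    Exit Sub

InitError:
    MsgBox "Capture device error: " & Err.Description
End Sub

Private Sub Form_Unload(Cancel As Integer)
    ' Release resources when they are no longer needed.
    VideoCapture1.StopCapture
End Sub
```

This pattern keeps all device configuration in one place and guarantees the device is released even when the form closes unexpectedly.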
---END OF PAGE--- # Local File: .\delphi\videocapture\install\visual-studio.md --- title: TVFVideoCapture Integration for Delphi in VS description: Complete step-by-step guide for Delphi developers on installing and configuring TVFVideoCapture ActiveX controls in Visual Studio environments. Learn how to leverage video capture functionality in your development projects. sidebar_label: Visual Studio 2010 and later --- # Installing TVFVideoCapture in Visual Studio 2010 and Later ## Overview of TVFVideoCapture Integration The TVFVideoCapture ActiveX control provides powerful video capture capabilities for your development projects. This guide walks you through the installation process in Visual Studio environments, with special considerations for Delphi developers. ## Installation Requirements Before beginning the installation process, ensure you have: - Visual Studio 2010 or a later version installed - Administrator rights on your development machine - Both x86 and x64 ActiveX controls registered (if applicable) ## Installation Process for Different Project Types You can implement the TVFVideoCapture ActiveX control directly in various project types. The integration approach differs slightly depending on your development environment: ### For C++ Projects In C++ projects, you can use the ActiveX control directly without additional wrappers or interfaces. ### For C#/VB.Net Projects When working with C# or Visual Basic .NET projects, Visual Studio automatically generates a custom wrapper assembly. This wrapper exposes the ActiveX API through managed code, making integration seamless. ## Step-by-Step Installation Guide Follow these detailed steps to install the TVFVideoCapture control in your Visual Studio environment: 1. Create a new project in your preferred language (C++, C#, or Visual Basic .NET) 2. Access the toolbox panel in your Visual Studio interface ![Opening the toolbox](vcvs_1.webp) 3. 
Right-click on the toolbox and select "Choose toolbox items" from the context menu ![Accessing toolbox items dialog](vcvs_2.webp) 4. In the dialog box that appears, locate and select the "VisioForge Video Capture" component ![Selecting the video capture component](vcvs_3.webp) 5. After selection, the control will be added to your toolbox for easy access ![Control added to toolbox](vcvs_4.webp) 6. Add the control to your form by dragging it from the toolbox 7. For .NET projects, Visual Studio will automatically generate the necessary wrapper assembly ## Framework Samples and Resources For practical implementation examples, refer to the framework samples included with your installation package. These samples cover all supported programming languages and demonstrate various integration scenarios. ## Recommendations for .NET Developers While ActiveX integration is fully supported, .NET developers may benefit from using the native .NET version of the SDK. The native implementation offers: - Enhanced performance and stability - Direct integration with WinForms and WPF - MAUI control support for cross-platform development - More intuitive API design for .NET environments ## Additional Resources and Support Explore our extensive documentation for advanced configuration options and optimization techniques. Our development team continuously updates resources to address common implementation challenges. --- For technical assistance with this installation process, please contact our [support team](https://support.visioforge.com/). Additional code samples and implementation examples are available on our [GitHub repository](https://github.com/visioforge/). ---END OF PAGE--- # Local File: .\delphi\videoedit\changelog.md --- title: TVFVideoEdit Library Version History & Updates description: Detailed version history of the TVFVideoEdit library for Delphi and ActiveX developers. 
Explore all features, bug fixes, performance improvements, and compatibility updates from versions 2.1 through 10.0, including Windows 8 support, FFMPEG integration, and advanced video effects. sidebar_label: Changelog --- # TVFVideoEdit Library: Complete Version History ## Version 10.0 - Latest Release ### Core Improvements - **Enhanced Media Compatibility**: Added dedicated MP3 splitter component to resolve playback issues with problematic MP3 files that fail with the default splitter - **Audio Processing Enhancements**: Significantly improved information extraction and metadata reading for Speex audio files - **Performance Optimization**: Fixed critical memory leak in FFMPEG source implementation for better resource management - **Expanded Format Support**: YUV2RGB filter now fully supports HDYC format for professional video workflows ## Version 8.7 - Engine Updates ### Technical Enhancements - **VLC Integration**: Updated VLC engine to latest stable release (libVLC 2.2.1.0) for improved codec support - **Decoding Capabilities**: Implemented latest FFMPEG engine version with expanded format compatibility ## Version 8.6 - Stability Improvements ### Bug Fixes & Additions - **Memory Management**: Resolved critical memory leak affecting long-running applications - **File Handling**: Fixed issues with incorrectly closed input and output files that caused resource locking - **WebM Support**: Added new high-performance WebM filters based on the official WebM project specifications ## Version 8.4 - Platform Expansion ### Development Environment Support - **Modern Delphi**: Added full Delphi XE8 integration and compatibility - **Architecture Expansion**: Introduced both Delphi and ActiveX 64-bit (x64) implementations ## Version 8.3.1 - Compatibility Update ### Development Tools - **IDE Support**: Added complete Delphi XE7 compatibility and integration ## Version 8.3 - Performance Release ### Core Improvements - **Decoder Update**: Substantially improved FFMPEG decoder 
implementation - **Stability**: Fixed multiple bugs affecting reliability and performance ## Version 8.0 - Major Engine Upgrade ### Key Features - **Playback Architecture**: Implemented VLC engine for enhanced video/audio file playback capabilities - **Reliability**: Resolved several critical bugs affecting performance ## Version 7.15 - Security Features ### Media Protection - **Content Security**: Added encrypted video file playback functionality - **Stability**: Implemented minor bug fixes for improved reliability ## Version 7.2 - Effects & Performance ### Visual Enhancements - **FFMPEG Implementation**: Updated FFMPEG decoder for better format support - **Video Effects**: Added professional pan/zoom video effect capabilities - **Reliability**: Fixed minor bugs for improved stability ## Version 7.0 - Windows 8 & FFMPEG ### Platform Support - **Operating System**: Added full Windows 8 RTM support - **Media Handling**: Integrated comprehensive FFMPEG decoding capabilities - **Visual Effects**: Substantially improved video effects processing quality ## Version 6.0 - Windows 8 Preview ### Platform Expansion - **Early Adoption**: Added Windows 8 Developer Preview compatibility - **Visual Processing**: Enhanced quality and performance of video effects ## Version 3.4 - Maintenance Release ### Stability Improvements - **Bug Fixes**: Resolved multiple issues affecting reliability ## Version 3.3 - Delphi XE2 Support ### Developer Tools - **IDE Compatibility**: Added full Delphi XE2 support and integration - **Stability**: Implemented various bug fixes for improved reliability ## Version 3.2 - Effects & Demos ### Enhanced Capabilities - **Visual Effects**: Significantly improved video effects processing - **Developer Resources**: Added additional demo applications for easier implementation ## Version 3.1 - Effects Upgrade ### Visual Processing - **Effects Engine**: Enhanced video effects processing capabilities - **Stability**: Fixed multiple bugs for improved reliability 
## Version 3.0 - Feature Expansion ### Major Enhancements - **Effects System**: Substantially improved effects filter functionality - **Streaming**: Added MMS / WMV stream playback support - **Video Analysis**: Implemented motion detection capabilities - **Compositing**: Added professional chroma-key functionality - **Core Performance**: Significantly improved underlying engine ## Version 2.2 - Effects Update ### Visual Processing - **Effects Quality**: Enhanced effects filter implementation for better visual results ## Version 2.1 - Initial Effects ### First Implementations - **Visual Processing**: Introduced initial effects filter capabilities ---END OF PAGE--- # Local File: .\delphi\videoedit\deployment.md --- title: TVFVideoEdit Library Deployment Guide description: Learn how to deploy the TVFVideoEdit library in your Delphi and ActiveX applications. This step-by-step guide covers installation options, required components, and troubleshooting for developers implementing video editing functionality. sidebar_label: Deployment --- # TVFVideoEdit Library Deployment Guide ## Introduction The TVFVideoEdit library provides powerful video editing capabilities for your Delphi and ActiveX applications. This guide explains how to properly deploy all necessary components to ensure your application functions correctly on end-user systems without requiring the full development framework. ## Deployment Options You have two primary methods for deploying the TVFVideoEdit library components: automatic installers or manual installation. Each approach has specific advantages depending on your distribution requirements. 
### Automatic Silent Installers

For streamlined deployment, we offer silent installer packages that handle all necessary component installation without user interaction:

#### Required Base Package

* **Base components** (always required):
  * [Delphi version](http://files.visioforge.com/redists_delphi/redist_video_edit_base_delphi.exe)
  * [ActiveX version](http://files.visioforge.com/redists_delphi/redist_video_edit_base_ax.exe)

#### Optional Feature Packages

* **FFMPEG package** (required for file and IP camera support when using the FFMPEG source engine):
  * [x86 architecture](http://files.visioforge.com/redists_delphi/redist_video_edit_ffmpeg.exe)
* **MP4 output package** (for MP4 video creation):
  * [x86 architecture](http://files.visioforge.com/redists_delphi/redist_video_edit_mp4.exe)

### Manual Installation Process

For situations where you need precise control over component deployment, follow these detailed steps:

1. **Install Visual C++ Dependencies**
   * Install the VC++ 2010 SP1 redistributable:
     * [x86 version](http://files.visioforge.com/shared/vcredist_2010_x86.exe)
     * [x64 version](http://files.visioforge.com/shared/vcredist_2010_x64.exe)
2. **Deploy Core Media Foundation Components**
   * Copy all MFP DLLs from the `Redist\Filters` directory to your application folder
3. **Register DirectShow Filters**
   * Copy and COM-register these essential DirectShow filters using [regsvr32.exe](https://support.microsoft.com/en-us/help/249873/how-to-use-the-regsvr32-tool-and-troubleshoot-regsvr32-error-messages):
     * `VisioForge_Audio_Effects_4.ax`
     * `VisioForge_Dump.ax`
     * `VisioForge_RGB2YUV.ax`
     * `VisioForge_Screen_Capture.ax`
     * `VisioForge_Video_Effects_Pro.ax`
     * `VisioForge_Video_Mixer.ax`
     * `VisioForge_Video_Resize.ax`
     * `VisioForge_WavDest.ax`
     * `VisioForge_YUV2RGB.ax`
     * `VisioForge_FFMPEG_Source.ax`
4.
**Configure Path Settings** * Add the folder containing these filters to the system environment variable `PATH` if your application executable resides in a different directory ## Additional Components Installation ### FFMPEG Integration To enable advanced media format support: * Copy all files from the `Redist\FFMPEG` folder * Add this folder to the Windows system `PATH` variable * Register all .ax files from the `Redist\FFMPEG` folder ### VLC Support For extended format compatibility: * Copy all files from the `Redist\VLC` folder * COM-register the .ax file using regsvr32.exe * Create an environment variable named `VLC_PLUGIN_PATH` * Set its value to point to the `VLC\plugins` folder ### Audio Output Support For MP3 encoding capabilities: * Copy the lame.ax file from the `Redist\Formats` folder * Register the lame.ax file using regsvr32.exe ### WebM Format Support For WebM encoding and decoding: * Install the necessary free codecs available from the [xiph.org website](https://www.xiph.org/dshow/) ### Matroska Container Support For MKV format compatibility: * Install [Haali Matroska Splitter](http://haali.su/mkv/) for proper encoding and decoding ### MP4 H264/AAC Output - Modern Encoder For high-quality MP4 creation with modern codecs: * Copy `libmfxsw32.dll` / `libmfxsw64.dll` files * Register these DirectShow filters: * `VisioForge_H264_Encoder.ax` * `VisioForge_MP4_Muxer.ax` * `VisioForge_AAC_Encoder.ax` * `VisioForge_Video_Resize.ax` ### MP4 H264/AAC Output - Legacy Encoder For compatibility with older systems: * Copy `libmfxxp32.dll` / `libmfxxp64.dll` files * Register these DirectShow filters: * `VisioForge_H264_Encoder_XP.ax` * `VisioForge_MP4_Muxer_XP.ax` * `VisioForge_AAC_Encoder_XP.ax` * `VisioForge_Video_Resize.ax` ## Bulk Registration Utility To simplify the registration process for multiple DirectShow filters: * Place the `reg_special.exe` utility from the redistributable package into the folder containing your filters * Run it with administrator 
privileges to register all compatible filters in that directory ## Troubleshooting Tips Common issues during deployment often include: * Missing dependencies * Incorrect registration of COM components * Path configuration problems * Insufficient user permissions Ensure all required files are properly deployed and registered before launching your application. --- Please contact [our support team](https://support.visioforge.com/) if you encounter any issues with this deployment process. Visit our [GitHub repository](https://github.com/visioforge/) for additional code samples and implementation examples. ---END OF PAGE--- # Local File: .\delphi\videoedit\index.md --- title: TVFVideoEdit Library for Delphi Developers description: Powerful video editing library for Delphi applications. Learn how to build advanced video editing software with TVFVideoEdit - supporting multiple formats, effects, transitions, timeline editing, motion detection, encryption, and audio processing capabilities. sidebar_label: TVFVideoEdit --- # TVFVideoEdit for Delphi / ActiveX Development ## Introduction to TVFVideoEdit The TVFVideoEdit library empowers Delphi developers to integrate sophisticated video editing functionality into their applications. This robust SDK provides a complete framework for handling diverse media operations while maintaining excellent performance and stability across projects of varying complexity. ## Core Capabilities ### Format Support TVFVideoEdit accommodates a wide array of video and audio formats, enabling seamless work with most industry-standard media types. This extensive compatibility ensures your application can process virtually any file users might import. 
### Video Processing

The library excels in fundamental video manipulation tasks, offering precise control over:

- Resolution adjustment
- Frame rate conversion
- Aspect ratio modification
- Color correction tools
- Quality enhancement algorithms

### Effects and Transitions

Enhance your application with:

- Professional visual effects
- Smooth transitions between clips
- Custom animation capabilities
- Real-time preview functionality
- Text overlay with font control
- Image compositing
- Watermarking capabilities
- Custom graphic insertion

### Audio Processing

Create complete multimedia solutions with:

- Volume normalization
- Equalization tools
- Audio effects application
- Voice enhancement options

## Technical Advantages

### Output Flexibility

Export finished projects in multiple formats with customizable quality settings to meet specific distribution requirements.

### Timeline Precision

The timeline-based editing framework gives developers granular control over media positioning, transitions, and effects timing.

### Multi-Platform Compatibility

Deploy your video editing solutions across different Windows environments with consistent performance and reliability.

### Distribution Rights

TVFVideoEdit supports royalty-free distribution, making it a cost-effective solution for commercial software development.

## Resources for Developers

Accelerate your development with these valuable resources:

- [Product Information](https://www.visioforge.com/all-in-one-media-framework)
- [API Documentation](https://api.visioforge.com/delphi/video_edit_sdk/index.html)
- [Changelog](changelog.md)
- [Installation Guide](install/index.md)
- [Deployment Instructions](deployment.md)
- [License Agreement](../../eula.md)

---END OF PAGE--- # Local File: .\delphi\videoedit\install\builder.md --- title: TVFVideoEdit Integration for C++ Builder description: Step-by-step guide for installing and configuring TVFVideoEdit ActiveX components in all versions of C++ Builder (5/6, 2006, and newer).
Learn how to import controls, create packages, and add features to your media applications. sidebar_label: C++ Builder --- # Complete Guide to TVFVideoEdit Installation in C++ Builder > Related products: [VisioForge All-in-One Media Framework (Delphi / ActiveX)](https://www.visioforge.com/all-in-one-media-framework) ## Introduction to TVFVideoEdit for C++ Builder The TVFVideoEdit library provides powerful media processing capabilities for C++ Builder applications. This guide walks you through the installation process across different C++ Builder versions. Before you begin development, you'll need to properly install the ActiveX control into your IDE environment where it will be accessible through the component palette. ## Installation Process for Borland C++ Builder 5/6 ### Accessing the Import Menu Begin the installation process by navigating to the Component menu in your IDE: 1. Launch your Borland C++ Builder 5/6 environment 2. From the main menu, select **Component -> Import ActiveX Controls** ![Screenshot showing Component menu and Import ActiveX Controls option](bcb6_1.webp) ### Selecting the Video Edit Control In the Import ActiveX Control dialog: 1. Locate and select the **"VisioForge Video Edit Control"** from the available controls list 2. Click the **Install** button to begin the import process ![Screenshot showing ActiveX control selection dialog](bcb6_2.webp) ### Confirming Installation The system will prompt you to confirm the installation: 1. A confirmation dialog will appear 2. Click the **Yes** button to proceed with the installation ![Screenshot showing installation confirmation dialog](bcb6_3.webp) ### Verifying Successful Installation After installation completes: 1. The control will be added to your component palette 2. 
You can now use it in your C++ Builder projects ![Screenshot showing successful installation with component in palette](bcb6_4.webp) ## Installation Guide for C++ Builder 2006 and Later Versions Modern versions of C++ Builder require a different installation approach using packages. ### Creating a New Package First, you'll need to create a package for the component: 1. Open C++ Builder 2006 or later 2. Select **File -> New -> Package** 3. This will create the foundation for adding the ActiveX control ![Screenshot showing new package creation dialog](bcb2006_1-1.webp) ### Importing the ActiveX Component Next, import the ActiveX control into your environment: 1. Navigate to **Component → Import Component** in the main menu 2. This opens the import wizard for adding new components ![Screenshot showing the Component Import menu option](vcbcb2006_2.webp) ### Selecting Import Type In the import wizard: 1. Select the **Import ActiveX Control** radio button option 2. Click the **Next** button to proceed to component selection ![Screenshot showing import type selection dialog](bcb2006_3-1.webp) ### Choosing the Video Edit Control From the available ActiveX controls: 1. Find and select the **"VisioForge Video Edit 5 Control"** from the list 2. Click **Next** to continue with the import process ![Screenshot showing ActiveX control selection in newer Builder versions](bcb2006_4-1.webp) ### Configuring Output Location Specify where the component files should be stored: 1. Choose an appropriate package output folder for your development environment 2. Click **Next** to proceed with configuration ![Screenshot showing output folder selection dialog](bcb2006_5-1.webp) ### Finalizing Component Import Complete the import process: 1. Select the **Add unit to…** radio button option 2. Click the **Finish** button to create the component wrapper ![Screenshot showing import finalization dialog](bcb2006_6-1.webp) ### Saving the Package Project After import completion: 1. 
The system will prompt you to save your package project 2. Choose an appropriate location and name for your package files ![Screenshot showing package save dialog](bcb2006_7-1.webp) ### Installing the Component Package To make the component available in the IDE: 1. Right-click on the package in the Project Manager 2. Select **Install** from the context menu 3. This compiles and registers the package with the IDE ![Screenshot showing package installation option](bcb2006_8-1.webp) ### Verification and Usage Once installed: 1. The TVFVideoEdit control appears in your component palette 2. It's now ready to use in your C++ Builder applications 3. You can drag and drop it onto forms just like native components ![Screenshot showing successfully installed component in palette](bcb2006_9-1.webp) ## Additional Resources and Support ### Getting Help with Implementation If you encounter any issues during installation or implementation: 1. Our technical support team is available to assist 2. Contact [support](https://support.visioforge.com/) with specific questions 3. Provide details about your Builder version and installation environment ### Code Examples and Documentation To accelerate your development process: 1. Visit our [GitHub repository](https://github.com/visioforge/) for code samples 2. Find implementation examples for common media processing tasks 3. Access additional documentation on component features and usage ## Troubleshooting Common Installation Issues When installing the TVFVideoEdit component, developers may encounter several common issues: 1. **Missing Dependencies**: Ensure all required dependencies are installed 2. **Registration Problems**: Verify ActiveX registration status in Windows registry 3. **IDE Compatibility**: Check compatibility between component and Builder version 4. 
**Package Conflicts**: Resolve any conflicts with existing packages By following this detailed guide, you'll have TVFVideoEdit successfully integrated into your C++ Builder environment and ready for implementing advanced media functionality in your applications. ---END OF PAGE--- # Local File: .\delphi\videoedit\install\delphi.md --- title: TVFVideoEdit Installation Guide for Delphi Developers description: Step-by-step instructions for installing and configuring TVFVideoEdit library in various Delphi versions (6/7, 2005+, 11+). Learn how to properly set up library paths, build packages, and troubleshoot common installation issues. sidebar_label: Borland Delphi --- # TVFVideoEdit Installation Guide for Delphi Developers > Related products: [VisioForge All-in-One Media Framework (Delphi / ActiveX)](https://www.visioforge.com/all-in-one-media-framework) ## Installation Requirements Before beginning the installation process, ensure that you have: 1. Appropriate Delphi version installed and properly configured 2. Administrative rights for package installation 3. Downloaded the latest version of the TVFVideoEdit library ## Installing in Borland Delphi 6/7 ### Step 1: Configure Library Paths Begin by opening the "Options" window in your Delphi IDE. ![Screenshot showing how to open Options window](ved6_1.webp) Navigate to the Library section and add the source directory to both the library and browser paths. This ensures that Delphi can locate the necessary files. ![Screenshot showing library path configuration](ved6_2.webp) ### Step 2: Open and Install the Package Locate and open the main package file from the library. ![Screenshot showing how to open the package](ved6_3.webp) Install the package by clicking the Install button in the IDE. This registers the components with Delphi's component palette. 
![Screenshot showing install button location](ved6_4.webp) ![Screenshot showing successful installation](ved6_5.webp) ### Architecture Considerations The library includes both x86 and x64 architecture versions. However, for Delphi 6/7, you must use the x86 version as these Delphi versions do not support 64-bit development. ## Installing in Delphi 2005 and Later ### Step 1: Launch with Administrative Privileges For Delphi 2005 and later versions, launch the IDE with administrative rights to ensure proper installation permissions. ![Screenshot showing Delphi 2005 startup](ved2005_1.webp) ![Screenshot showing Delphi 2005 IDE](ved2005_2.webp) ### Step 2: Configure Library Paths Open the Options window and navigate to the Library section. Add the source directory to both the library and browser paths. ![Screenshot showing library path configuration in Delphi 2005](ved2005_3.webp) ### Step 3: Install the Package Open the main package file from the library source directory. ![Screenshot showing package opening in Delphi 2005](ved2005_4.webp) Click the Install button to register the components with Delphi's component palette. ![Screenshot showing install button location in Delphi 2005](ved2005_5.webp) ![Screenshot showing successful installation in Delphi 2005](ved2005_6.webp) ### Architecture Support For Delphi 2005 and later versions, both x86 and x64 versions are available. You can utilize the 64-bit version if you need to develop 64-bit applications. Note that the IDE itself may require the x86 version for design-time operations. ## Installing in Delphi 11 and Later Modern Delphi versions feature a streamlined installation process: 1. Open the library `.dproj` package file located in the library folder after installation 2. Select the Release build configuration from the dropdown menu 3. Build and install the package using the IDE's build commands 4. 
The components will be registered and ready to use ## Project Configuration Best Practices You can install both x86 and x64 packages based on your project requirements. Ensure you've properly configured your application's library path settings: 1. Add the correct library folder path to your project options 2. Configure the path to properly locate `.dcu` files 3. Verify architecture compatibility between your project and the installed packages ## Troubleshooting Common Installation Issues If you encounter problems during installation, check these common issues: ### Delphi 64-bit Package Installation Problems Some specific issues can occur when installing 64-bit packages. See our [detailed troubleshooting guide](../../general/install-64bit.md) for solutions. ### Issues with .otares Files Installation problems related to `.otares` files are documented in our [dedicated troubleshooting page](../../general/install-otares.md). ## Additional Resources and Support For additional code examples and implementations, visit our [GitHub repository](https://github.com/visioforge/) where we maintain a collection of sample projects. If you need personalized assistance with installation or implementation, please contact our [technical support team](https://support.visioforge.com/) who can provide guidance specific to your development environment. --- For technical questions or installation assistance with this library, please reach out to our [development support team](https://support.visioforge.com/). Browse additional code samples and resources on our [GitHub](https://github.com/visioforge/) page. ---END OF PAGE--- # Local File: .\delphi\videoedit\install\index.md --- title: TVFVideoEdit installation in IDEs description: How to install TVFVideoEdit in Delphi and other IDEs sidebar_label: Installation --- # TVFVideoEdit library installation guide The library is available as a Delphi package exclusively for Delphi developers, offering tailored functionality and integration. 
Additionally, the ActiveX control version is versatile and can be used in MFC, VB6, or any other ActiveX-compatible IDE, providing broad compatibility and flexibility for developers across different platforms. This ensures a robust and adaptable development experience regardless of your preferred environment.

## Installation

1. **Download the latest version of the All-in-One Media Framework**: Visit the [product page](https://www.visioforge.com/all-in-one-media-framework) and download the most recent version of the framework that is suitable for your needs.
2. **Run the setup file**: Once the download is complete, locate the setup file in your download directory and run it. This will start the installation process.
3. **Follow the installation wizard instructions**: The installation wizard will guide you through each step of the process. Carefully read and follow the prompts, accept the license agreement, select the installation directory, and continue by clicking the "Next" button.
4. **Completion**: After the installation is complete, navigate to the installation folder. Here, you will find various library samples and comprehensive documentation designed to assist you in getting started with integrating the library into your projects.

### Delphi packages installation

For detailed instructions on installing the TVFVideoEdit packages in your Delphi IDE, please refer to the [Delphi installation guide](delphi.md).

### ActiveX installation

#### C++ Builder

For [C++ Builder](builder.md), the installation process involves importing the ActiveX control into your project. This straightforward process ensures that you can quickly start using the TVFVideoEdit library in your C++ Builder projects.

#### Visual Basic 6

In [Visual Basic 6](visual-basic-6.md), open your project and go to the "Project" menu. Select the "Components" item, then click "Browse" and find the ActiveX .ocx file in the installation folder.
Add it to your project to make the components available in your toolbox. #### Visual Studio 2010 and later For [Visual Studio 2010 and newer](visual-studio.md) versions, open your project in the IDE, right-click on the toolbox, and select "Choose Items". Navigate to the COM components tab, click "Browse", and select the ActiveX .ocx file from the framework's installation directory. This will add the components to your toolbox, allowing their use in your Visual Studio projects. ## Conclusion By following this detailed guide, you can successfully install and integrate the TVFVideoEdit library into your preferred development environment. If you encounter any issues or need further assistance, please refer to the documentation provided with the framework or contact our support team for help. ---END OF PAGE--- # Local File: .\delphi\videoedit\install\visual-basic-6.md --- title: Integrating TVFVideoEdit ActiveX in VB6 Projects description: Learn how to successfully install and implement the TVFVideoEdit ActiveX control in Visual Basic 6 development environments. This step-by-step guide shows developers how to enhance their applications with powerful video editing capabilities. sidebar_label: Visual Basic 6 --- # Installing TVFVideoEdit ActiveX Control in Visual Basic 6 ## Introduction Visual Basic 6 remains a popular development environment for creating Windows applications. By leveraging our TVFVideoEdit library as an ActiveX control, developers can incorporate advanced video editing and processing capabilities into their VB6 applications without extensive coding. ## Technical Requirements and Limitations Microsoft Visual Basic 6 operates as a 32-bit development platform and cannot produce 64-bit applications. Due to this architectural constraint, only the x86 (32-bit) version of our library is compatible with VB6 projects. Despite this limitation, the 32-bit implementation delivers excellent performance and provides full access to the library's extensive feature set. 
## Installation Process Follow these detailed steps to properly install the TVFVideoEdit ActiveX control in your Visual Basic 6 environment: ### Step 1: Create a New Project Begin by launching Visual Basic 6 and creating a new project: 1. Open Visual Basic 6 IDE 2. Select "New Project" from the File menu 3. Choose "Standard EXE" as the project type 4. Click "OK" to create the baseline project ![Creating a new VB6 project](vevb6_1.webp) ### Step 2: Access Components Dialog Next, you need to register the ActiveX control within your development environment: 1. In the menu, navigate to "Project" 2. Select "Components" to open the components dialog ![Opening the Components dialog](vevb6_2.webp) ### Step 3: Select the TVFVideoEdit Control From the Components dialog: 1. Scroll through the available controls 2. Locate and check the box for "VisioForge Video Edit Control" 3. Click "OK" to confirm your selection ![Selecting the Video Edit Control component](vevb6_3.webp) ### Step 4: Verify Control Registration After successful registration: 1. The TVFVideoEdit control icon appears in your toolbox 2. This confirms the control is ready for use in your application ![Control added to toolbox](vevb6_4.webp) ![Control icon in toolbox](vevb6_41.webp) ### Step 5: Implement the Control To begin using the control in your application: 1. Select the TVFVideoEdit control from the toolbox 2. Click and drag on your form to place an instance of the control 3. Size the control appropriately for your interface 4. Access properties and methods through the Properties window and code ## Advanced Implementation Tips * Set appropriate control properties before loading media files * Handle events for user interaction and processing notifications * Consider memory management when working with large video files * Test your application thoroughly with various media formats --- For technical questions or implementation challenges, contact our [support team](https://support.visioforge.com/). 
Access additional code examples and resources on our [GitHub repository](https://github.com/visioforge/). ---END OF PAGE--- # Local File: .\delphi\videoedit\install\visual-studio.md --- title: TVFVideoEdit Integration Guide for Visual Studio 2010+ description: Complete step-by-step tutorial for installing and configuring TVFVideoEdit ActiveX components in Visual Studio 2010 and later versions. Learn how to set up video editing capabilities in your C#, C++, or VB.NET projects. sidebar_label: Visual Studio 2010 and later --- # Installing TVFVideoEdit in Visual Studio 2010 and Later Versions ## Overview > Related products: [All-in-One Media Framework (Delphi / ActiveX)](https://www.visioforge.com/all-in-one-media-framework) TVFVideoEdit provides powerful video editing capabilities through ActiveX controls that integrate smoothly with various development environments. This guide walks you through the installation process specifically for Visual Studio 2010 and later versions. ## Compatibility Information The ActiveX control can be used directly in C++ projects without additional wrappers. For C# or VB.Net development, Visual Studio automatically creates a custom wrapper assembly that enables the ActiveX API in managed code environments. ## Prerequisites Before beginning the installation process, ensure you have: - Visual Studio 2010 or later installed on your development machine - Administrative privileges (required for ActiveX registration) - Both x86 and x64 ActiveX controls registered (Visual Studio might use x86 for the UI designer even when targeting x64) ## Step-by-Step Installation Guide ### Creating a New Project 1. Start Visual Studio and create a new project using C++, C#, or Visual Basic. 2. For this demonstration, we'll use a C# Windows Forms application, but the process applies similarly to VB.Net and C++ MFC projects. ![New project creation screen](vevs2003_1.webp) ### Adding the ActiveX Control to Your Toolbox 1. 
Right-click on the Toolbox panel in Visual Studio 2. Select the "Choose Items" option from the context menu that appears ![Opening Choose Items dialog](vevs2003_2.webp) ### Selecting the Video Edit Control 1. In the Choose Toolbox Items dialog, locate the COM Components tab 2. Browse through the list or use the search functionality 3. Find and select the "VisioForge Video Edit Control" item 4. Click OK to add the control to your toolbox ![Selecting the Video Edit Control](vevs2003_3.webp) ### Implementing the Control in Your Form 1. Locate the newly added control in your toolbox 2. Click and drag it onto your form design surface 3. The control is now ready for implementation in your application ![Adding control to the form](vevs2003_4.webp) ## Advanced Integration Options ### .NET Development Recommendations For developers working with .NET applications, we strongly recommend considering the native [.NET SDK](https://www.visioforge.com/video-edit-sdk-net) as an alternative to ActiveX integration. The .NET SDK offers several advantages: - Enhanced performance and stability - Native support for WinForms, WPF, and MAUI controls - Broader feature set and API capabilities - Simpler integration with modern development practices ## Troubleshooting Common Issues When integrating TVFVideoEdit, you might encounter these common challenges: - Registration issues: Ensure you have administrative privileges - Architecture mismatches: Verify both x86 and x64 versions are properly registered - Reference errors: Check that all required dependencies are included in your project ## Additional Resources If you encounter any difficulties following this tutorial or need specialized assistance with your implementation, our development team is available to provide technical guidance. 
- Access additional code samples on our [GitHub repository](https://github.com/visioforge/)
- Contact our [technical support team](https://support.visioforge.com/) for personalized assistance

---END OF PAGE---

# Local File: .\directshow\how-to-register.md

---
title: DirectShow Filter SDK Registration Guide
description: Complete guide to registering DirectShow filters and SDKs in multiple programming languages. Learn implementation techniques for C++, C#, and Delphi with code examples and alternative registration methods.
sidebar_label: DirectShow Filter SDK Registration Guide
---

# DirectShow Filter and SDK Registration Guide

DirectShow filters and SDK components often require proper registration to function correctly within your applications. This guide provides detailed implementation methods for registering DirectShow filters across multiple programming languages.

## Registration Overview

Most DirectShow filters in the SDK can be registered using the IVFRegister interface. This standardized approach works consistently across development environments. However, some specialized filters (like RGB2YUV converters) are designed to work without explicit registration.

## Registration Methods by Language

### C++ Implementation

The following C++ code demonstrates how to access the registration interface:

```cpp
// {59E82754-B531-4A8E-A94D-57C75F01DA30}
DEFINE_GUID(IID_IVFRegister,
    0x59E82754, 0xB531, 0x4A8E, 0xA9, 0x4D, 0x57, 0xC7, 0x5F, 0x01, 0xDA, 0x30);

/// <summary>
/// Filter registration interface.
/// </summary>
DECLARE_INTERFACE_(IVFRegister, IUnknown)
{
    /// <summary>
    /// Sets the license key.
    /// </summary>
    /// <param name="licenseKey">
    /// License Key.
    /// </param>
    STDMETHOD(SetLicenseKey)(THIS_ WCHAR* licenseKey) PURE;
};
```

### C# Implementation

For .NET developers, the registration interface can be imported using the following C# code:

```cs
/// <summary>
/// Public filter registration interface.
/// </summary>
[ComImport]
[System.Security.SuppressUnmanagedCodeSecurity]
[Guid("59E82754-B531-4A8E-A94D-57C75F01DA30")]
[InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
public interface IVFRegister
{
    /// <summary>
    /// Sets the license key.
    /// </summary>
    /// <param name="licenseKey">
    /// License Key.
    /// </param>
    [PreserveSig]
    void SetLicenseKey([In, MarshalAs(UnmanagedType.LPWStr)] string licenseKey);
}
```

### Delphi Implementation

For Delphi developers, implement the registration interface as follows:

```pascal
const
  IID_IVFRegister: TGUID = '{59E82754-B531-4A8E-A94D-57C75F01DA30}';

type
  /// <summary>
  /// Public filter registration interface.
  /// </summary>
  IVFRegister = interface(IUnknown)
    /// <summary>
    /// Sets the license key.
    /// </summary>
    /// <param name="licenseKey">
    /// License Key.
    /// </param>
    procedure SetLicenseKey(licenseKey: PWideChar); stdcall;
  end;
```

## Alternative Registration Approaches

Beyond the IVFRegister interface, several other registration methods are available:

### System Registry Registration

DirectShow filters can be registered directly in the Windows registry using appropriate registry keys. This approach is particularly useful for system-wide filter availability.

### Custom Build Integration

For specialized deployment scenarios, custom build processes can automate the registration of DirectShow filters during application installation or initialization.

### COM Registration

Standard COM registration techniques can also be applied to DirectShow filters, leveraging tools like regsvr32 for DLL-based filters.

## Best Practices for Filter Registration

When implementing DirectShow filter registration:

1. Consider application permission requirements
2. Handle registration failures gracefully
3. Implement unregistration logic for clean application removal
4. Test registration under various user permission scenarios

---END OF PAGE---

# Local File: .\directshow\index.md

---
title: DirectShow SDKs & Filters for Video Processing
description: Explore our complete collection of DirectShow filters and SDKs for professional video playback, processing, encoding, and application development.
Create powerful multimedia applications with our developer tools. sidebar_label: DirectShow SDKs and filters order: 19 icon: ../static/directshow.svg route: /docs/directshow/ --- # DirectShow SDKs and Filters for Video Development ## Introduction to DirectShow Technology DirectShow technology enables developers to create robust multimedia applications for capturing, processing, and playing video content. Our comprehensive suite of DirectShow filters and SDKs provides the essential building blocks for developing sophisticated video processing and playback applications with minimal effort. ## Playback and Decoding Solutions ### High-Performance Source Filters Our powerful source filters enable seamless playback of diverse video formats in your custom applications. Built upon industry-standard libraries, these filters ensure maximum compatibility and performance. #### VLC Source DirectShow Filter The VLC Source filter delivers exceptional playback capabilities for numerous media formats, leveraging the versatile VLC media framework for optimal performance and format support. - [Explore VLC Source DirectShow Filter](vlc-source-filter/index.md) #### FFMPEG Source DirectShow Filter Our FFMPEG-based filters provide unparalleled media format compatibility, utilizing the widely-adopted FFMPEG libraries to handle virtually any video or audio format in your applications. - [Discover FFMPEG Source DirectShow Filter](ffmpeg-source-filters/index.md) ## Advanced Encoding Solutions ### Professional Encoding Filters Our encoding filters enable developers to implement high-quality video and audio encoding directly within applications. The suite supports numerous industry-standard codecs for maximum flexibility. 
- [Browse the Complete Encoding Filters Pack](filters-enc/index.md) ## Video Processing and Enhancement ### Specialized Processing Filters Transform and enhance video content with our processing filters that enable rotation, scaling, color grading, overlay capabilities, and numerous other visual effects to create professional-quality video output. - [Explore Video Processing Filters Pack](proc-filters/index.md) ## Virtual Camera Implementation ### Virtual Camera Development SDK Create and integrate virtual camera devices into your applications with our specialized SDK. The virtual camera seamlessly interfaces with any DirectShow-compatible application, including popular video conferencing software. - [Learn about Virtual Camera SDK](virtual-camera-sdk/index.md) ## Secure Video Solutions ### Video Content Encryption SDK Implement robust video content protection with our encryption SDK. Secure your video assets while maintaining playback compatibility with standard DirectShow players such as Windows Media Player and MPC-BE. - [Discover Video Encryption SDK](video-encryption-sdk/index.md) ## Implementation Guides ### Technical Documentation and Tutorials Get started quickly with our detailed technical tutorials designed specifically for developers implementing DirectShow components: - [DirectShow Filter and SDK Registration Guide](how-to-register.md) ---END OF PAGE--- # Local File: .\directshow\ffmpeg-source-filters\index.md --- title: FFMPEG Source DirectShow Filter for Multimedia Apps description: Integrate powerful media playback into your applications with our FFMPEG Source DirectShow Filter. Full support for MP4, MKV, AVI, network streams, and seamless integration with Delphi, C++, and .NET projects. 
sidebar_label: FFMPEG Source DirectShow Filter
---

# FFMPEG Source DirectShow Filter

## Introduction

The FFMPEG Source DirectShow filter enables developers to seamlessly integrate advanced media decoding and playback capabilities into any DirectShow-compatible application. This powerful component bridges the gap between complex multimedia formats and your software development needs, providing a robust foundation for building media-rich applications.

## Key Features and Capabilities

Our filter comes bundled with all necessary FFMPEG DLLs and provides a feature-rich DirectShow filter interface that supports:

- **Extensive Format Compatibility**: Handle a wide range of video and audio formats including MP4, MKV, AVI, MOV, WMV, FLV, and many others without additional codec installations
- **Network Stream Support**: Connect to RTSP, RTMP, HTTP, UDP, and TCP streams for live media integration
- **Multiple Stream Management**: Select between video and audio streams in multi-stream media files
- **Advanced Seeking Capabilities**: Implement precise seeking functionality in your applications
- **GPU Acceleration**: Utilize hardware acceleration for optimal performance

## Implementation Examples

The SDK includes comprehensive sample applications for multiple development environments:

### Delphi Integration (Primary)

```delphi
// Initialize the FFMPEG Source filter in Delphi using DSPack
procedure TMainForm.InitializeFFMPEGSource;
var
  FFMPEGFilter: IBaseFilter;
  FileSource: IFileSourceFilter;
begin
  // Create FFMPEG Source filter instance
  // IMPORTANT: Ensure proper COM initialization before this call
  CoCreateInstance(CLSID_FFMPEGSource, nil, CLSCTX_INPROC_SERVER,
    IID_IBaseFilter, FFMPEGFilter);

  // Query for file source interface
  FFMPEGFilter.QueryInterface(IID_IFileSourceFilter, FileSource);

  // Load media file - can be local or network URL
  FileSource.Load('C:\media\sample.mp4', nil);

  // Add to filter graph for rendering
  FilterGraph.AddFilter(FFMPEGFilter, 'FFMPEG Source');

  // Connect to appropriate renderers or processing filters
  // FilterGraph.RenderStream(...);
end;
```

### .NET and C++ Options

The SDK also supports .NET applications (using the DirectShowNet library) and C++ development environments with equivalent functionality and similar implementation patterns.

### C++ Integration Example

```cpp
// Initialize the FFMPEG Source filter in C++ using DirectShow
HRESULT InitializeFFMPEGSource()
{
    HRESULT hr = S_OK;
    IGraphBuilder* pGraph = NULL;
    IMediaControl* pControl = NULL;
    IBaseFilter* pFFMPEGSource = NULL;
    IFileSourceFilter* pFileSource = NULL;

    // Initialize COM
    CoInitialize(NULL);

    // Create the filter graph manager
    hr = CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER,
                          IID_IGraphBuilder, (void**)&pGraph);
    if (FAILED(hr)) return hr;

    // Create the FFMPEG Source filter
    hr = CoCreateInstance(CLSID_FFMPEGSource, NULL, CLSCTX_INPROC_SERVER,
                          IID_IBaseFilter, (void**)&pFFMPEGSource);
    if (FAILED(hr)) goto cleanup;

    // Add the filter to the graph
    hr = pGraph->AddFilter(pFFMPEGSource, L"FFMPEG Source");
    if (FAILED(hr)) goto cleanup;

    // Get the IFileSourceFilter interface
    hr = pFFMPEGSource->QueryInterface(IID_IFileSourceFilter, (void**)&pFileSource);
    if (FAILED(hr)) goto cleanup;

    // Load the media file
    hr = pFileSource->Load(L"C:\\media\\sample.mp4", NULL);
    if (FAILED(hr)) goto cleanup;

    // Render the output pins of the FFMPEG Source filter
    hr = pGraph->Render(GetPin(pFFMPEGSource, PINDIR_OUTPUT, 0));

    // Get the media control interface for playback control
    hr = pGraph->QueryInterface(IID_IMediaControl, (void**)&pControl);
    if (SUCCEEDED(hr))
    {
        // Start playback
        hr = pControl->Run();
        // ... handle playback as needed
    }

cleanup:
    // Release interfaces
    if (pControl) pControl->Release();
    if (pFileSource) pFileSource->Release();
    if (pFFMPEGSource) pFFMPEGSource->Release();
    if (pGraph) pGraph->Release();
    return hr;
}

// Helper function to get pins from a filter
IPin* GetPin(IBaseFilter* pFilter, PIN_DIRECTION PinDir, int nPin)
{
    IEnumPins* pEnum = NULL;
    IPin* pPin = NULL;

    if (pFilter)
    {
        pFilter->EnumPins(&pEnum);
        if (pEnum)
        {
            while (pEnum->Next(1, &pPin, NULL) == S_OK)
            {
                PIN_DIRECTION PinDirThis;
                pPin->QueryDirection(&PinDirThis);
                if (PinDir == PinDirThis)
                {
                    if (nPin == 0) break;
                    nPin--;
                }
                pPin->Release();
                pPin = NULL;
            }
            pEnum->Release();
        }
    }
    return pPin;
}
```

## Integration with Processing Filters

Enhance your media pipeline by connecting the FFMPEG Source filter with additional processing components:

- Apply real-time video effects and transformations
- Process audio streams for custom sound manipulation
- Implement specialized media analysis features

Our [Processing Filters pack](https://www.visioforge.com/processing-filters-pack) offers additional capabilities, or you can integrate with any standard DirectShow-compatible filters.
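As a rough sketch of how such a processing filter slots into the graph, the following fragment extends the C++ example above. It reuses the `GetPin()` helper defined there; the function name `InsertProcessingFilter` and the idea of passing the processing filter's CLSID as a parameter are illustrative assumptions, not part of the SDK API. Substitute the CLSID of the actual processing filter you deploy.

```cpp
// Hypothetical sketch (not SDK API): insert a processing filter between the
// FFMPEG Source filter and the renderer. Reuses GetPin() from the example above.
HRESULT InsertProcessingFilter(IGraphBuilder* pGraph, IBaseFilter* pSource,
                               REFCLSID clsidProcessing)
{
    IBaseFilter* pProc = NULL;

    // Create the processing filter from the CLSID supplied by the caller
    HRESULT hr = CoCreateInstance(clsidProcessing, NULL, CLSCTX_INPROC_SERVER,
                                  IID_IBaseFilter, (void**)&pProc);
    if (FAILED(hr)) return hr;

    // Add the processing filter to the same graph as the source
    hr = pGraph->AddFilter(pProc, L"Processing");

    if (SUCCEEDED(hr))
    {
        // Connect source output -> processing input; the graph manager adds
        // intermediate transform filters automatically if the formats differ
        IPin* pOut = GetPin(pSource, PINDIR_OUTPUT, 0);
        IPin* pIn  = GetPin(pProc, PINDIR_INPUT, 0);
        hr = pGraph->Connect(pOut, pIn);
        if (pOut) pOut->Release();
        if (pIn)  pIn->Release();
    }

    if (SUCCEEDED(hr))
    {
        // Render from the processing filter's output instead of the source
        IPin* pProcOut = GetPin(pProc, PINDIR_OUTPUT, 0);
        hr = pGraph->Render(pProcOut);
        if (pProcOut) pProcOut->Release();
    }

    pProc->Release();
    return hr;
}
```

In the example above, this would be called after `IFileSourceFilter::Load()` succeeds, in place of the direct `pGraph->Render(...)` call on the source's output pin.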
## Technical Specifications ### Supported DirectShow Interfaces The filter implements these standard DirectShow interfaces for maximum compatibility: - **IAMStreamSelect**: Select between multiple video and audio streams - **IAMStreamConfig**: Configure video and audio settings - **IFileSourceFilter**: Set filename or streaming URL - **IMediaSeeking**: Implement precise seeking functionality - **ISpecifyPropertyPages**: Access configuration through property pages ## Version History and Updates ### Version 15.0 - Enhanced FFMPEG libraries with latest codecs - Added GPU decoding support for improved performance - Optimized memory management for large files ### Version 12.0 - Updated FFMPEG libraries - Improved compatibility with Windows 10/11 ### Version 11.0 - Updated FFMPEG libraries - Fixed seeking issues with certain file formats ### Version 10.0 - Updated FFMPEG libraries - Added support for additional container formats ### Version 9.0 - Updated FFMPEG libraries - Performance optimizations ### Version 8.0 - Updated FFMPEG libraries - Improved error handling ### Version 7.0 - Initial release as an independent product - Core functionality established ## Additional Resources - Explore our [product page](https://www.visioforge.com/ffmpeg-source-directshow-filter) for detailed specifications - View our [End User License Agreement](../../eula.md) for licensing details - Check our developer documentation for advanced implementation scenarios ---END OF PAGE--- # Local File: .\directshow\filters-enc\index.md --- title: DirectShow Encoding Filters Pack for Developers description: Advanced DirectShow encoding filters for professional media application development. Integrate high-performance audio and video encoding capabilities with GPU acceleration support for multiple formats including MP4, HEVC, MKV, WebM and more. 
sidebar_label: DirectShow Encoding Filters Pack --- # DirectShow Encoding Filters Pack ## Introduction The DirectShow Encoding Filters Pack provides a powerful set of media encoding components designed specifically for software developers building professional multimedia applications. This toolkit enables seamless integration of high-performance encoding capabilities for both audio and video streams across a wide variety of popular formats. ## Key Features ### Multi-Format Encoding Support The filters pack supports numerous industry-standard formats, including: - **MP4 container** with H264, HEVC, and AAC codecs - **MPEG-TS** streams - **MKV** (Matroska) containers - **WebM** format with VP8/VP9 video codecs - Multiple audio formats including **Vorbis**, **MP3**, **FLAC**, and **Opus** ### Hardware Acceleration Developers can leverage GPU acceleration for improved encoding performance: - **Intel** QuickSync technology - **AMD/ATI** hardware acceleration - **Nvidia** NVENC encoding support This hardware optimization dramatically improves encoding speeds while reducing CPU load in your applications. ### Flexible Implementation Options The pack includes: - Standalone H264/AAC encoders utilizing CPU resources - Specialized muxer components with integrated video and audio encoders - Options for both CPU and GPU-based encoding paths ## Technical Capabilities The filter components integrate seamlessly into DirectShow application pipelines, providing developers with: - High-quality video encoding at various bitrates and resolutions - Efficient audio compression with configurable quality settings - Advanced container format support with customizable parameters - DirectShow filter graph compatibility for straightforward implementation For detailed specifications and a comprehensive list of all supported video/audio encoders and output formats, please visit the [product page](https://www.visioforge.com/encoding-filters-pack). 
## Version History ### 11.4 Release - Updated filter components to match current .Net SDK implementations - Enhanced AMD AMF H264/H265 encoders with latest optimizations - Improved Intel QuickSync H265 encoders for better performance - Refreshed sample applications with new coding examples ### 11.0 Release - Synchronized filters with current .Net SDK versions - Upgraded Nvidia NVENC H264/H265 encoders for better quality - Introduced new SSF muxer filter component ### 10.0 Release - Updated all filters to align with .Net SDK implementations - Enhanced Media Foundation encoders (H264, H265, AAC) - Added dedicated NVENC video encoder filter as CUDA encoder replacement ### 9.0 Release - Optimized MP4 container with H264/AAC output - Expanded WebM format support with VP9 encoding capabilities - Improved H265 encoder filter performance - Enhanced Intel QuickSync H264 encoders ### 8.6 Release - Implemented RTSP sink filter for streaming applications - Added RTMP sink filter in BETA status - Upgraded AAC encoder filter with quality improvements ### 8.5 Initial Release - First public release including filters from .Net SDKs - Core components: AAC encoder, H264 encoders (CPU/GPU) - Additional encoders: H265 (CPU/GPU), VP8, Vorbis - Container support: MP4 muxer, WebM muxer - Streaming capabilities: RTSP source, RTMP source ---END OF PAGE--- # Local File: .\directshow\proc-filters\index.md --- title: DirectShow Processing Filters for Media Applications description: Advanced DirectShow filters for professional media manipulation in Windows applications. Transform your video and audio content with high-performance effects, mixing, overlays, and specialized processing tools for developers. 
sidebar_label: Processing Filters Pack --- # DirectShow Processing Filters for Media Applications ## Introduction to DirectShow Processing Filters The DirectShow Processing Filters Pack delivers a powerful collection of specialized filters built for sophisticated audio and video manipulation in Windows applications. These filters enable developers to implement professional-grade media processing capabilities without developing complex algorithms from scratch. Designed for developers seeking to enhance their applications with advanced media functionality, this toolkit offers a streamlined approach to implementing robust audio-visual features with minimal code overhead. ## Key Capabilities and Benefits ### Video Processing Capabilities #### Advanced Visual Effects - **Dynamic Effects Processing**: Apply real-time effects to video streams including blur, sharpen, sepia, grayscale, and numerous artistic filters - **Custom Effect Chaining**: Combine multiple effects sequentially for complex visual transformations - **Adjustable Parameters**: Fine-tune effect intensity and characteristics for precise control #### Professional Video Mixing - **Multi-Source Blending**: Seamlessly combine multiple video streams into a unified output - **Transition Effects**: Implement smooth transitions between video sources - **Picture-in-Picture**: Create overlay configurations with customizable positioning and scaling #### Image and Text Overlay System - **Dynamic Text Rendering**: Overlay customizable text with font control and animation - **Image Integration**: Add logos, watermarks, and informational graphics to video content - **Alpha Channel Support**: Maintain transparency information for professional compositing #### High-Quality Resize Functionality - **Multiple Algorithms**: Choose from nearest neighbor, bilinear, bicubic, and Lanczos scaling - **Aspect Ratio Control**: Maintain or adjust aspect ratios as needed - **Resolution Optimization**: Scale content for specific output 
requirements while preserving quality #### Video Manipulation Tools - **Rotation and Cropping**: Adjust video orientation and framing with precise control - **Deinterlacing Options**: Multiple modes available for converting interlaced content - **Noise Reduction**: Advanced algorithms for improving video clarity and quality ### Audio Processing Capabilities #### Audio Enhancement Suite - **Effect Processing**: Apply various audio effects for sound enhancement and creative manipulation - **Channel Management**: Control stereo imaging and multi-channel configurations #### Advanced Audio Controls - **Volume Optimization**: Precise volume adjustment with normalization options - **Balance Adjustment**: Fine-tune left/right channel balance for optimal sound distribution - **Pitch Modification**: Alter pitch while maintaining or changing tempo - **Delay Implementation**: Add customizable delay effects with feedback control #### Professional Sound Effects - **Echo Generation**: Create spatial echo effects with adjustable parameters - **Equalizer System**: Multi-band equalization for frequency adjustment - **Chorus Effects**: Add richness and depth to audio streams - **Flanger Processing**: Create sweeping, psychedelic audio effects ## System Requirements ### Compatible Operating Systems - Windows 11, 10, 8.1, 8, and 7 (both 32-bit and 64-bit versions) ### Development Environment Support - **Microsoft Visual Studio**: Versions 2022, 2019, 2017, 2015, 2013, 2012, and 2010 - **Embarcadero Tools**: Compatible with Delphi and C++ Builder - **Additional Environments**: Works with any development platform supporting DirectShow filters ### Technical Prerequisites - DirectX 9 or later installation - Minimum 4GB RAM (8GB+ recommended for high-resolution processing) - Multi-core processor recommended for optimal performance ## Additional Resources - [Complete Product Information](https://www.visioforge.com/processing-filters-pack) - [API 
Documentation](https://api.visioforge.com/proc_filters/api/index.html) - [Licensing Information](../../eula.md) ## Version History and Updates ### Version 15.1 Enhancements - Integration with .Net SDKs 15.1 architecture - Significant improvements to audio and video mixing engines - Enhanced multithreading support for better performance on multi-core systems - Expanded video effects library with new processing options - Resolution of audio click artifacts in mixer component - Optimized support for ultra-high-definition 4K and 8K content processing ### Version 15.0 Improvements - Full alignment with .Net SDKs 15.0 framework - Optimized high-resolution processing for brightness, contrast, saturation, and hue filters ### Version 14.0 Updates - Complete compatibility with .Net SDKs 14.0 - Performance optimization for video resize operations - Enhanced bicubic video resize algorithm for superior quality ### Version 12.0 Refinements - Integration with .Net SDKs 12.0 infrastructure - Redesigned audio mixer with improved performance - Fixed stability issues when using crop or resize with incorrect parameters ### Version 11.0 Features - Updated to match .Net SDKs 11.0 specifications - Improved audio tempo and pitch manipulation algorithms - Optimized video balance performance for smoother processing ### Version 10.0 Developments - Alignment with .Net SDKs 10.0 architecture - Completely revamped Video Mixer component ### Version 9.0 Advancements - Integration with .Net SDKs 9.2 framework - Enhanced video effects library - Specific optimizations for 4K content processing ### Version 8.5 Initial Release - First public release, featuring filters from .Net SDKs 8.5 - Introduction of Lanczos support in video resize filter for superior quality scaling ---END OF PAGE--- # Local File: .\directshow\video-encryption-sdk\index.md --- title: Advanced Video Encryption SDK for Developers description: Integrate powerful video encryption capabilities into your DirectShow applications. 
Securely encrypt video files or streams with AES-256, support H264/AAC formats, and leverage GPU acceleration for optimal performance. Complete developer toolkit with code samples. sidebar_label: Video Encryption SDK --- # Video Encryption SDK ## Introduction to Video Encryption The [Video Encryption SDK](https://www.visioforge.com/video-encryption-sdk) provides robust tools for encoding video files into MP4 H264/AAC format with advanced encryption capabilities. Developers can secure their media content using custom passwords or binary data encryption methods. The SDK integrates seamlessly with any DirectShow application through a complete set of filters. These filters come with extensive interfaces allowing developers to fine-tune settings according to specific security requirements and implementation needs. ## Integration Flexibility You can implement the SDK in various DirectShow applications as filters for both encryption and decryption processes. The system works effectively with: - Live video sources - File-based video sources - Software video encoders - GPU-accelerated video encoders from the [DirectShow Encoding Filters pack](https://www.visioforge.com/encoding-filters-pack) (available separately) - Third-party DirectShow filters for additional video encoding options ## Key Features and Capabilities ### Core Functionality - **Secure Encryption/Decryption**: Process video files or capture streams with robust security algorithms - **Format Support**: Full H264 encoder support for video content - **Audio Handling**: Complete AAC encoder support for audio streams - **Flexible Security Options**: Implement encryption using either binary data or string passwords ### Performance Optimization - AES-256 encryption engine for maximum security - CPU hardware acceleration support - GPU acceleration compatibility - Optimized for high-speed encryption processes ## Development Resources ### Code Samples and Documentation The SDK includes comprehensive code samples for 
multiple programming languages: - C# implementation examples - C++ reference code - Delphi sample projects These samples provide practical implementation guidance for developers building secure video applications. ### Demo Application Explore the included Video Encryptor application for a hands-on demonstration of the SDK's capabilities in a working environment. ## Licensing Information - [End User License Agreement](../../eula.md) ## Version History ### Version 11.4 - Full compatibility with VisioForge .Net SDKs 11.4 - Enhanced Nvidia NVENC support for H264 and H265 video encoders - Improved Intel QuickSync support for H264 video encoder - Added NV12 colorspace support for enhanced performance ### Version 11.0 - Complete compatibility with VisioForge .Net SDKs 11.0 - Enhanced GPU encoders support - Upgraded AAC encoder functionality ### Version 10.0 - Full compatibility with VisioForge .Net SDKs 10.0 - Enhanced compatibility with H264 and H265 video formats - Integrated AMD AMF acceleration support - Added Intel QuickSync technology support ### Version 9.0 - Significantly improved encryption processing speed - Added CPU hardware acceleration capabilities - Implemented new engine based on AES-256 encryption - Added file usage as a key (with binary array support) - Integrated NVENC support for GPU acceleration - Enhanced AAC HE encoder support ### Version 8.0 - Updated video and audio encoders - Improved filter encryption performance ### Version 7.0 - Initial release as a standalone product - Previously integrated within Video Capture SDK, Video Edit SDK, and Media Player SDK - Compatible with any DirectShow application without requiring additional VisioForge SDKs ---END OF PAGE--- # Local File: .\directshow\virtual-camera-sdk\index.md --- title: DirectShow Virtual Camera SDK for Video Streaming description: Learn how to implement professional virtual camera functionality in your applications with our powerful DirectShow-based SDK. 
Stream video from any source to virtual camera devices for use in Skype, Zoom, Teams, and web browsers with full audio support. sidebar_label: Virtual Camera SDK --- # DirectShow Virtual Camera SDK ## Overview Our robust DirectShow-based Virtual Camera SDK enables developers to implement powerful virtual camera functionality in their applications. The SDK provides sink filters that can be utilized as output in Video Capture SDK or Video Edit SDK environments, while the source filters can be employed as video sources for various capture applications. With this versatile toolkit, you can stream video content from virtually any source directly to a virtual camera device. These virtual devices are fully compatible with popular communication platforms such as `Skype`, `Zoom`, `Microsoft Teams`, web browsers, and numerous other applications that support DirectShow virtual cameras. The SDK also includes comprehensive audio streaming capabilities for complete multimedia solutions. To help you get started quickly, the SDK package includes a fully-functional sample application that demonstrates how to stream video content from files to virtual camera devices. Download the SDK from our [product page](https://www.visioforge.com/virtual-camera-sdk) to start integrating virtual camera functionality into your applications today. 
## Key Features and Capabilities * **Multiple Source Support**: Stream video to virtual camera from files, network streams, or capture devices * **Architecture Compatibility**: Full x86/x64 architecture support * **High-Resolution Support**: Stream video content up to 4K resolution * **Customization Options**: Define and implement custom camera names * **SDK Integration**: Seamless integration with other development tools * **Audio Support**: Complete audio streaming capabilities * **Professional Applications**: Perfect for teleconferencing, streaming, and professional video applications ## Technical Implementation ### Sample DirectShow Graph Architecture The diagram below illustrates the standard DirectShow graph implementation when using the Virtual Camera SDK: ![Sample DirectShow graph](demo.png) ### License Registration via Registry You can register the filter with your valid license key using the Windows registry system. Configure licensing using the following registry key: ```reg HKEY_LOCAL_MACHINE\SOFTWARE\VisioForge\Virtual Camera SDK\License ``` Set your purchased license key as a string value in this registry location. ### Deployment Guidelines For proper deployment, copy and COM-register the SDK DirectShow filters - these are the files in the `Redist` folder with .ax extension. Registration can be performed using `regsvr32.exe` or through COM registration in your application installer. Please note that administrative privileges are required for successful registration. ### No-Signal Application Configuration You can configure an application to run automatically when the virtual camera is not connected to any video source. Configure the no-signal application using this registry key: ```reg HKEY_LOCAL_MACHINE\SOFTWARE\VisioForge\Virtual Camera SDK\StartupEXE ``` Set the executable file name as a string value. 
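The license and no-signal application settings above can be combined into a single `.reg` file for unattended setup. This is a sketch, assuming each documented path is a registry key whose default string value holds the setting; the values shown are placeholders, so replace them with your actual license key and executable path.

```reg
Windows Registry Editor Version 5.00

; Placeholder values - substitute your purchased license key
; and the full path of your no-signal application.
[HKEY_LOCAL_MACHINE\SOFTWARE\VisioForge\Virtual Camera SDK\License]
@="YOUR-LICENSE-KEY"

[HKEY_LOCAL_MACHINE\SOFTWARE\VisioForge\Virtual Camera SDK\StartupEXE]
@="C:\\MyApp\\NoSignalApp.exe"
```

Importing such a file from an installer requires administrative privileges, since the keys live under `HKEY_LOCAL_MACHINE`.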
### No-Signal Image Configuration Instead of displaying a black screen when no video source is available, you can configure a custom image to be shown. Configure the no-signal image using this registry key: ```reg HKEY_LOCAL_MACHINE\SOFTWARE\VisioForge\Virtual Camera SDK\BackgroundImage ``` Set the image file path as a string value. ## Third-Party Libraries and Integration The Virtual Camera SDK contains third-party components that are used in the demo applications. These components are not required for the core SDK functionality. The Delphi and .NET demonstration applications utilize third-party libraries to simplify DirectShow development. The C++ demo applications are built without external dependencies. ### .NET Integration .NET applications leverage [DirectShowLib.Net (LGPL)](http://directshownet.sourceforge.net) to implement DirectShow functionality in managed code environments. Developers can create console applications, WinForms, or WPF applications using .NET. The included demo applications utilize WinForms for the user interface. ### Delphi Integration Delphi applications use [DSPack (MPL)](http://code.google.com/p/dspack/) to implement DirectShow functionality. While modern Delphi versions include built-in DirectShow support, DSPack is utilized in the demo applications to maintain compatibility with older Delphi versions. ### C++ Integration The C++ demo applications do not require third-party libraries and are built using the standard DirectShow SDK. The DirectShow SDK can be obtained from the [Microsoft website](https://www.microsoft.com/en-us/download/details.aspx?id=8279). Developers can utilize MFC, ATL, or other C++ frameworks to build their applications. The included demo applications are built with MFC. 
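The COM-registration step from the deployment guidelines above can be scripted in an installer. A minimal batch sketch, assuming the `.ax` filters from the `Redist` folder have been copied to `C:\MyApp\Redist` (a placeholder path); it must run from an elevated prompt.

```bat
@echo off
rem Hypothetical installer step: silently COM-register each SDK filter (.ax).
rem Requires administrative privileges; the path below is a placeholder.
for %%f in ("C:\MyApp\Redist\*.ax") do regsvr32 /s "%%f"
```

The `/s` switch suppresses the regsvr32 result dialogs, which is usually what you want during automated installation.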
## System Requirements The SDK is compatible with the following Microsoft Windows operating systems: * Windows 7, 8, 8.1, 10, and 11 * Windows Server 2008, 2012, 2016, 2019, and 2022 ## Version History and Updates ### Version 14.0 * Performance optimizations and enhancements * Improved Windows 11 compatibility * Enhanced support for modern web browsers * Minor updates and bug fixes ### Version 12.0 * Windows 10 support improvements * Performance enhancements * 8K resolution support added * Improved Mozilla Firefox and Microsoft Edge compatibility * Various minor updates ### Version 11.0 * Critical bug fixes implemented * Updated Google Chrome compatibility * Resolved audio clicks issues in various web browsers and applications ### Version 10.0 * High frame rate support added * Significant performance improvements * Minor updates and bug fixes ### Version 9.0 * 4K video resolution support added * Updated support for contemporary web browsers * Various minor updates and improvements ### Version 8.0 * Added background image functionality for no-signal scenarios * Implemented application auto-run for no-signal conditions * Enhanced Skype compatibility ### Version 7.1 * Audio streaming support via virtual audio output and virtual microphone input * PCM audio format support with customizable sample rates and channel configuration * Bug fixes and performance improvements * Additional video resolutions added ### Version 7.0 * Initial release as a standalone product * Previously included in Video Edit SDK and Video Capture SDK * Compatible with any DirectShow application ## Additional Resources * [End User License Agreement](../../eula.md) ---END OF PAGE--- # Local File: .\directshow\vlc-source-filter\index.md --- title: VLC Source DirectShow Filter for Media Playback description: Integrate powerful VLC media capabilities into DirectShow applications with our robust filter component. 
Enable playback of diverse video files and network streams with hardware acceleration, 4K support, and advanced seeking capabilities. sidebar_label: VLC Source DirectShow Filter --- # VLC Source DirectShow Filter ## Overview The VLC Source DirectShow filter empowers developers to seamlessly integrate advanced media playback capabilities into any DirectShow-based application. This powerful component enables smooth playback of various video files and network streams across multiple formats and protocols. Our SDK package delivers a complete solution with all necessary VLC player DLLs bundled alongside a flexible DirectShow filter. The package provides both standard file-selection interfaces and extensive options for custom filter configurations to match your specific development requirements. For complete product details and licensing options, visit the [product page](https://www.visioforge.com/vlc-source-directshow-filter). ## Technical Specifications ### Supported DirectShow Interfaces The filter implements these standard DirectShow interfaces for maximum compatibility: - **IAMStreamSelect** - Comprehensive video and audio stream selection capabilities - **IAMStreamConfig** - Advanced video and audio configuration settings - **IFileSourceFilter** - Flexible specification of filename or URL sources - **IMediaSeeking** - Robust timeline seeking and positioning support ### Key Features - Hardware-accelerated decoding for optimal performance - Support for 4K and 8K video playback - Extensive format compatibility including modern codecs - Network stream handling (RTSP, HLS, DASH, etc.) 
- Subtitle rendering and management - Multi-language audio track support - 360° video playback capabilities - HDR content support ## Version History ### Version 15.0 - Enhanced playback quality across numerous formats - Improved subtitle rendering engine - Updated codec implementations including dav1d, ffmpeg, and libvpx - Added Super Resolution scaling with nVidia and Intel GPU acceleration ### Version 14.0 - Updated to VLC v3.0.18 core - Fixed DxVA/D3D11 compatibility issues with HEVC content - Resolved OpenGL resizing problems for smoother playback ### Version 12.0 - Upgraded to VLC v3.0.16 engine - Added support for new Fourcc formats (E-AC3 and AV1) - Fixed stability issues with VP9 streams ### Version 11.1 - Incorporated VLC v3.0.11 - Optimized HLS playlist update mechanism - Enhanced WebVTT subtitle handling and display ### Version 11.0 - Built on VLC v3.0.10 foundation - Fixed critical regression issues with HLS streams ### Version 10.4 - Major update to VLC 3.0 architecture - Enabled hardware decoding by default for 4K and 8K content - Added 10-bit color depth and HDR support - Implemented 360-degree video and 3D audio capabilities - Introduced Blu-Ray Java menu support ### Version 10.0 - Initial release as a standalone DirectShow filter - For earlier version history, please refer to Video Capture SDK .Net changelog ## Additional Resources - [End User License Agreement](../../eula.md) - [Code Samples](https://github.com/visioforge/) ---END OF PAGE--- # Local File: .\dotnet\changelog.md --- title: .Net SDKs Updates and Release History description: Detailed changelog for .Net video processing SDKs, including Video Capture, Media Player, Video Edit and Media Blocks. Track latest features, improvements, and fixes across versions. Essential reference for developers implementing video solutions. sidebar_label: Changelog hide_table_of_contents: true --- # Changelog Changes and updates for all .Net SDKs. 
## 2025.5.1 * [ALL] Updated NuGet dependency packages to the latest versions * [X-engines] Resolved an issue with RTMP network streaming to a custom server ## 2025.4.8 * [ALL] Added an Absolute Move API to the `ONVIFDeviceX` class. You can use this API to move the ONVIF camera to a specified absolute position. ## 2025.2.24 * [X-engine] By default, Media Foundation device enumeration is disabled. You can enable it using the `DeviceEnumerator.Shared.IsEnumerateMediaFoundationDevices` property. ## 2025.2.18 * [Media Player SDK .Net] Added loop support for the cross-platform engine. * [ALL] Updated RTSP-X engine output; fixed a crash with RTSP output during frequent VLC player reconnects * [X-engines] Changed face detector support to use the `IFaceDetector` interface * [Live Video Compositor] Fixed registration issues with a custom video view attached to a video input ## 2025.2.9 * [X-engines] Improved NDI connection speed ## 2025.2.4 * [X-engines] Added the RTSP Server Media Block and `RTSPServerOutput` to Video Capture SDK. You can use the `RTSPServerBlock` to create an RTSP server and stream video and audio to it. ## 2025.2.1 * [X-engines] Added NVENC and AMF AV1 encoder support ## 2025.1.25 * [Windows] Resolved an HTTPS issue with SSL certificates failing to load ## 2025.1.22 * [Windows] Resolved an issue with missing ONVIF sources when enumerating on a PC with multiple network interfaces * [Media Blocks SDK .Net] Added the `OnEOS` event to the `MediaBlockPad` class. You can use this event to get the EOS (End of Stream) event from a media block. It is useful if you have several file sources with different durations and need to stop the pipeline when the first source ends. * [Media Blocks SDK .Net] Added the `SendEOS` method to the `MediaBlocksPipeline` class. You can use this method to send the EOS (End of Stream) event to the pipeline.
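A minimal sketch of how the `OnEOS` event from the 2025.1.22 entry might be wired up to stop a pipeline when the shortest of several file sources finishes. The event signature, the output-pad access, and the `StopAsync` call are assumptions based on the description here, not the documented API; verify the exact member names against the SDK reference before use.

```csharp
// Hypothetical sketch only: subscribe to EOS on each file source's output pad
// and stop the whole pipeline as soon as the first source finishes.
// sourceBlockA, sourceBlockB, and pipeline are assumed to exist already.
foreach (var fileSource in new[] { sourceBlockA, sourceBlockB })
{
    fileSource.Output.OnEOS += async (sender, args) =>
    {
        // Assumed API shape: stop the pipeline on the first EOS received.
        await pipeline.StopAsync();
    };
}
```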
## 2025.1.18 * [NuGet] `VisioForge.Core.UI.Apple`, `VisioForge.Core.UI.Android`, and `VisioForge.Core.UI.WinUI` packages are merged into the `VisioForge.DotNet.Core` package. All namespaces are the same. * [Media Blocks SDK .Net] Added the `ZOrder` property to the `LVCVideoInput` and `LVCVideoAudioInput` classes. You can use this property to set the Z-order for the video input. ## 2025.1.14 * [NuGet] `VisioForge.Core.UI.WPF` and `VisioForge.Core.UI.WinForms` packages are merged into the `VisioForge.DotNet.Core` package. In WPF projects, you have to update the XAML code if the assembly names are used. All namespaces are the same. ## 2025.1.11 * [Video Capture SDK .Net] Resolved a QSV H264 FFMPEG encoder issue with incorrect characters in parameters ## 2025.1.7 * [Cross-platform] Added `libcamera` source support for Linux/Raspberry Pi. ## 2025.1.5 * [Cross-platform] Improved previous-frame playback in Media Player SDK .Net (cross-platform engine) ## 2025.1.4 * [Cross-platform] Resolved an issue with AMD AMF plugin initialization ## 2025.1.1 * [Cross-platform] Resolved a memory leak in `OverlayManagerImage` ## 2025.1.0 * [Cross-platform] Updated the Live Video Compositor engine. Improved Decklink support for input and output. Improved performance. The new engine classes are located in the `VisioForge.Core.LiveVideoCompositorV2` namespace. ## 2025.0.29 * [Cross-platform] The default video renderer on Windows has been changed to DirectX 11 ## 2025.0.17 * [Media Blocks SDK .Net] Added libCamera source support (can be used on Raspberry Pi) ## 2025.0.16 * [Media Blocks SDK .Net] Resolved an issue with adding several `AudioRendererBlock` instances to the pipeline ## 2025.0.14 * [Media Blocks SDK .Net] Added the `PushJPEGSourceSettings` class to configure the JPEG source for the `PushSourceBlock`. Also added the `video-from-images` sample.
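For the 2025.1.14 package merge noted above, a WPF project that maps the UI namespace by assembly name in XAML would change only the `assembly` part of the mapping. The fragment below is a hypothetical example: the `clr-namespace` and assembly names are assumptions derived from the package names in that entry, and the merged assembly name in your project may differ.

```xml
<!-- Hypothetical xmlns mapping update after the 2025.1.14 package merge. -->
<!-- Before: assembly from the retired UI package -->
xmlns:vf="clr-namespace:VisioForge.Core.UI.WPF;assembly=VisioForge.Core.UI.WPF"
<!-- After: same clr-namespace, assembly from the merged core package -->
xmlns:vf="clr-namespace:VisioForge.Core.UI.WPF;assembly=VisioForge.DotNet.Core"
```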
## 2025.0.7 * [ALL] Resolved window capture issues in cross-platform SDKs * [Media Blocks SDK .Net] Added the Bridge Source Switch sample ## 2025.0.5 * [iOS] Resolved issues with playback speed for some video files * [iOS] Added iOS Simulator support for all SDKs. The camera source is not supported in the simulator. ## 2025.0.3 * [macOS] Resolved a wrong-stride issue for vertical camera videos on macOS * [Video Capture SDK .Net] Resolved a background color issue for the scrolling text overlay ## 2025.0 * [ALL] .Net 9 support * [Media Blocks SDK .Net] Added `AVIOutputBlock` to save video and audio streams to the AVI file format * [Media Blocks SDK .Net] The `TeeBlock` constructor now accepts the media type as a parameter * [Video Capture SDK .Net] Added the `Video_CaptureDevice_SetDefault` and `Audio_CaptureDevice_SetDefault` methods to the `VideoCaptureCore` class. You can use these methods to set the default video and audio capture devices * [Cross-platform] Improved `Metal` video rendering performance on Apple devices * [ALL] Improved performance of common video processing operations in Windows classic SDKs * [CV] Added DNN face detectors for Media Blocks SDK .Net and Video Capture SDK .Net * [Mobile] Improved AOT compatibility for iOS and Android * [WinUI] Improved `WinUI` video rendering performance * [Media Blocks SDK .Net] Added the `GetLastFrameAsSKBitmap` and `GetLastFrameAsBitmap` methods to `VideoSampleGrabberBlock` to get the last frame as a `SkiaSharp.SKBitmap` or `System.Drawing.Bitmap` * [Video Capture SDK .Net] `VideoCaptureCore`: Added the `AddFakeAudioSource` property to `FFMPEGEXEOutput`. The `Network_Streaming_Audio_Enabled` property of `VideoCaptureCore` should be set to false to use this fake audio.
* [ALL] Improved WinUI (and MAUI on Windows) VideoView performance * [Video Capture SDK .Net] `VideoCaptureCore`: Added the `PIP_Video_CaptureDevice_CameraControl_` API to control the camera settings for Picture-in-Picture mode * [X-engines] Added header support for HTTP sources created using the `HTTPSourceSettings` class * [X-engines] Updated Avalonia samples, with projects for macOS, Linux, and Windows * [X-engines] Added NuGet redist packages for macOS and MacCatalyst (including MAUI) * [Video Capture SDK .Net] `VideoCaptureCore`: Added device path support for the `PIP_Video_CaptureDevice_CameraControl` API * [Video Capture SDK .Net] `VideoCaptureCore`: Added the `FFMPEG_MaxLoadTimeout` property for IP camera sources. It allows you to set the maximum time to wait for the FFMPEG source to load the stream * [X-engines] Updated Linux support for `ALSA`, `PulseAudio`, and `PipeWire` audio devices * [X-engines] Updated Linux support for `V4L2` devices * [X-engines] Avalonia samples have been changed to a modern single-project structure * [X-engines] Resolved an issue with `MAUI` crashes on Windows after a `SkiaSharp` update * [X-engines] Resolved an issue with `TextureView` crashes on Android in `MAUI` applications * [X-engines] Resolved a playback issue for HTTP sources using the `UniversalSourceBlock` * [X-engines] Added a Mobile Streamer sample for Android * [X-engines] Added `OverlayManagerBlock` support for Android (now available for all platforms) * [Video Capture SDK .Net] `VideoCaptureCoreX`: Added `CustomVideoProcessor`/`CustomAudioProcessor` properties for all output formats. You can use these properties to set custom video/audio processing blocks for the output format. * [Media Blocks SDK .Net] Added the `KeyFrameDetectorBlock` to detect key frames in video streams (H264, H265, VP8, VP9, AV1, etc.)
* [Media Blocks SDK .Net] Fixed a licensing issue for the `LiveVideoCompositor` class ## 15.10.0 * [Windows] Updated the window capture API to capture only the specified parent window by default. Added the `UpdateHotkey` method to the `WindowCaptureForm` class to update the hotkey for the window capture form. * [X-engines] Better AOT compatibility for default MAUI settings on iOS. * [Media Blocks SDK .Net] Added the `DNNFaceDetectorBlock` to detect faces and blur/pixelate them using OpenCV and DNN models. * [Media Blocks SDK .Net] Added the `MKVOutputBlock` to save video and audio streams to the MKV file format. * [X-engines] Better support for dynamic video source size changes in MAUI applications. * [X-engines] Resolved an issue with two or more VU meters in the same pipeline. * [X-engines] Resolved a volume/mute error issue with the audio mixer in the Live Video Compositor engine. * [X-engines] The `Spinnaker` source for `FLIR`/`Teledyne` cameras is included in the main package and no longer requires an additional plugin. * [Video Capture SDK .Net] Resolved an issue with the `SeparateCapture` API if no `VideoView` was used. * [X-engines] The `MediaBlocksPipeline` constructor no longer has the `live` parameter. For more customizable pipelines, video and audio renderers got the `IsSync` property (`true` by default). * [X-engines] Resolved a `VideoViewTX` crash in MAUI Android applications. * [X-engines] The `IVideoEncoder` interface was added to the `MPEG2VideoEncoder` class. It allows the use of `MPEG2VideoEncoder` with `MPEGTSOutput`, `AVIOutput`, and other output classes. * [X-engines] Resolved an issue with window capture using the `ScreenCaptureD3D11SourceSettings` class. If the rectangle was incorrect or not specified, it caused an error. * [X-engines] The `Metal` renderer was added to the SDK for Apple devices and is used by default for iOS and MAUI. * [Media Blocks SDK .Net] Added the MAUI Screen Capture sample.
* [Video Capture SDK .Net] VideoCaptureCore: Added the `VLC_CustomDefaultFrameRate` property to `IPCameraSourceSettings` to set a custom frame rate for the VLC IP camera source if the source does not provide the correct frame rate. * [Media Blocks SDK .Net] `RTSPSourceBlock`: If the RTSP source has audio but you've disabled the audio stream in `RTSPSourceSettings`, the SDK will automatically add a null renderer to prevent warnings. * [ALL] Resolved an issue with the `VideoFrameX.ToBitmap()` call (wrong color space) * [Windows] Updated KLV support in MPEG-TS output * [Windows] Resolved a `MediaPlayerCore` serialization issue * [ALL] The video renderer settings class no longer contains a background color. Use the VideoView background color property instead. * [X-engines] Updated GStreamer libraries * [X-engines] Resolved video rendering issues on Android and iOS * [X-engines] Fixed an iOS crash during `VideoViewGL` usage * [X-engines] Added a default AAC encoder for iOS * [X-engines] Updated the iOS camera source for high frame rate support * [Windows] Updated the VLC source - improved file loading speed * [Media Blocks SDK .Net] Added the `UniversalDemuxBlock`, which demuxes video and audio streams from files in MP4, MKV, AVI, MOV, TS, VOB, FLV, OGG, and WebM formats * [Windows] Resolved FFMPEG stability issues * [X-engines] Resolved an issue with the loopback audio source when using `VideoCaptureCoreX` and audio capture to file * [X-engines] Added SRT source and sink support in Media Blocks SDK .Net and Video Capture SDK .Net * [Video Capture SDK .Net] VideoCaptureCore: The `IP_Camera_ONVIF_ListSourcesAsyncEx` method got an overload version with a callback for a more responsive UI * [X-engines] RTSP source compatibility update * [X-engines] `Breaking API change`. Starting with this update, the SDK uses `IAudioRendererSettings` interface implementations for audio output configuration. WASAPI output got custom configuration classes.
The `Output_AudioDevice` properties of `VideoCaptureCoreX`/`MediaPlayerCoreX` have been changed to the `IAudioRendererSettings` type. You can create an `AudioRendererSettings` instance from `AudioOutputDeviceInfo` using the default constructor. * [X-engines] Resolved a problem with missing Media Foundation sources during device enumeration * [X-engines] Resolved RTSP source problems with audio connections in some situations * [X-engines] Added the RTSP Preview Demo to Media Blocks SDK .Net * [Windows] FFMPEG outputs and source updated to FFMPEG v7.0. * [X-engines] Fixed rare crashes in the RTSP source when camera information is unavailable (for example, due to a network issue) * [X-engines] Resolved an issue with `WASAPI`/`WASAPI2` audio renderer usage * [X-engines] Resolved an issue with the audio loopback source on Windows * [X-engines] Improved iOS video rendering performance and stability * [X-engines] Added AWS S3 Sink output for Media Blocks SDK .Net * [X-engines] Added Allied Vision USB3/GigE camera support in Media Blocks SDK .Net and Video Capture SDK .Net ## 15.9 * [X-engines] Resolved a wrong aspect ratio issue with the video resize effect/block * [X-engines] Updated the GStreamer redist * [X-engines] Added Basler USB3/GigE camera support in Media Blocks SDK .Net and Video Capture SDK .Net * [Video Edit SDK .Net] VideoEditCoreX: The `TextOverlay` class was changed to use SkiaSharp-based font settings. Additionally, you can set a custom font file name or configure all rendering parameters using a custom `SKPaint`. * [Windows] Added Stream support in `MediaInfoReader`. You can get video/audio file information from a stream (DB, network, memory, etc.).
* [X-engines] Updated the Live Video Compositor engine, improving support for file sources * [Video Capture SDK .Net] Added a camera-covered detector to the `Computer Vision Demo` and the `VisioForge.Core.CV` package * [X-engines] Added an API to get snapshots from video files using `MediaInfoReaderX`: `GetFileSnapshotBitmap`, `GetFileSnapshotSKBitmap`, `GetFileSnapshotRGB` * [X-engines] iOS support in MAUI samples * [X-engines] Resolved a memory leak issue for RTSP sources * [Media Player SDK .Net] MediaPlayerCore: Added support for data streams in video files using the FFMPEG source engine. Use the `OnDataFrameBuffer` event to get data frames (KLV or other) from the video file. * [Video Capture SDK .Net] VideoCaptureCore: Added support for data streams in video files using the IP Capture FFMPEG source engine. Use the `OnDataFrameBuffer` event to get data frames (KLV or other) from an MPEG-TS UDP network stream or another supported source. * [Video Capture SDK .Net] VideoCaptureCore: Added the `FFMPEG_CustomOptions` property to the `IPCameraSourceSettings` class. This property allows you to set custom FFMPEG options for the IP camera source * [Windows] Fixed a hang problem with the FFMPEG source when the network connection is lost * [Media Blocks SDK .Net] Added the RTSP MultiView in Sync demo * [X-engines] Added support for FLIR/Teledyne cameras (USB3Vision/GigE) using the Spinnaker SDK * [Video Edit SDK .Net] VideoEditCoreX: Added support for .Net Stream usage as an input source * The `IAsyncDisposable` interface was added to all SDK core classes. The `DisposeAsync` call should be used to dispose of the core objects using async methods.
* [Video Capture SDK .Net] VideoCaptureCoreX: Resolved issues with Android video capture (capture sometimes started only once)
* [Media Blocks SDK .Net] Added an HLS streaming sample
* [Video Capture SDK .Net] VideoCaptureCore: Resolved a crash when `multiscreen` is enabled and screens are added as window handles (WinForms)
* [X-engines] Improved MAUI video rendering speed
* [X-engines] Resolved MAUI media playback (decoding) issues on Android
* [X-engines] Resolved an issue with H264 webcam sources (sometimes not connected)
* [X-engines] Resolved an issue with audio stream playback in the Live Video Compositor engine
* [Media Blocks SDK .Net] Resolved a bad audio issue while mixing using the Live Video Compositor engine
* [Media Blocks SDK .Net] Added Decklink output and a file source to the Live Video Compositor sample
* [Media Player SDK .Net] MediaPlayerCore: Added growing MPEG-TS file support for the VLC engine. You can play growing MPEG-TS files while they are being recorded

## 15.8

* [X-engines] [API breaking change] DeviceEnumerator can now be used only via the `DeviceEnumerator.Shared` property. One enumerator per app is required; the per-instance DeviceEnumerator objects used by the API have been removed
* [X-engines] [API breaking change] An Android Activity is no longer required to create SDK engines
* [X-engines] [API breaking change] X-engines require additional initialization and de-initialization steps. To initialize the SDK, use the `VisioForge.Core.VisioForgeX.InitSDK()` call. To de-initialize the SDK, use the `VisioForge.Core.VisioForgeX.DestroySDK()` call. You need to initialize the SDK before any SDK class usage and de-initialize it before the application exits.
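The initialization and de-initialization calls named above can be wired into an app entry point roughly like this. A minimal sketch: only the `VisioForge.Core.VisioForgeX.InitSDK()`/`DestroySDK()` names come from the changelog; the try/finally placement is a suggestion.

```csharp
using VisioForge.Core;

internal static class Program
{
    private static void Main()
    {
        // Initialize the X-engines once, before any SDK class is created.
        VisioForgeX.InitSDK();
        try
        {
            // ... create and use VideoCaptureCoreX / MediaPlayerCoreX /
            // Media Blocks pipelines here ...
        }
        finally
        {
            // De-initialize before the application exits.
            VisioForgeX.DestroySDK();
        }
    }
}
```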
* [Windows] Improved MAUI video rendering performance on Windows
* [Windows] Added a mouse highlight for screen capture sources
* [Windows] Resolved a CallbackOnCollectedDelegate issue with the BasicWindow class
* [Avalonia] Resolved an issue with Avalonia VideoView resize
* [X-engines] Added the StartPosition and StopPosition properties to UniversalSourceSettings. You can use these properties to set the start and stop positions for the file source.
* [ALL] Resolved an issue with passwords containing special characters used for RTSP sources
* [ALL] Resolved a rare video flip issue with the Virtual Camera SDK engine
* [ALL] The VisioForge MJPEG Decoder filter was removed from the SDK's NuGet packages. You can optionally add it to your project by file copying or COM registration deployment.
* [X-engines] Fixed a memory leak in the OverlayManager
* [Media Blocks SDK .Net] Resolved an issue with the VideoSampleGrabberBlock SetLastFrame option
* [Video Capture SDK .Net] VideoCaptureCoreX: WASAPI and WASAPI2 audio sources can now be used with the VideoCaptureCoreX engine
* [X-engines] DeviceEnumerator gained events to notify about devices being added or removed: OnVideoSourceAdded, OnVideoSourceRemoved, OnAudioSourceAdded, OnAudioSourceRemoved, OnAudioSinkAdded, OnAudioSinkRemoved
* [X-engines] Added custom error handler support for the MediaBlocks, VideoCaptureCoreX, and MediaPlayerCoreX engines. Use the IMediaBlocksPipelineCustomErrorHandler interface and the SetCustomErrorHandler method to set a custom error handler.
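A custom error handler attached via the hook above might look like the following. Only the `IMediaBlocksPipelineCustomErrorHandler` and `SetCustomErrorHandler` names come from the changelog; the interface member name `OnError` and the `MediaBlocksPipeline` constructor usage are assumptions.

```csharp
// Hypothetical handler that forwards pipeline errors to the console.
public sealed class ConsoleErrorHandler : IMediaBlocksPipelineCustomErrorHandler
{
    public void OnError(string message)
        => Console.WriteLine($"[pipeline error] {message}");
}

// Attach it to a pipeline (or a VideoCaptureCoreX/MediaPlayerCoreX engine):
var pipeline = new MediaBlocksPipeline();
pipeline.SetCustomErrorHandler(new ConsoleErrorHandler());
```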
* [Video Capture SDK .Net] VideoCaptureCoreX: Resolved an incorrect device index error for KS video sources (Windows)
* [Video Capture SDK .Net] VideoCaptureCore: Added the Virtual_Camera_Output_AlternativeAudioFilterName property to set a custom audio filter for the Virtual Camera SDK output
* [Video Edit SDK .Net] VideoEditCore: Added the Virtual_Camera_Output_AlternativeAudioFilterName property to set a custom audio filter for the Virtual Camera SDK output
* [Media Player SDK .Net] MediaPlayerCore: Added the Virtual_Camera_Output_AlternativeAudioFilterName property to set a custom audio filter for the Virtual Camera SDK output
* [Video Capture SDK .Net] VideoCaptureCoreX: Added NDI streaming support and a sample app.
* [Media Blocks SDK .Net] Added the BufferSink block to get video/audio frames from the pipeline
* [Media Blocks SDK .Net] Added the CustomMediaBlock class to create custom media blocks for any GStreamer element
* [Media Blocks SDK .Net] Added the UpdateChannel method to update the channel of a bridge source or sink
* [Media Player SDK .Net] MediaPlayerCore: Updated the Tempo effect.
* [X-engines] Updated the device enumerator. Removed an unwanted firewall dialog when listing NDI sources.
* [X-engines] Fixed an issue with the video mixer when adding/removing video sources.
* [Media Blocks SDK .Net] Added the VideoCropBlock and VideoAspectRatioCropBlock blocks to crop video frames.
* [Media Blocks SDK .Net] Resolved a wrong frame rate issue with VideoRateBlock.
* [All] Resolved an issue with the Tempo audio effect.
* [Video Capture SDK .Net] VideoCaptureCore: Added WASAPI audio renderer support for the VideoCaptureCore engine.

## 15.7

* [ALL] .Net 8 support
* [Video Capture SDK .Net] VideoCaptureCore: Fixed a problem with the OnNetworkSourceDisconnect event being called twice.
* [X-engines] Added the MPEG-2 video encoder.
* [X-engines] Added the MP2 audio encoder.
* [X-engines] Resolved Decklink enumeration issues.
* [X-engines] Default VP8/VP9 settings changed to live recording.
* [X-engines] Added DNxHD video encoder support.
* [Video Capture SDK .Net] VideoCaptureCoreX: Fixed a problem with the audio source format setting (regression).
* [Video Capture SDK .Net] VideoCaptureCoreX: Resolved a WPF native rendering issue with a pop-up window.
* [All] Avalonia 11.0.5 support.
* [Video Capture SDK .Net] VideoCaptureCoreX: Resolved licensing issues.
* [Video Capture SDK .Net] VideoCaptureCore: The Start/StartAsync methods will return false if the video capture device is already used by another application.
* [All] Updated VLC source (libVLC 3.0.19).
* [All] Updated FFMPEG sources and encoders. Resolved an issue with missing MSVC dependencies.
* [Video Capture SDK] Updated the ONVIF engine.
* [Cross-platform SDKs] Updated the Decklink source. Resolved the issue with the incorrect device name.
* [All] SkiaSharp security updates.
* [Cross-platform SDKs] Updated the Overlay Manager. Added the OverlayManagerDateTime class to draw the current date and time and custom text.
* [Cross-platform SDKs] Updated OverlayManagerImage. Resolved an issue with System.Drawing.Bitmap usage.
* [ALL] VideoCaptureCore: Resolved a rare crash issue with the WinUI VideoView
* [Video Capture SDK .Net] VideoCaptureCore: Updated the FFMPEG.exe output. Improved support for the x264 and x265 encoders in custom FFMPEG builds.

## 15.6

* [Video Capture SDK .Net] VideoCaptureCore: Improved video crop performance on modern CPUs
* [ALL] VideoCaptureCore, MediaPlayerCore, VideoEditCore: Added the static CreateAsync method that can be used instead of the constructor to create engines without UI lag.
* [Video Capture SDK .Net] VideoCaptureCore: Resolved issues with video crop.
* [Video Capture SDK .Net] VideoCaptureCoreX: Added a video overlays API. The Overlay Manager Demo shows how to use it.
* [Video Capture SDK .Net] Improved HW encoder detection. If you have several GPUs, sometimes only the primary GPU can be used for video encoding.
* [Cross-platform SDKs] Updated the Avalonia VideoView. Resolved an issue with VideoView recreation.
* [Media Player SDK .Net] MediaPlayerCoreX: Resolved a startup issue with the Android version of the MediaPlayerCoreX engine.
* [Media Player SDK .Net] MediaPlayerCore: The Video_Stream_Index property has been replaced with the Video_Stream_Select/Video_Stream_SelectAsync methods.
* [Media Player SDK .Net] MediaPlayerCoreX: Added the Video_Stream_Select method.
* [Video Capture SDK .Net] VideoCaptureCore: The Network_Streaming_WMV_Maximum_Clients property moved to the WMVOutput class. You can set the maximum number of clients for network WMV output.
* [All] Updated WPF rendering. Improved performance for 4K and 8K videos.
* [Video Capture SDK .Net] VideoCaptureCoreX: Resolved an issue when multiple outputs are used.
* [Video Capture SDK .Net] VideoCaptureCoreX: Resolved an issue with the OnAudioFrameBuffer event.
* [Video Capture SDK .Net] The Decklink source changed to improve startup speed. The Decklink_CaptureDevices method has been replaced by the async Decklink_CaptureDevicesAsync.
* [Media Player SDK .Net] MediaPlayerCoreX: Added the Custom_Video_Outputs/Custom_Audio_Outputs properties to set custom video/audio renderers
* [Media Player SDK .Net] MediaPlayerCoreX: Added Decklink Output Player Demo (WPF)
* [Video Edit SDK .Net] Added Multiple Audio Tracks Demo (WPF)
* [Video Edit SDK .Net] Updated MP4 output for multiple audio tracks
* [Cross-platform SDKs] Updated the device enumerator
* [Video Capture SDK .Net] Resolved an issue with the VU meter in the cross-platform engine
* [Cross-platform SDKs] Resolved an issue with the VU meter (event not fired)
* [Media Player SDK .Net] Updated memory playback
* [ALL] Added IAsyncDisposable interface support for cross-platform core classes. It should be used to dispose of the core objects in async methods.
* [Video Capture SDK .Net] Added madVR support for multiscreen
* [Video Capture SDK .Net] Resolved an NDI enumeration issue in the VideoCaptureCore engine
* [Media Player SDK .Net] Added madVR Demo
* [Video Capture SDK .Net] Added madVR Demo
* [ALL] Resolved madVR issues in all SDKs
* [Media Blocks SDK .Net] Added NDI Source demo
* [Video Capture SDK .Net] Added NDI support for the cross-platform engine
* [ALL] Resolved the "image not found" issue with the WinUI NuGet package
* [Media Blocks SDK .Net/Media Player SDK .Net (cross-platform)] Added MP3+CDG Karaoke Player demo
* [Media Blocks SDK .Net] Added CDGSourceBlock for MP3+CDG karaoke file playback
* [ALL] Improved madVR support
* WinUI VideoView updated to fix issues during audio file playback
* [Video Capture SDK .Net] Improved VNC source support for the VideoCaptureCoreX engine.
* [Video Capture SDK .Net] Added VNC source support for the VideoCaptureCoreX engine. You can use the VNCSourceSettings class to configure Video_Source.
* [Media Blocks SDK .Net] Added VNC source support. You can use the VNCSourceBlock class as a video source block.
* [Video Capture SDK .Net] The Video_Resize property has been changed to the IVideoResizeSettings type. You can use the VideoResizeSettings class to perform a classic resize as before, or use MaxineUpscaleSettings/MaxineSuperResSettings to perform AI resizing on an Nvidia GPU using the Nvidia Maxine SDK (the SDK or its models must be deployed).
* [ALL] Resolved issues with NDI source detection in the local network
* [ALL] Added the KLVParser class to read and decode data from KLV binary files.
* [ALL] Added the KLVFileSink block. You can export KLV data from MPEG-TS files.
* [Media Blocks SDK .Net] Added KLV demo.
* [Video Capture SDK .Net] Added MJPEG network streamer.
* [ALL] Added WASAPI 2 support.
* [Media Blocks SDK .Net] Updated the Video Effects API. Added the Grayscale media block.
* [Media Blocks SDK .Net] Added the Live Video Compositor API and sample.
* [ALL] Updated the Avalonia VideoView control.
Resolved issues with video playback on Windows HighDPI displays.
* [Video Capture SDK .Net] Added the CustomVideoFrameRate property to MFOutput. You can set a custom frame rate if your source provides an incorrect frame rate (an IP camera, for example).
* [Video Capture SDK .Net] Updated the NVENC encoder. Resolved an issue with high-definition video capture.
* [Video Capture SDK .Net] Resolved an issue with TV tuning on Avermedia devices
* [Media Blocks SDK .Net] Added OpenCV blocks: CVDewarp, CVDilate, CVEdgeDetect, CVEqualizeHistogram, CVErode, CVFaceBlur, CVFaceDetect, CVHandDetect, CVLaplace, CVMotionCells, CVSmooth, CVSobel, CVTemplateMatch, CVTextOverlay, CVTracker
* [CV] Resolved an issue with wrong face coordinates.
* [CV, Media Blocks SDK .Net] Added the Face Detector block.
* [Media Blocks SDK .Net] Added the rav1e AV1 video encoder.
* [Media Blocks SDK .Net] Added the GIF video encoder.
* [Media Blocks SDK .Net] Added NDI Sink and NDI Source blocks.
* [ALL] Resolved NDI SDK detection issues.
* [Media Blocks SDK .Net] Updated the Speex encoder.
* [Media Blocks SDK .Net] Updated the Video Mixer block.
* [ALL] Added Save/Load methods for output formats to serialize them into JSON.
* [Media Blocks SDK .Net] Added the MJPEG HTTP Live streaming sink block.
* [ALL] Resolved an MP4 HW QSV H264 regression.
* [ALL] WinForms and WPF VideoView stability updates.
* [Media Player SDK .Net] Removed the FilenamesOrURL legacy property. Please use the `Playlist` API instead.
* [Media Blocks SDK .Net] Added a fade-in/out feature for the image overlay block.
* [ALL] Telemetry update
* [ALL] SDKs updated to use `ObservableCollection` instead of `List` in the public API.
* [ALL] Updated MP4 HW output. Improved NVENC performance.
* [Media Blocks SDK .Net] Added Video Compositor sample.
* [Media Blocks SDK .Net] Added YouTubeSink and FacebookLiveSink blocks with custom YouTube/Facebook configurations. The `RTMPSink` can stream to YouTube/Facebook in the same way as before.
* [Media Blocks SDK .Net] Added the SqueezeBack video mixer block.
* [ALL] Updated the scrolling text logo. We've added the Preload method to render a text overlay before playback.
* [ALL] Updated the scrolling text logo (performance)
* [Media Blocks SDK .Net] Updated the Decklink sink blocks
* [ALL] Resolved crashes when using a text logo with a custom resolution
* [Media Blocks SDK .Net] Added Intel QuickSync H264, HEVC, VP9, and MJPEG encoder support.
* [Video Edit SDK .Net] Added the FastEdit_ExtractAudioStreamAsync method to extract the audio stream from a video file.
* [Video Edit SDK .Net] Added "Audio Extractor" WinForms sample.
* [Media Blocks SDK .Net] Updated MP4SinkBlock. The sink can split output files by duration, file size, or timecode. Use MP4SplitSinkSettings instead of MP4SinkSettings to configure it.
* [Video Capture SDK .Net] Added the OnMJPEGLowLatencyRAWFrame event that fires when the MJPEG low latency engine receives a RAW frame from a camera.
* [Media Blocks SDK .Net] Added VideoEffectsBlock to use the video effects available in the Windows SDKs
* [Media Blocks SDK .Net] Updated the Decklink source
* [Media Blocks SDK .Net] Added Decklink Demo (WPF)
* [ALL] Resolved the DeinterlaceBlend video effect crash
* [ALL] Third-party libraries moved to the VisioForge.Libs.External assembly/NuGet package
* [ALL] Added Nvidia Maxine Video Effects SDK support (BETA) and a sample app for Media Player SDK .Net and Video Capture SDK .Net
* [Video Capture SDK .Net] Added the Decklink_Input_GetVideoFramesCount/Decklink_Input_GetVideoFramesCountAsync API to get total and dropped frame counts for the Decklink source
* [ALL] VisioForge HW encoders update

## 15.5

* .Net 7 support
* Added NetworkDisconnect event support to the MJPEG Low Latency IP camera engine
* Added Linux support for the VideoEditCoreX-based demos
* Added the OnRTSPLowLatencyRAWFrame event to get RAW frames from an RTSP stream using the RTSP Low Latency engine
* Added the AutoTransitions property to the VideoEditCoreX engine
* System.Drawing.Rectangle and System.Drawing.Size
types are replaced by VisioForge.Types.Rectangle and VisioForge.Types.Size in all cross-platform APIs
* MAUI samples (BETA) are added
* Improved compatibility with Snap Camera for MP4 HW encoding
* Online licensing updated
* Added Camera Light demo
* Added segments support in Media Player SDK .Net (cross-platform engine)
* Added Playlist API in Media Player SDK .Net (Windows-only engine)
* Resolved issues with the "rtsp_source_create_audio_resampler" call in the RTSP Low Latency engine in Video Capture SDK .Net (Windows-only engine)
* Added support for multiple Decklink outputs in Video Capture SDK .Net and Video Edit SDK .Net (Windows-only engine)
* Resolved issues with the reverse playback engine in Media Player SDK .Net (Windows-only engine)
* ONVIFControl and other ONVIF-related APIs are available for all platforms
* API breaking change: the frame rate type changed from double to VideoFrameRate in all APIs
* Added GPU HW decoding for the VLC engine
* Resolved an issue with WPF HighDPI apps that use EVR
* Resolved an issue with the MediaPlayerCore.Video_Renderer_SetCustomWindowHandle method
* Added previous frame playback in Media Player SDK .Net (cross-platform engine)
* Added WPF Screen Capture Demo to Media Blocks SDK .Net

## 15.4

* Resolved an issue with the ignored Play_PauseAtFirstFrame property
* Updated HighDPI support in WinForms samples
* Resolved an issue with HighDPI support for the Direct2D video renderer
* Added additional API to the ONVIFControl class: GetDeviceCapabilities, GetMediaEndpoints
* Resolved a forced re-encoding issue when joining FFMPEG files without re-encoding
* Sentry update
* Added video interpolation settings for the Zoom and Pan video effects
* Added GtkSharp UI framework support for video rendering
* The FastEdit API has been changed to async
* Resolved a screen flip issue with the Video_Effects_AllowMultipleStreams property of the Video Capture SDK .Net core
* Updated RTSP MultiView demo (added GPU decoding, added RAW stream access)
* Added the OnLoop event to Media Player SDK
.Net
* Added the Loop feature to Media Blocks SDK .Net
* Avalonia VideoView was downgraded to 0.10.12 because of Avalonia UI problems with NativeControl
* Added File Encryptor demo for Video Edit SDK .Net

## 15.3

* App start-up time improved for PCs with Decklink cards
* NDI SDK v5 support
* Resolved an issue with the MKV Legacy output (wrong cast exception).
* Zoom and pan effect performance optimizations
* Added basic Media Blocks API (WIP)
* Added HLS network streaming to Video Edit SDK .Net
* Added the Rotate property to the WPF VideoView. You can rotate the video by 90, 180, or 270 degrees. Also, you can use the GetImageLayer() method to get the Image layer and apply custom transforms
* API change: FilterHelpers renamed to FilterDialogHelper
* VisioForge.Types and VisioForge.MediaFramework assemblies merged into VisioForge.Core
* UI classes moved to VisioForge.Core.UI.* assemblies and independent NuGet packages
* VisioForge.Types renamed to VisioForge.Core.Types
* VisioForge.Core no longer depends on the Windows Forms framework

## 15.2

* Added HorizontalAlignment and VerticalAlignment properties to the text and image logos
* Updated ONVIF support; resolved an issue where the username and password were specified in the URL but not in the source settings
* Resolved an issue with the FFMPEG.exe output dialog
* Resolved an issue with separate capture in service applications
* SDK migrated from Newtonsoft.Json to System.Text.Json
* Updated DirectCapture output for IP cameras
* Video processing performance optimizations
* The IPCameraSourceSettings.URL property type changed from string to `System.Uri`
* Added DirectCapture ASF output for IP cameras

## 15.1

* Disabled Sentry debug messages in the console
* Added Icecast streaming
* The VideoStreamInfo.FrameRate property type changed from double to VideoFrameRate (with numerator and denominator)
* Updated WPF VideoView; resolved an issue with IP camera stream playback
* API breaking change: `VisioForge.Controls`, `VisioForge.Controls.UI`,
`VisioForge.Controls.UI.Dialogs`, and `VisioForge.Tools` assemblies are merged into the `VisioForge.Core` assembly
* The audio effect API now uses a string name instead of an index
* Added Android support in Media Player SDK .Net
* Added a new GStreamer-based cross-platform engine to support Windows and other platforms within the v15 development cycle

## 15.0

* Added the StatusOverlay property to the VideoCapture class. Assign a `TextStatusOverlay` object to this property to add a text status overlay, for example, to show "Connecting..." text while an IP camera connects.
* The RTSP Live555 IP camera engine has been removed. Please use the RTSP Low Latency or FFMPEG engines.
* Resolved a possible SDK_Version issue.
* Added the Settings_Load API. You can load a settings file saved by Settings_JSON. Be sure that the device names are correct.
* Resolved an issue with an exception if separate capture started before the Start/StartAsync method call.
* RTP support for the VLC source engine.
* API breaking change: The SDK_State property has been removed. We no longer have TRIAL and FULL SDK versions.
* API breaking change: DirectShow_Filters_Show_Dialog, DirectShow_Filters_Has_Dialog, Audio_Codec_HasDialog, Audio_Codec_ShowDialog, Video_Codec_HasDialog, Video_Codec_ShowDialog, Filter_Supported_LAV, Filter_Exists_MatroskaMuxer, Filter_Exists_OGGMuxer, Filter_Exists_VorbisEncoder, Filter_Supported_EVR, Filter_Supported_VMR9, and Filter_Supported_NVENC have been moved to the VisioForge.Tools.FilterHelpers class.
* The `VFAudioStreamInfo`/`VFVideoStreamInfo` classes use `TimeSpan` for the duration.
* Decklink types from the VisioForge.Types assembly moved to the VisioForge.Types.Decklink namespace.
* Telemetry updated.
* Custom redist loader updated.
* NDI update.
* API breaking change: The `Status` property was renamed to `State`. The property type is `PlaybackState` in all SDKs.
* API breaking change: UI controls split into Core (VideoCaptureCore, MediaPlayerCore, VideoEditCore) and VideoView.
* API breaking change: The Video_CaptureDevice... properties merged into the Video_CaptureDevice property of the VideoCaptureSource type.
* API breaking change: The Audio_CaptureDevice... properties merged into the Audio_CaptureDevice property of the AudioCaptureSource type.
* API breaking change: In the Media Player SDK, the `Source_Stream` API properties were merged into the `Source_MemoryStream` property of the `MemoryStreamSource` type
* Updated DVD playback
* Updated FFMPEG source
* API breaking change: Media Player SDK types moved from the VisioForge.Types namespace to VisioForge.Types.MediaPlayer
* API breaking change: Video Capture SDK types moved from the VisioForge.Types namespace to VisioForge.Types.VideoCapture
* API breaking change: Video Edit SDK types moved from the VisioForge.Types namespace to VisioForge.Types.VideoEdit
* API breaking change: Output types moved from the VisioForge.Types namespace to VisioForge.Types.Output
* API breaking change: Video Effects types moved from the VisioForge.Types namespace to VisioForge.Types.VideoEffects
* API breaking change: Audio Effects types moved from the VisioForge.Types namespace to VisioForge.Types.AudioEffects
* API breaking change: Event types moved from the VisioForge.Types namespace to VisioForge.Types.Events
* Added the Video_Renderer_SetCustomWindowHandle method to set a custom video renderer by Win32 window/control HWND handle

## 14.4

* Windows 11 support
* Telemetry update
* Resolved issues with Picture-in-Picture in 2x2 mode
* Resolved issues with the MJPEG Low Latency source in .Net 5/.Net 6/.Net Core 3.1
* Resolved an issue with UDP network streaming for the Decklink source
* VFMP4v11Output renamed to VFMP4HWOutput
* Added Microsoft H265 encoder support
* Added Intel QuickSync H265 encoder support
* Added OnDecklinkInputDisconnected/OnDecklinkInputReconnected events
* Updated Decklink output
* Resolved issues with separate capture for MP4 HW, MOV, MPEG-TS, and MKVv2 outputs
* Added the Video_CaptureDevice_CustomPinName property.
You can use this property to set a custom output pin name for a video capture device with several output video pins
* Custom redist configuration updated
* Updated the IP camera RTSP Low Latency engine

## 14.3

* An issue with Video Resize filter creation for NuGet redists has been resolved
* Telemetry update
* Updated the VFDirectCaptureMP4Output output
* .Net 6 (preview) support
* Nvidia CUDA removed. NVENC is a modern alternative and is available for H264/HEVC encoding.
* The IP camera MJPEG Low Latency engine has been updated
* The NDI source listing has been updated
* Improved ONVIF support
* Added .Net Core 3.1 support for the RTSP Low Latency source engine
* Resolved issues with Picture-in-Picture for 2x2 mode
* Split projects and solutions into independent files for .Net Framework 4.7.2, .Net Core 3.1, .Net 5, and .Net 6

## 14.2

* An issue with audio stream capture with enabled Virtual Camera SDK output was resolved
* VFMP4v8v10Output was replaced with VFMP4Output
* The "CanStart" method was added for Video_CaptureDevices items. The method returns true if the device can start and is not used exclusively in another app
* Added async/await API to the ONVIFControl
* An issue with wrong ColorKey processing in the Text Overlay video effect was resolved
* Added forced frame rate support for the RTSP Low Latency IP camera source
* MP4v11 AMD encoders were updated
* The timestamp issue that happened during MP4v11 separate capture pause/resume was resolved
* FFMPEG.exe network streaming update
* FFMPEG output was updated to the latest FFMPEG version
* The VC++ redist no longer needs to be installed. VC++ linking changed to static (except the optional XIPH output)
* Many base DirectShow filters moved to the VisioForge_BaseFilters module

## 14.1

* Added WPF VideoView control.
You can push video frames from the OnVideoFrameBuffer event to the control to render them
* Corrected the default transparency value for a text logo
* ONVIF support added to .Net 5 / .Net Core 3.1 builds
* Added the IP_Camera_ONVIF_ListSourcesAsync method to discover ONVIF cameras in the local network
* (BREAKING API CHANGE) Changed the video capture device API for frame rate enumeration to support modern 4K cameras
* Updated MJPEG Decoder (improved performance)
* Removed the MP4 v8 legacy encoders
* INotifyPropertyChanged support in WinForms/WPF wrappers to provide MVVM application support
* Resolved an issue with RTMPS streaming to Facebook
* IP camera source added to the TimeShift demo
* Added separate output support for MOV
* Added the fast-start FFMPEG flag for the MP4v11 output that uses the FFMPEG MP4 muxer
* Added GPU decoding for the IP Camera source in demo applications
* Added the CustomRedist_DisableDialog property to disable the redist message dialog
* Removed Kinect assemblies and demos. Please contact us if you still need Kinect packages
* The MP4v10 default profile has been changed to Baseline / 5.0 for better browser compatibility

## 14.0

* .Net 5.0 support
* Resolved an issue with Decklink sources not visible in the NuGet SDK version
* Resolved an issue with the device added/removed notifier
* Added an alternative NDI source in Video Capture SDK .Net
* Added NDI streaming (server) in Video Capture SDK .Net
* Resolved a Separate Capture usage issue for NuGet deployment
* Resolved an issue with merged text/image logos
* Updated the device notifier
* Added the CameraPowerLineControl class to control the webcam power line frequency option
* Legacy audio effects have been removed.
* Removed HTTP_FFMPEG, RTSP_UDP_FFMPEG, RTSP_TCP_FFMPEG, and RTSP_HTTP_FFMPEG from the VFIPSource enumeration. You can use the Auto_FFMPEG value
* Updated the HLS server.
Correct error reporting about the used port
* Added NDI streaming (server) in Video Edit SDK .Net
* Added NDI streaming (server) in Media Player SDK .Net
* Added the IP_Camera_CheckAvailable method in Video Capture SDK .Net
* Updated the FFMPEG Source filter: more supported codecs and added GPU decoding

## 12.1

* Migrated to .Net 4.6
* Added the Debug_DisableMessageDialogs property to disable the error dialog if the OnError event is not implemented.
* Fixed an issue with resizing on pause for WPF controls.
* Updated the ONVIF engine in Video Capture SDK .Net
* Updated the What You Hear source in Video Capture SDK .Net
* Added OnPause/OnResume events
* Updated the YouTube demo in Media Player SDK .Net
* Improved support for webcams with an integrated H264 encoder in Video Capture SDK .Net
* Updated the VLC source
* Removed an unwanted warning in the MP4 v11 output
* One installer for TRIAL and FULL versions
* Same NuGet packages for TRIAL and FULL versions
* .Net Core NuGet package merged with the .Net Framework package
* Added NuGet redists. Deployment was never so simple!

## 12.0

* Async/await API for all SDKs
* Breaking API change: All time-related APIs now use TimeSpan instead of long (milliseconds)
* Tag reader/writer: correct logo loading for some video formats
* Removed legacy DirectX 9 video effects
* Fixed an audio conversion progress issue in Video Edit SDK .Net
* Improved .Net Core compatibility
* Virtual Camera SDK output added to Media Player SDK .Net (as one of the video renderers)
* NewTek NDI device support added to Video Capture SDK .Net as a new engine for IP cameras
* Added the Video_Effects_MergeImageLogos and Video_Effects_MergeTextLogos properties.
If you have three or more logos, you can set these properties to true to optimize video effect performance
* Added a playlist type option for HLS network streaming
* Added an integrated lightweight HTTP server for HLS network streaming
* Added VR 360° video support in Media Player SDK .Net
* Improved DirectX 11 video processing
* Added MPEG-TS AAC-only (no video) support for the MPEG-TS output
* Improved the What You Hear audio source
* Several new demo applications
* Improved MP4 v11 output
* Separate capture for MP4 v11 can split files without frame loss
* Many minor bugfixes
* .Net Core assemblies updated to .Net Core 3.1 LTS
* Updated the demos repository on GitHub

## 11.4

* Added an ASP.Net MVC video conversion demo app to Video Edit SDK .Net
* Alternative OSD implementation to handle Windows 10 changes
* Updated GPU video effects
* Updated the memory source in Media Player SDK .Net
* Updated the OSD API
* Resolved issues with video encryption using binary keys
* Updated the screen capture demos for Video Capture SDK .Net; added window selection to capture.
You can capture any window, including windows in the background
* Mosaic effect added to the Computer Vision demo in Video Capture SDK .Net
* Added Multiple IP Cameras Demo (WPF) in Video Capture SDK .Net
* Added a custom video resize option for the MP4v11 output
* Merge module (MSM) redists added to all SDKs
* Updated the FFMPEG.exe output to use pipes instead of virtual devices
* Resolved an issue with the PIP custom output resolution option in Video Capture SDK .Net
* Resolved a file lock issue when using the LAV engine in Media Player SDK .Net
* Added DirectX11-based GPU video processing

## 11.3

* Resolved an issue with the audio renderer connection if Virtual Camera SDK output is enabled in Video Capture SDK
* Improved subtitles support with autoloading in Media Player SDK .Net
* Updated the audio fade-in/fade-out effects
* Added MIDI and KAR file support in Media Player SDK .Net
* Added CDG karaoke file support (and a new demo application) in Media Player SDK .Net
* Added async playback in Media Player SDK .Net
* Updated the integrated JSON serializer
* Added optional GPU decoding in Media Player SDK .Net.
Available decoding engines: DXVA2, Direct3D 11, Nvidia CUVID, and Intel QuickSync
* Added .Net Core 3.0 support, including WinForms and WPF demo apps (Windows only)

## 11.2

* Added the Loop property to Video Edit SDK .Net
* Updated the audio enhancer
* Updated the RTSP Low Latency source
* Resolved a crop issue for the Decklink source
* Added a property to use TCP or UDP in the RTSP Low Latency engine
* Deployment without COM registration and admin rights for Video Edit SDK and Media Player SDK (BETA)
* Updated the video mixer with improved performance
* Added a YouTube playback code snippet
* Added a method to move the OSD

## 11.1

* Fixed a seeking issue with some MP4 files in Video Edit SDK
* Fixed a stretch/letterbox issue in the WPF version of all SDKs
* Fixed an issue with the equalizer at a sample rate of 16000 or less
* Fixed a problem with the sample grabber for the DirectShow source in Media Player SDK
* Fixed encrypted file playback in Media Player SDK
* Added the DVDInfoReader class to read info about DVD files
* Resolved an issue with a wrong file name in the OnSeparateCaptureStopped event
* Improved barcode detection quality for rotated images
* The minimum .Net Framework version is now 4.5
* Improved YouTube playback in Media Player SDK. Added the OnYouTubeVideoPlayback event to select video quality for playback
* Added the `Play_PauseAtFirstFrame` property to Media Player SDK .Net. If true, playback will be paused on the first frame
* Multiple screen support in the Screen Capture demo in Video Capture SDK .Net
* Resolved an issue with network stream playback in Media Player SDK .Net WPF applications
* Added low latency HTTP MJPEG stream playback (IP cameras or other sources) in Video Capture SDK .Net
* Added the Fake Audio Source DirectShow filter, which produces a tone signal
* Updated the Computer Vision demo in Video Capture SDK .Net
* Added the Frame_GetCurrentFromRenderer method to all SDKs. Using this method, you can get the currently rendered video frame directly from the video renderer.
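A quick, hedged sketch of grabbing the current frame via `Frame_GetCurrentFromRenderer`; only the method name comes from the changelog, while the `System.Drawing.Bitmap` return type and parameterless signature are assumptions.

```csharp
// 'player' is an existing MediaPlayerCore instance with active playback.
// Assumed: the method returns a System.Drawing.Bitmap (or null if no frame).
using var frame = player.Frame_GetCurrentFromRenderer();
frame?.Save("snapshot.png", System.Drawing.Imaging.ImageFormat.Png);
```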
* Added low latency RTSP source playback in Video Capture SDK .Net ## 11.0 * Fixed bug with MP4 v11 output, custom GOP settings * Updated MJPEG Decoder * Fixed bug with MP4 v11 output. Added Windows 7 full support * OnStop event of Video Edit SDK returns a successful status * Video Capture SDK Main Demo update - multiscreen can automatically use connected external displays * Media Player SDK Main Demo update - multiscreen can automatically use connected external displays * Added fade-in / fade-out for text logo * Updated Decklink output * Video Edit SDK can fast-cut files from network sources (HTTP/HTTPS) * Added Computer Vision demo, with cars/pedestrian counter and face/eyes/nose/mouth detector/tracker * Updated MP4 output to use alternative muxer that provides constant frame rate * Added MPEG-TS output * Added MOV output ---END OF PAGE--- # Local File: .\dotnet\index.md --- title: .NET SDKs for Video & Media Development description: Professional .NET SDKs for video capture, editing, playback and media processing. Cross-platform support for Windows, macOS, Linux, Android and iOS with hardware acceleration for optimal performance. sidebar_label: .Net SDKs order: 20 icon: ../static/dotnet.svg route: /docs/dotnet/ --- # .NET SDKs for Professional Media Development ## Introduction to Our .NET SDKs Our powerful .NET SDKs empower developers to seamlessly integrate advanced video capture, sophisticated video editing, smooth playback, and efficient media processing capabilities into their software applications. These professionally engineered tools provide a complete solution for all your multimedia development needs. 
## Multi-Platform Compatibility All our .NET SDKs are designed with cross-platform functionality in mind, providing robust support across: - Windows desktop environments - Linux distributions - macOS systems - Android mobile devices - iOS applications This versatility ensures your media applications can reach users on virtually any platform without compromising functionality. ## Hardware Acceleration Technologies Our SDKs leverage cutting-edge GPU-accelerated encoding and decoding technologies to maximize performance: ### Desktop Platforms - Intel Quick Sync Video for efficient hardware acceleration - NVIDIA NVENC for superior encoding performance - AMD VCE (Video Coding Engine) for optimized processing ### Mobile Platforms - Native hardware encoding and decoding capabilities - Performance-optimized implementations for battery efficiency ## Getting Started Resources ### SDK Usage Tutorials - [Installation Guide](install/index.md) - Step-by-step setup instructions - [SDK Initialization](init.md) - Proper initialization procedures - [System Requirements](system-requirements.md) - Detailed compatibility information ## Available SDK Products ### [Video Capture SDK .NET](videocapture/index.md) Efficiently capture high-quality video from multiple sources including: - Webcams and USB cameras - Network IP cameras - HDMI capture devices - Screen recording - Custom video sources ### [Video Edit SDK .NET](videoedit/index.md) Professional video editing capabilities including: - Timeline-based video editing - Filter and effect application - Video montage creation - Format conversion - Frame-accurate editing ### [Media Player SDK .NET](mediaplayer/index.md) Feature-rich media playback functionality: - Multi-format video and audio playback - Real-time effect application - Customizable player interfaces - Streaming support - Advanced control options ### [Media Blocks SDK .NET](mediablocks/index.md) Modular building blocks for creating: - Custom multimedia applications - 
Specialized media processing tools - Cross-platform media solutions - Integrated workflow systems ## Additional Developer Resources - [Changelog](changelog.md) - Detailed version history and updates - [Feature and Platform Matrix](platform-matrix.md) - Compatibility overview - [API Reference Documentation](https://api.visioforge.org/dotnet/api/index.html) - Complete API specifications ---END OF PAGE--- # Local File: .\dotnet\init.md --- title: .NET SDK Setup and Configuration Guide description: Learn how to properly initialize and deinitialize .NET SDKs for video capture, editing, and media playback. Includes code examples for both Windows-only and cross-platform X-engines, with best practices for clean application exit. sidebar_label: Initialization order: 20 --- # Initialization [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) ## Type of SDK engines All SDKs contain Windows-only DirectShow-based engines and cross-platform X-engines. ### Windows-only engines - VideoCaptureCore - VideoEditCore - MediaPlayerCore ### X-engines - VideoCaptureCoreX - VideoEditCoreX - MediaPlayerCoreX - MediaBlocksPipeline X-engines require additional initialization and de-initialization steps. ## SDK initialization and de-initialization for X-engines You need to initialize SDK before any SDK class usage and de-initialize SDK before the application exits. 
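Taken together, a minimal Windows Forms application lifetime might look like the following sketch (the form setup is illustrative; the individual calls are shown next):

```csharp
using System;
using System.Windows.Forms;

static class Program
{
    [STAThread]
    static void Main()
    {
        // Initialize the X-engines once, before any SDK class is used.
        VisioForge.Core.VisioForgeX.InitSDK();

        var form = new Form { Text = "Main window" };

        // De-initialize before the process exits so SDK threads can finalize;
        // otherwise the application may hang on exit.
        form.FormClosing += (sender, args) => VisioForge.Core.VisioForgeX.DestroySDK();

        Application.Run(form);
    }
}
```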
To initialize the SDK, use the following code:

```csharp
VisioForge.Core.VisioForgeX.InitSDK();
```

To de-initialize the SDK, use the following code:

```csharp
VisioForge.Core.VisioForgeX.DestroySDK();
```

If the SDK is not properly deinitialized, the application may hang on exit because one of the SDK's threads cannot be finalized. To ensure a clean exit, deinitialize the SDK in an event handler appropriate to your UI framework, such as the `FormClosing` event; this guarantees the SDK is destroyed, and all of its threads terminated, before the application closes. The SDK can be destroyed from any thread. To keep the UI responsive during deinitialization, use the asynchronous API calls, which let the work run in the background without freezing the interface.

---

Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples.

---END OF PAGE---

# Local File: .\dotnet\platform-matrix.md

---
title: .NET SDKs - Platform & Feature Matrix
description: Explore .NET SDK cross-platform support - video/audio codecs, hardware acceleration, capture devices & network protocols on Windows, Linux, macOS, Android, iOS.
sidebar_label: Platform & Feature Matrix
order: 17
---

# .NET SDK: Supported Features and Platforms

[!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net)

Discover the comprehensive feature set and broad platform compatibility of VisioForge .NET SDKs. The tables below detail supported input/output formats, video/audio codecs, hardware acceleration, capture devices, and network protocols across Windows, Linux, macOS, Android, and iOS.

## Input and output file formats

| Output formats | Windows | Linux | macOS | Android | iOS |
|----------------|:-------:|:-----:|:-----:|:-------:|:---:|
| MP4 | ✔ | ✔ | ✔ | ✔ | ✔ |
| WebM | ✔ | ✔ | ✔ | ✔ | ✔ |
| MKV | ✔ | ✔ | ✔ | ✔ | ✔ |
| AVI | ✔ | ✔ | ✔ | ✔ | ✔ |
| ASF (WMV/WMA) | ✔ | ✔ | ✔ | ✔ | ✔ |
| MPEG-PS | ✔ | ✔ | ✔ | ✔ | ✔ |
| MPEG-TS | ✔ | ✔ | ✔ | ✔ | ✔ |
| MOV | ✔ | ✔ | ✔ | ✔ | ✔ |
| MXF | ✔ | ✔ | ✔ | ✔ | ✔ |
| WMA | ✔ | ✔ | ✔ | ✔ | ✔ |
| WAV | ✔ | ✔ | ✔ | ✔ | ✔ |
| MP3 | ✔ | ✔ | ✔ | ✔ | ✔ |
| OGG | ✔ | ✔ | ✔ | ✔ | ✔ |

Also, the cross-platform engines support all formats supported by FFMPEG and GStreamer.
## Video encoders and decoders

The SDK supports the following video codecs:

| Encoders | Windows | Linux | macOS | Android | iOS |
|------------|:-------:|:-----:|:-----:|:-------:|:---:|
| H264 | ✔ | ✔ | ✔ | ✔ | ✔ |
| HEVC | ✔ | ✔ | ✔ | ✔ | ✔ |
| VP8/VP9 | ✔ | ✔ | ✔ | ✔ | ✔ |
| AV1 | ✔ | ✔ | ✔ | ✔ | ✔ |
| MJPEG | ✔ | ✔ | ✔ | ✔ | ✔ |
| WMV | ✔ | ✔ | ✔ | ✔ | ✔ |
| MPEG-4 ASP | ✔ | ✔ | ✔ | ✔ | ✔ |
| GIF | ✔ | ✔ | ✔ | ✔ | ✔ |
| MPEG-1 | ✔ | ✔ | ✔ | ✔ | ✔ |
| MPEG-2 | ✔ | ✔ | ✔ | ✔ | ✔ |
| Theora | ✔ | ✔ | ✔ | ✔ | ✔ |
| DNxHD | ✔ | ✔ | ✔ | ✔ | ✔ |
| DV | ✔ | ✔ | ✔ | ✔ | ✔ |

### GPU-accelerated encoding and decoding

The table below shows the support for hardware-accelerated encoding and decoding for each codec and platform (D = decoding, E = encoding).

| Codec | Hardware | Windows | Linux | macOS | Android | iOS |
|-----------|:-----------:|:-------:|:-----:|:-----:|:-------:|:---:|
| H264/HEVC | Intel | D / E | D / E | D / E | ✘ | ✘ |
| H264/HEVC | Nvidia | D / E | D / E | D / E | ✘ | ✘ |
| H264/HEVC | AMD | D / E | D / E | D / E | ✘ | ✘ |
| H264/HEVC | Apple | ✘ | ✘ | D / E | ✘ | D / E |
| H264/HEVC | Android (1) | ✘ | ✘ | ✘ | D / E | ✘ |
| AV1 | Intel | D / E | D / E | D / E | ✘ | ✘ |
| AV1 | Nvidia | D / E | D / E | D / E | ✘ | ✘ |
| AV1 | AMD | D / E | D / E | D / E | ✘ | ✘ |
| AV1 | Apple | ✘ | ✘ | D | ✘ | D |
| AV1 | Android (1) | ✘ | ✘ | ✘ | D | ✘ |
| VP9 | Intel | D / E | D / E | D / E | ✘ | ✘ |
| VP9 | Nvidia | D / E | D / E | D / E | ✘ | ✘ |
| VP9 | AMD | D / E | D / E | D / E | ✘ | ✘ |
| VP9 | Apple | ✘ | ✘ | D (2) | ✘ | ✘ |
| VP9 | Android (1) | ✘ | ✘ | ✘ | D / E | ✘ |

(1) - MediaCodec-compatible encoders and decoders, if supported by hardware
(2) - only on Apple Silicon

## Audio encoders and decoders

The table below shows the support for audio codecs for each platform.
| Encoders | Windows | Linux | macOS | Android | iOS |
|----------|:-------:|:-----:|:-----:|:-------:|:---:|
| AAC | ✔ | ✔ | ✔ | ✔ | ✔ |
| MP3 | ✔ | ✔ | ✔ | ✔ | ✔ |
| Vorbis | ✔ | ✔ | ✔ | ✔ | ✔ |
| OPUS | ✔ | ✔ | ✔ | ✔ | ✔ |
| Speex | ✔ | ✔ | ✔ | ✔ | ✔ |
| FLAC | ✔ | ✔ | ✔ | ✔ | ✔ |
| MP2 | ✔ | ✔ | ✔ | ✔ | ✔ |
| WMA | ✔ | ✔ | ✔ | ✔ | ✔ |
| Wavpack | ✔ | ✔ | ✔ | ✔ | ✔ |

Also, you can use any other audio or video encoder available in FFMPEG or GStreamer.

## Devices

The table below shows the support for capture devices for each platform.

| Devices | Windows | Linux | macOS | Android | iOS |
|-----------------------------------------|:-------:|:-----:|:-----:|:-------:|:---:|
| Webcams and other local capture sources | ✔ | ✔ | ✔ | ✔ | ✔ |
| IP cameras and NVR (including ONVIF) | ✔ | ✔ | ✔ | ✔ | ✔ |
| Screen | ✔ | ✔ | ✔ | ✔ | ✔ |
| Blackmagic Decklink (input and output) | ✔ | ✔ | ✔ | ✘ | ✘ |
| Camcorders | ✔ | ✔ | ✔ | ✘ | ✘ |
| GenICam-supported USB3/GigE cameras | ✔ | ✔ | ✔ | ✘ | ✘ |
| Teledyne/FLIR GigE/USB3 cameras | ✔ | ✘ | ✘ | ✘ | ✘ |
| Basler GigE/USB3 cameras | ✔ | ✘ | ✘ | ✘ | ✘ |
| Allied Vision GigE/USB3 cameras | ✔ | ✘ | ✘ | ✘ | ✘ |

## Network protocols

The table below shows the support for network protocols for each platform.
| Protocols | Windows | Linux | macOS | Android | iOS |
|-------------------------------|:-------:|:-----:|:-----:|:-------:|:---:|
| RTP/RTSP | ✔ | ✔ | ✔ | ✔ | ✔ |
| RTMP (YouTube, Facebook Live) | ✔ | ✔ | ✔ | ✔ | ✔ |
| SRT | ✔ | ✔ | ✔ | ✔ | ✔ |
| UDP | ✔ | ✔ | ✔ | ✔ | ✔ |
| TCP | ✔ | ✔ | ✔ | ✔ | ✔ |
| HTTP | ✔ | ✔ | ✔ | ✔ | ✔ |
| NDI | ✔ | ✔ | ✔ | ✔ | ✔ |
| VNC (source) | ✔ | ✔ | ✔ | ✔ | ✔ |
| GenICam (source) | ✔ | ✔ | ✔ | ✔ | ✔ |
| AWS S3 | ✔ | ✔ | ✔ | ✔ | ✔ |

---END OF PAGE---

# Local File: .\dotnet\system-requirements.md

---
title: .NET SDK Platform Requirements & Compatibility Guide
description: Detailed technical guide covering platform support, system requirements, and framework compatibility for .NET SDKs across Windows, macOS, Linux, iOS, and Android. Includes deployment specifications for desktop and mobile development.
sidebar_label: System Requirements
---

# System Requirements for .NET SDKs

[!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net)

This guide details the system requirements and platform compatibility for VisioForge's suite of .NET SDKs, designed for high-performance video processing and playback applications.

## Overview

Unlock powerful cross-platform video capabilities with VisioForge .NET SDKs, fully compatible with Windows, Linux, macOS, Android, and iOS. Our SDKs provide robust support for .NET Framework, .NET Core, and modern .NET 5+ (including .NET 8 LTS & .NET 9), enabling seamless integration with WinForms, WPF, WinUI 3, Avalonia, .NET MAUI, and Xamarin.
Develop high-performance video applications with familiar C# paradigms across all major operating systems and UI frameworks. > **Important Note**: While Windows users benefit from our dedicated installer package, developers working on other platforms should utilize the NuGet package distribution method for implementation. ## Development Environment Requirements The following sections outline the specific requirements for setting up your development environment when working with our SDKs. ### Operating Systems for Development Development of applications using our SDKs is supported on the following platforms: #### Windows * Windows 10 (all editions) * Windows 11 (all editions) * Recommended: Latest feature update with current security patches #### Linux * Ubuntu 22.04 LTS or newer * Debian 11 or newer * Other distributions with equivalent libraries may work but are not officially supported #### macOS * macOS 12 (Monterey) or newer * Apple Silicon (M1/M2/M3) and Intel processors supported ### Hardware Requirements For optimal development experience, we recommend: * Processor: 4+ cores, 2.5 GHz or faster * RAM: 8 GB minimum, 16 GB recommended for complex projects * Storage: SSD with at least 10 GB free space * Graphics: DirectX 11 compatible GPU (Windows) or Metal-compatible GPU (macOS) ## Target Deployment Platforms Our SDKs can be deployed to a variety of platforms, enabling wide-reaching distribution of your applications. 
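As an illustrative sketch (the target framework monikers are standard .NET ones; keep only the platforms you actually ship), a single project can multi-target several of the platforms below:

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <!-- One build is produced per target framework moniker -->
    <TargetFrameworks>net8.0-windows;net8.0-android;net8.0-ios;net8.0-maccatalyst</TargetFrameworks>
  </PropertyGroup>
</Project>
```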
### Desktop Platforms #### Windows * Windows 10 (version 1809 or newer) * Windows 11 (all versions) * Both x86 and x64 architectures supported * ARM64 support for Windows on ARM devices #### Linux * Ubuntu 22.04 LTS or newer * Other distributions require equivalent libraries and dependencies * x64 and ARM64 architectures supported #### macOS * macOS 12 (Monterey) or newer * Both Intel and Apple Silicon architectures supported natively * Rosetta 2 not required for Apple Silicon devices ### Mobile Platforms #### Android * Android 10 (API level 29) or newer * ARM, ARM64, and x86 architectures supported * Google Play Store compatible * Hardware-accelerated rendering recommended #### iOS * iOS 12 or newer versions * Compatible with iPhone, iPad, and iPod Touch * Supports both ARMv7 and ARM64 architectures * App Store distribution compatible ## .NET Framework Compatibility Our SDKs provide extensive compatibility with various .NET implementations: ### .NET Framework * .NET Framework 4.6.1 * .NET Framework 4.7.x * .NET Framework 4.8 * .NET Framework 4.8.1 ### Modern .NET * .NET Core 3.1 (LTS) * .NET 5.0 * .NET 6.0 (LTS) * .NET 7.0 * .NET 8.0 (LTS) * .NET 9.0 ### Xamarin (Legacy) * Xamarin.iOS 12.0+ * Xamarin.Android 9.0+ * Xamarin.Mac 5.0+ ## UI Framework Integration The SDKs integrate with a wide array of UI frameworks, enabling flexible application development: ### Windows-Specific Frameworks * Windows Forms (WinForms) * .NET Framework 4.6.1+ and .NET Core 3.1+ * High-performance rendering options * Supports designer integration * Windows Presentation Foundation (WPF) * .NET Framework 4.6.1+ and .NET Core 3.1+ * Hardware-accelerated rendering * XAML-based layout with binding support * Windows UI Library 3 (WinUI 3) * Desktop applications only * Modern Fluent Design components * Windows App SDK integration ### Cross-Platform Frameworks * .NET MAUI * Unified development for Windows, macOS, iOS, and Android * Shared UI code across platforms * Native performance with shared 
codebase * Avalonia UI * Truly cross-platform UI framework * XAML-based with familiar paradigms * Windows, Linux, macOS compatible ### Mobile-Specific Frameworks * iOS Native UI * UIKit integration * SwiftUI compatibility layer * Storyboard and XIB support * macOS / Mac Catalyst * AppKit and UIKit integration * Mac Catalyst for iPad app adaptation * Native macOS UI elements * Android Native UI * Integration with Android UI toolkit * Support for Activities and Fragments * Material Design components compatibility ## Distribution Methods ### NuGet Packages Our SDKs are available as NuGet packages, simplifying integration with your development workflow. ### Windows Setup For Windows developers, we offer a dedicated installer package that includes: * SDK binaries and dependencies * Documentation and example projects * Visual Studio integration components * Developer tools and utilities ## Performance Considerations ### Memory Requirements * Base memory footprint: ~50MB * Video processing: Additional 100-500MB depending on resolution and complexity * 4K video processing: 1GB+ recommended ### CPU Utilization * Video capture: 10-30% on a modern quad-core CPU * Real-time effects: Additional 10-40% depending on complexity * Hardware acceleration recommended for production environments ### Storage Requirements * SDK installation: ~250MB * Runtime cache: ~100MB * Temporary processing files: Up to several GB depending on workload ## Licensing and Deployment Check out our [Licensing](../licensing.md) page for more information on the different licensing options available for our SDKs. 
## Technical Support Resources We provide extensive resources to ensure successful implementation: * API documentation with code examples * Implementation guides for various platforms * Troubleshooting and optimization tips * Direct support channels for licensed developers ## Code Samples and Examples Visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples) for an extensive collection of code samples demonstrating SDK features and implementation patterns across supported platforms. ## Updates and Maintenance * Regular SDK updates with new features and optimizations * Security patches and bug fixes * Backward compatibility considerations * Migration guides for version transitions --- This technical specification document outlines the system requirements and compatibility matrix for our Video Capture SDK .Net and related products. For specific implementation details or custom integration scenarios, please contact our developer support team. ---END OF PAGE--- # Local File: .\dotnet\deployment-x\Android.md --- title: Cross-platform .Net deployment manual for Android description: Step-by-step guide for .NET developers implementing VisioForge SDKs in Android apps. 
Learn package management, architecture support, VideoView integration, and deployment best practices with code examples and troubleshooting tips sidebar_label: Android --- # Android Implementation and Deployment Guide [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Introduction to VisioForge SDKs for Android Android developers working with .NET technologies can leverage the powerful capabilities of VisioForge SDKs to integrate advanced media functionality into their applications. The SDKs provide robust solutions for media manipulation, playback, capture, and editing on the Android platform using .NET technologies. The VisioForge SDK for Android offers powerful capabilities for video processing, capturing, editing, and playback, all optimized for the Android platform while maintaining a consistent cross-platform development experience. The Android deployment process requires special consideration for package management, device compatibility, permissions, and performance optimization. This document provides detailed instructions to ensure your application runs smoothly on Android devices. 
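For the capture features mentioned above, the standard Android permissions must be declared in `AndroidManifest.xml`. A minimal fragment (declare only what your app actually uses, and request the dangerous permissions at runtime):

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <!-- Required for video/audio capture -->
  <uses-permission android:name="android.permission.CAMERA" />
  <uses-permission android:name="android.permission.RECORD_AUDIO" />
  <!-- Required for IP camera sources and network streaming -->
  <uses-permission android:name="android.permission.INTERNET" />
</manifest>
```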
## System Requirements Before beginning your Android implementation and deployment process, ensure your development environment meets the following requirements: ### Device Requirements - Android device running Android 10.0 or later - ARM or ARM64 processor architecture - Sufficient storage space for application assets and media processing - Camera and microphone hardware (if using video/audio capture features) ### Development Environment Requirements - Windows, Linux, or macOS computer - Visual Studio with .NET MAUI or Xamarin workloads installed, JetBrains Rider, or Visual Studio Code - .Net 8.0 SDK or later (latest stable version recommended) - Android SDK with appropriate API levels installed - Java Development Kit (JDK) 11 or later - Basic knowledge of .NET development for Android ## Architecture Support The VisioForge SDK for Android provides native support for common Android device architectures: ### ARM64 Support - Optimized for modern Android devices - Hardware-accelerated video processing - Enhanced performance for media operations - Primary target for most applications ### ARM/ARMv7 Support - Compatibility with older Android devices - Software fallbacks for hardware acceleration when needed - Balanced performance and compatibility approach ## Installation and Setup Process Follow these steps to properly set up and deploy your VisioForge-powered Android application: 1. Create a new Android project in your preferred IDE (Visual Studio or JetBrains Rider recommended). 2. Add the required NuGet packages to your project (detailed in the next section). 3. Configure necessary permissions in your AndroidManifest.xml file. 4. Implement your application logic using the VisioForge SDK components. 5. Build, sign, and deploy your application to test devices. ### NuGet Package Management The VisioForge SDK for Android is distributed through NuGet packages. 
Add the following packages to your Android project:

- [VisioForge.CrossPlatform.Core.Android](https://www.nuget.org/packages/VisioForge.CrossPlatform.Core.Android) - Contains the redistribution components required for Android applications, including unmanaged libraries.

You can add these packages using the NuGet Package Manager in your IDE or by adding the following to your project file:

```xml
<ItemGroup>
  <!-- Replace the wildcard with the latest available release version -->
  <PackageReference Include="VisioForge.CrossPlatform.Core.Android" Version="*" />
</ItemGroup>
```

Note: Replace version numbers with the latest available releases.

## Java Bindings Library Integration

Android applications using the VisioForge SDK require a custom Java Bindings Library for proper functionality. This essential step ensures proper communication between the .NET framework and Android's Java-based environment. Follow these detailed steps to integrate it:

1. Clone the binding library repository from our [GitHub page](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/AndroidDependency)
2. Based on your .NET version, add one of the following projects to your solution:
   - For .NET 9: `VisioForge.Core.Android.X9.csproj`
   - For .NET 8: `VisioForge.Core.Android.X8.csproj`
3. Add a reference to the helper library in your project's .csproj file:

```xml
<ItemGroup>
  <!-- Adjust the relative path to your project structure; reference the X9 project on .NET 9 -->
  <ProjectReference Include="..\AndroidDependency\VisioForge.Core.Android.X8.csproj" />
</ItemGroup>
```

> **Note:** Make sure to adjust the relative path to match your project structure.

## Implementing VideoView in Your Application

### Adding VideoView to Your Layout

The `VideoView` control is the primary interface for displaying video content in your Android application. To integrate it into your app, follow these steps:

1. Open your Activity or Fragment layout file (typically an `.axml` or `.xml` file)
2.
Add the VideoView element as shown in the example below:

```xml
<!-- The fully qualified element name is illustrative; check the SDK demo projects
     for the exact name used by your SDK version -->
<VisioForge.Core.UI.Android.VideoView
    android:id="@+id/videoView"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />
```

### Initializing VideoView in Code

After adding the VideoView to your layout, you'll need to initialize it in your Activity or Fragment code:

```csharp
using VisioForge.Core.UI.Android;

namespace YourApp
{
    [Activity(Label = "VideoPlayerActivity")]
    public class VideoPlayerActivity : Activity
    {
        private VideoView _videoView;

        protected override void OnCreate(Bundle savedInstanceState)
        {
            base.OnCreate(savedInstanceState);
            SetContentView(Resource.Layout.your_layout);

            // Initialize the video view
            _videoView = FindViewById<VideoView>(Resource.Id.videoView);
        }
    }
}
```

## Performance Considerations

Use physical Android devices for testing whenever possible. Simulators may not accurately represent real-world performance, especially for hardware-accelerated video operations.

## Application Signing and Publishing

### Application Signing

For distributing your Android application, you need to sign it with a digital certificate:

1. Create a keystore file if you don't already have one:

```bash
keytool -genkey -v -keystore your-app-key.keystore -alias your-app-alias -keyalg RSA -keysize 2048 -validity 10000
```

2. Configure signing in your project. Add the following to your `android/app/build.gradle` file:

```groovy
android {
    ...
    signingConfigs {
        release {
            storeFile file("your-app-key.keystore")
            storePassword "your-store-password"
            keyAlias "your-app-alias"
            keyPassword "your-key-password"
        }
    }
    buildTypes {
        release {
            signingConfig signingConfigs.release
            ...
        }
    }
}
```

For .NET MAUI or Xamarin.Android projects, configure signing in your .csproj file:

```xml
<PropertyGroup>
  <AndroidKeyStore>True</AndroidKeyStore>
  <AndroidSigningKeyStore>your-app-key.keystore</AndroidSigningKeyStore>
  <AndroidSigningStorePass>your-store-password</AndroidSigningStorePass>
  <AndroidSigningKeyAlias>your-app-alias</AndroidSigningKeyAlias>
  <AndroidSigningKeyPass>your-key-password</AndroidSigningKeyPass>
</PropertyGroup>
```

### Publishing to Google Play Store

1. Generate an AAB (Android App Bundle) for distribution:

```bash
dotnet build -f net8.0-android -c Release /p:AndroidPackageFormat=aab
```

2. Create a developer account on the Google Play Console if you don't already have one.
3.
Create a new application on the Google Play Console. 4. Upload your AAB file to the production track. 5. Complete the store listing information. 6. Submit for review. ## Troubleshooting ### Common Issues 1. **Missing Permissions**: Ensure all required permissions are declared in the AndroidManifest.xml and requested at runtime. 2. **Architecture Compatibility**: Verify your application supports the target device's architecture (ARM/ARM64). 3. **Memory Constraints**: Monitor memory usage and implement proper resource management. 4. **Performance Issues**: Use hardware acceleration and optimize media operations for mobile devices. 5. **Java Bindings Errors**: When facing issues with Java bindings: - Confirm you're using the correct binding library version - Check for version mismatches between .NET and the binding library - Verify all dependencies are properly referenced ### Getting Help If you encounter issues with your VisioForge SDK deployment on Android, please consult: - [Online Documentation](https://www.visioforge.com/help/) - [Support Portal](https://support.visioforge.com) - [GitHub Samples](https://github.com/visioforge/.Net-SDK-s-samples) ## Conclusion Implementing and deploying VisioForge SDK applications to Android devices requires careful attention to platform-specific considerations. By following the guidelines in this document, you can ensure a smooth development and deployment process and deliver high-quality video applications to your Android users. Remember to test thoroughly on target devices, especially for performance-intensive operations like video capture and processing. With proper implementation, the VisioForge SDK enables powerful media applications across the Android ecosystem. ---END OF PAGE--- # Local File: .\dotnet\deployment-x\computer-vision.md --- title: Computer Vision Implementation for Developers description: Learn how to implement and integrate powerful computer vision capabilities in your applications across multiple platforms. 
This guide covers deployment requirements, package installation, and platform-specific configurations for Windows, Linux, and macOS environments. sidebar_label: Computer Vision Deployment --- # Computer Vision Implementation Guide [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net), [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net), [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) ## Overview of Available Packages Our SDK provides two powerful NuGet packages that deliver robust computer vision capabilities for your applications: 1. **VisioForge CV Package**: Designed specifically for Windows environments 2. **VisioForge CVD Package**: Cross-platform solution that works across multiple operating systems These packages provide a comprehensive API for integrating computer vision features directly into your .NET applications. 
## Deployment Requirements

### Windows-Specific CV Package

#### Installation Process

The Windows-specific CV package is designed for seamless integration:

- Simply install the NuGet package through your preferred package manager
- No additional deployment steps are necessary
- Ready to use immediately after installation

### Cross-Platform CVD Package

Our cross-platform CVD package requires specific configurations based on your operating system:

#### Windows Environment Setup

When deploying on Windows systems:

- Install the NuGet package through Visual Studio or the .NET CLI
- No additional dependencies or configurations are required
- Works out of the box with standard Windows installations

#### Ubuntu Linux Configuration

For Ubuntu Linux systems, install the following dependencies:

```bash
sudo apt-get install libgdiplus libopenblas-dev libx11-6
```

These packages provide essential functionality:

- `libgdiplus`: Provides System.Drawing compatibility
- `libopenblas-dev`: Optimizes matrix operations for computer vision algorithms
- `libx11-6`: Provides the X Window System protocol client library

#### macOS Setup Instructions

For macOS environments, use Homebrew to install the required dependencies:

```bash
brew install --cask xquartz
brew install mono-libgdiplus
```

These components enable:

- XQuartz: Provides X11 functionality on macOS
- mono-libgdiplus: Ensures compatibility with System.Drawing

## Additional Resources

For implementation examples and technical guidance:

- Visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples) for extensive code samples
- Explore practical implementations across various use cases
- Access community-contributed examples and solutions

---

Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples.
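As a convenience (this script is not part of the SDK), a quick check that the Ubuntu dependencies listed above are installed might look like:

```shell
#!/bin/sh
# Report which of the native dependencies for the CVD package are installed (Debian/Ubuntu).
for pkg in libgdiplus libopenblas-dev libx11-6; do
  if dpkg -s "$pkg" >/dev/null 2>&1; then
    echo "ok: $pkg"
  else
    echo "missing: $pkg (install with: sudo apt-get install $pkg)"
  fi
done
```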
---END OF PAGE--- # Local File: .\dotnet\deployment-x\index.md --- title: Cross-Platform .NET SDK Deployment Guide description: Learn how to deploy .NET applications across Windows, macOS, iOS, Android, and Linux. Step-by-step instructions for handling native libraries, platform dependencies, and UI framework integration for multimedia applications. sidebar_label: Deployment order: 17 --- # Cross-Platform Deployment Guide for VisioForge .NET SDK [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Introduction to VisioForge SDK Deployment The VisioForge SDK suite provides powerful multimedia capabilities for .NET applications, supporting video capture, editing, playback, and advanced media processing across multiple platforms. Proper deployment is critical to ensure your applications function correctly and leverage the full potential of these SDKs. This comprehensive guide outlines the deployment process for applications built with VisioForge's cross-platform .NET SDKs, helping you navigate the specific requirements of each supported operating system. ## Deployment Overview Deploying applications built with VisioForge SDKs requires careful consideration of platform-specific dependencies and configurations. 
The deployment process varies significantly depending on your target platform due to differences in: - Native library requirements - Media framework dependencies - Hardware access mechanisms - Package distribution methods ### Key Deployment Considerations Before beginning the deployment process, consider these critical factors: 1. **Target Platform Architecture**: Ensure you select the appropriate architecture (x86, x64, ARM64) for your deployment platform 2. **Required Dependencies**: Some platforms require additional libraries that aren't included in NuGet packages 3. **Framework Compatibility**: Verify compatibility between your .NET version and the target operating system 4. **Native Library Integration**: Understand how native libraries are integrated and loaded on each platform 5. **UI Framework Selection**: Choose the appropriate UI integration package for your selected framework ## Platform-Specific Deployment ### Windows Deployment Windows deployment is the most straightforward, with comprehensive NuGet package support covering all dependencies: - **Package Distribution**: All components available via NuGet - **Architecture Support**: Both x86 and x64 architectures fully supported - **Native Libraries**: Automatically deployed alongside your application - **UI Framework Options**: Windows Forms, WPF, WinUI, Avalonia, and MAUI supported For detailed Windows deployment instructions, see the [Windows deployment guide](Windows.md). 
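Picking the right architecture, as point 1 above requires, usually means mapping the host's `uname` output to a .NET runtime identifier (RID) for `dotnet publish -r`. A small sketch, assuming the standard RID names (`win-x64`, `linux-x64`, `osx-arm64`, etc.):

```shell
#!/bin/sh
# Sketch: map OS/CPU strings (as reported by uname) to a .NET RID.
dotnet_rid() {
  os="$1"; arch="$2"   # e.g. "$(uname -s)" "$(uname -m)"
  case "$os" in
    Linux)  plat=linux ;;
    Darwin) plat=osx ;;
    *)      plat=win ;;   # MSYS/Cygwin uname strings fall through here
  esac
  case "$arch" in
    x86_64|amd64)  cpu=x64 ;;
    arm64|aarch64) cpu=arm64 ;;
    *)             cpu=x86 ;;
  esac
  echo "${plat}-${cpu}"
}

dotnet_rid "$(uname -s)" "$(uname -m)"
```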
### Android Deployment Android deployment requires specific configuration for native library extraction and permissions: - **Package Distribution**: Core components available via NuGet - **Architecture Support**: ARM64, ARMv7, and x86_64 architectures supported - **Native Libraries**: Requires proper configuration for extraction to the correct location - **Permissions**: Camera, microphone, and storage permissions must be explicitly requested - **UI Integration**: Android-specific video view controls required Android applications use a single native library that must be correctly deployed. Review the [Android deployment guide](Android.md) for complete instructions. ### macOS Deployment macOS deployment requires additional GStreamer library installation: - **Package Distribution**: Core components available via NuGet, GStreamer requires manual installation - **Architecture Support**: Intel (x64) and Apple Silicon (ARM64) architectures supported - **Native Libraries**: Multiple unmanaged libraries required - **Framework Options**: Native macOS, MAUI, and Avalonia supported - **Bundle Integration**: Special attention needed for proper app bundle structure macOS deployments may require specific entitlements and permissions configurations. See the [macOS deployment guide](macOS.md) for detailed instructions. ### iOS Deployment iOS deployment involves unique challenges related to Apple's platform restrictions: - **Package Distribution**: Core components available via NuGet - **Architecture Support**: ARM64 architecture supported - **App Store Guidelines**: Special considerations for App Store submissions - **Native Libraries**: Single unmanaged binary library to deploy - **UI Integration**: iOS-specific video view controls required iOS applications require proper provisioning profiles and entitlements. The [iOS deployment guide](iOS.md) provides comprehensive instructions. 
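On Android, the camera, microphone, and storage permissions called out above are declared in `AndroidManifest.xml` (and the dangerous ones must also be requested at runtime on API 23+). A minimal sketch using the standard Android permission identifiers:

```xml
<!-- Minimal sketch: permissions a capture app typically declares.
     CAMERA and RECORD_AUDIO must additionally be requested at runtime. -->
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
```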
### Ubuntu/Linux Deployment Linux deployment requires manual installation of GStreamer dependencies: - **Package Distribution**: Core components available via NuGet, GStreamer requires system packages - **Architecture Support**: x64 architecture primarily supported - **System Dependencies**: Required packages must be installed on the target system - **Distribution Considerations**: Different Linux distributions may require different dependency packages - **UI Options**: Primarily Avalonia UI framework supported Linux deployment often involves distribution-specific package management. The [Ubuntu deployment guide](Ubuntu.md) provides instructions for Ubuntu-based distributions. ### Runtime Requirements Target devices must meet these minimum requirements: - **Windows**: Windows 7 or later (x86 or x64) - **macOS**: macOS 10.15 (Catalina) or later (x64 or ARM64) - **iOS**: iOS 14.0 or later (ARM64) - **Android**: Android 7.0 (API level 24) or later - **Linux**: Ubuntu 20.04 LTS or later (x64 or ARM64) ## Common Deployment Challenges ### Native Library Loading Issues One of the most common deployment problems involves native library loading failures: - **Symptoms**: Runtime exceptions mentioning DllNotFoundException or similar - **Causes**: Incorrect architecture, missing dependencies, or improper extraction - **Solutions**: Verify package references, check deployment configuration, ensure libraries are in the correct location ### Permission and Security Constraints Modern operating systems enforce strict security policies: - **Camera Access**: Requires explicit permission on all mobile platforms - **Storage Access**: File system restrictions vary by platform - **Network Usage**: May require specific entitlements or manifest entries - **Background Operation**: Platform-specific rules for background media processing ### Performance Considerations Media processing can be resource-intensive: - **CPU Usage**: Implement appropriate threading to avoid UI freezing - **Memory 
Management**: Monitor and optimize memory usage for large media files - **Power Consumption**: Balance quality settings with battery life considerations ## Deployment Checklist Use this checklist to ensure a successful deployment: - ✅ Correct NuGet packages selected for target platform and architecture - ✅ Platform-specific dependencies installed and configured - ✅ SDK properly initialized and cleaned up - ✅ Appropriate video view controls integrated - ✅ Required permissions requested and justified - ✅ Application tested on target platform under realistic conditions - ✅ Performance metrics validated for acceptable user experience - ✅ Error handling implemented for graceful recovery ## Computer Vision Deployment Computer Vision SDK is a separate NuGet package. Check the [Computer Vision deployment guide](computer-vision.md) for more information. ## Additional Resources - [VisioForge GitHub Repository](https://github.com/visioforge/.Net-SDK-s-samples) - Code samples and example projects - [API Documentation](https://api.visioforge.org/dotnet/) - Comprehensive API reference - [Support Portal](https://support.visioforge.com/) - Technical support and knowledge base ---END OF PAGE--- # Local File: .\dotnet\deployment-x\iOS.md --- title: iOS Cross-Platform .NET App Deployment Guide description: Step-by-step guide for .NET developers on deploying cross-platform applications to iOS devices. Learn about required permissions, SDK integration, architecture support, and best practices for successful iOS app deployment. 
sidebar_label: iOS --- # Apple iOS Deployment Guide [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Overview This comprehensive guide walks you through the process of deploying VisioForge SDK-powered applications to Apple iOS devices. The VisioForge SDK provides a powerful framework for building media-rich applications on iOS, offering robust support for video capture, editing, playback, and processing capabilities. The iOS deployment process involves several key considerations, from package management to permission handling and performance optimization. This document will guide you through each step to ensure a smooth deployment experience. 
## System Requirements Before beginning your iOS deployment process, ensure your development environment meets the following requirements: ### Hardware Requirements - Apple Mac computer for development (required for iOS app signing) - iOS device for testing (strongly recommended over simulators) - Sufficient storage space for development tools and application assets ### Software Requirements - Apple iOS device running iOS 12 or later (latest version recommended) - Xcode 12 or later with iOS SDK installed - Apple Developer account (required for app signing and distribution) - Visual Studio for Mac, JetBrains Rider, or Visual Studio Code - .Net 7.0 SDK or later (we recommend the latest stable version) ## Architecture Support The VisioForge SDK for iOS provides native support for both major iOS device architectures: ### ARM64 Support - Compatible with all modern iOS devices (iPhone X and newer) - Optimized native libraries for maximum performance - Hardware-accelerated video processing where supported by the device ## Installation Process Follow these steps to properly set up and deploy your VisioForge-powered iOS application: 1. Install the .Net SDK for iOS development 2. Create a new iOS project in your preferred IDE (Visual Studio for Mac or JetBrains Rider recommended) 3. Add the required NuGet packages to your project (detailed in the next section) 4. Configure the necessary permissions and entitlements in your app's Info.plist file 5. Implement your application logic using the VisioForge SDK components 6. Build, sign, and deploy your application to test devices ## NuGet Packages The VisioForge SDK for iOS is distributed through NuGet packages: ### Core Packages - [VisioForge.Core](https://www.nuget.org/packages/VisioForge.DotNet.Core) - Core package containing core classes and UI controls, including video playback and display components. This is platform-independent and can be used in any .Net project. 
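The installation steps above can be sketched as CLI commands. This is an illustrative sequence, not an official script: the `ios` workload and project template ship with the .NET SDK on macOS, "MyCaptureApp" is a placeholder name, and the package IDs are the ones listed in the NuGet section. It is wrapped in a function so it can be reviewed before running on a real machine:

```shell
#!/bin/sh
# Sketch of the installation steps as commands (run on a Mac with the .NET SDK).
setup_ios_project() {
  dotnet workload install ios                        # step 1: iOS support
  dotnet new ios -n MyCaptureApp && cd MyCaptureApp  # step 2: new project
  dotnet add package VisioForge.DotNet.Core          # step 3: SDK packages
  dotnet add package VisioForge.CrossPlatform.Core.iOS
  # steps 4-6: edit Info.plist permissions, implement your app logic,
  # then build, sign, and deploy from your IDE or the dotnet CLI.
}
```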
### UI Packages

Each UI package provides the same VideoView controls with platform-specific implementations:

#### .Net iOS target platform

- [VisioForge.Core](https://www.nuget.org/packages/VisioForge.DotNet.Core) - Contains UI controls and all core classes for the iOS platform.

#### .Net MAUI target platform

- [VisioForge.Core.UI.MAUI](https://www.nuget.org/packages/VisioForge.DotNet.Core.UI.MAUI) - Contains UI controls for the MAUI platform.

### Redist Packages

- [VisioForge.CrossPlatform.Core.iOS](https://www.nuget.org/packages/VisioForge.CrossPlatform.Core.iOS) - Contains the core redistribution components required for any iOS application using VisioForge technologies.

You can add these packages using the NuGet Package Manager in your IDE or by adding the following to your project file (use the latest versions):

```xml
<ItemGroup>
  <PackageReference Include="VisioForge.DotNet.Core" Version="x.x.x" />
  <PackageReference Include="VisioForge.CrossPlatform.Core.iOS" Version="x.x.x" />
</ItemGroup>
```

Note: Replace the `x.x.x` version placeholders with the latest available releases.

## Required Permissions and Entitlements

iOS applications require explicit permissions for accessing device features like cameras, microphones, and the photo library.
Configure these permissions in your app's Info.plist file:

### Camera Access

Required for video capture functionality:

```xml
<key>NSCameraUsageDescription</key>
<string>This app requires camera access for video recording</string>
```

### Microphone Access

Required for audio recording:

```xml
<key>NSMicrophoneUsageDescription</key>
<string>This app requires microphone access for audio recording</string>
```

### Photo Library Access

Required for saving videos to the device's photo library:

```xml
<key>NSPhotoLibraryUsageDescription</key>
<string>This app requires access to the photo library to save videos</string>
```

### Example Info.plist Configuration

Here's a complete example of an Info.plist file with all necessary permissions:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>LSRequiresIPhoneOS</key>
    <true/>
    <key>UIDeviceFamily</key>
    <array>
        <integer>1</integer>
        <integer>2</integer>
    </array>
    <key>UIRequiredDeviceCapabilities</key>
    <array>
        <string>arm64</string>
    </array>
    <key>UISupportedInterfaceOrientations</key>
    <array>
        <string>UIInterfaceOrientationPortrait</string>
        <string>UIInterfaceOrientationLandscapeLeft</string>
        <string>UIInterfaceOrientationLandscapeRight</string>
    </array>
    <key>UISupportedInterfaceOrientations~ipad</key>
    <array>
        <string>UIInterfaceOrientationPortrait</string>
        <string>UIInterfaceOrientationPortraitUpsideDown</string>
        <string>UIInterfaceOrientationLandscapeLeft</string>
        <string>UIInterfaceOrientationLandscapeRight</string>
    </array>
    <key>XSAppIconAssets</key>
    <string>Assets.xcassets/appicon.appiconset</string>
    <key>NSCameraUsageDescription</key>
    <string>Camera access is required for video recording</string>
    <key>NSMicrophoneUsageDescription</key>
    <string>Microphone access is required for audio recording</string>
    <key>NSPhotoLibraryUsageDescription</key>
    <string>Photo library access is required to save videos</string>
</dict>
</plist>
```

## Runtime Permission Handling

In addition to declaring permissions in your Info.plist file, you should also request permissions at runtime.
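Before relying on runtime prompts, it can help to verify that the declared usage-description keys actually made it into the built Info.plist. A minimal sketch using plain `grep` (our own helper; on macOS, `plutil -lint` gives a stricter structural check):

```shell
#!/bin/sh
# Sketch: fail fast if an Info.plist is missing required usage-description keys.
check_usage_keys() {
  plist="$1"; shift
  for key in "$@"; do
    grep -q "<key>$key</key>" "$plist" || { echo "missing: $key"; return 1; }
  done
  echo "all usage keys present"
}
```

Usage: `check_usage_keys Info.plist NSCameraUsageDescription NSMicrophoneUsageDescription`.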
Here's an example of how to request camera and microphone permissions:

```csharp
using System.Diagnostics;
using Microsoft.Maui.ApplicationModel; // Permissions API (MAUI projects)
using Photos;

// Request camera permission
private async Task RequestCameraPermissionAsync()
{
    var status = await Permissions.RequestAsync<Permissions.Camera>();
    if (status != PermissionStatus.Granted)
    {
        // Handle permission denial
        Debug.WriteLine("Camera permission denied");
    }
}

// Request microphone permission
private async Task RequestMicrophonePermissionAsync()
{
    var status = await Permissions.RequestAsync<Permissions.Microphone>();
    if (status != PermissionStatus.Granted)
    {
        // Handle permission denial
        Debug.WriteLine("Microphone permission denied");
    }
}

// Request photo library permission (iOS specific)
private void RequestPhotoLibraryPermission()
{
    PHPhotoLibrary.RequestAuthorization(status =>
    {
        if (status == PHAuthorizationStatus.Authorized)
        {
            Debug.WriteLine("Photo library access granted");
        }
        else
        {
            Debug.WriteLine("Photo library access denied");
        }
    });
}
```

## SDK Initialization

Properly initialize the VisioForge SDK in your application's lifecycle:

```csharp
// In your AppDelegate or application startup code
public override bool FinishedLaunching(UIApplication app, NSDictionary options)
{
    // Initialize the VisioForge SDK
    VisioForge.Core.VisioForgeX.InitSDK();

    // Your other initialization code

    return true;
}

// Clean up on application termination
public override void WillTerminate(UIApplication application)
{
    // Clean up VisioForge SDK resources
    VisioForge.Core.VisioForgeX.DestroySDK();

    // Your other cleanup code
}
```

## Implementation Best Practices

### Using VideoView Controls

The VisioForge SDK provides a `VideoView` control for displaying video content.
The VideoView is a UIView subclass, and OpenGL is used for video rendering: ```csharp // Create a VideoView instance var videoView = new VisioForge.Core.UI.Apple.VideoView(new CGRect(0, 0, UIScreen.MainScreen.Bounds.Width, UIScreen.MainScreen.Bounds.Height)); View.AddSubview(videoView); // Get the IVideoView interface for use with VisioForge components IVideoView vv = videoView.GetVideoView(); // Use the IVideoView with a VisioForge component var captureCore = new VideoCaptureCoreX(vv); ``` You can add the VideoView using a storyboard or code. ### Resource Management iOS devices have limited resources compared to desktop computers. Follow these best practices: 1. Release resources when not in use 2. Use lower resolution settings for real-time processing 3. Implement proper lifecycle management in your ViewControllers 4. Test on actual devices, not just simulators ## Testing and Debugging ### Physical Device Testing While the iOS simulator can be useful for basic interface testing, it has significant limitations for media applications: - Simulator may have performance issues during video encoding at high resolutions - Camera and microphone are not available in the simulator - Hardware acceleration features may not be available or may behave differently **Always test your media application on physical iOS devices before release.** ### Common Performance Considerations When deploying media applications to iOS, consider these performance factors: 1. **Resolution and frame rate:** Lower these settings for better performance on older devices 2. **Encoder selection:** Use hardware-accelerated encoders when available 3. **Memory management:** Implement proper disposal of large objects and monitor memory usage 4. **Battery impact:** Media processing is power-intensive; implement power-saving measures ## Troubleshooting Common Issues ### Permission Denials If your app can't access the camera or microphone: 1. Verify all required permissions are in your Info.plist 2. 
Check that you're requesting permissions at runtime before attempting to use the hardware 3. Test if the user has manually denied permissions in iOS Settings ### Library Loading Errors If you encounter errors loading native libraries: 1. Verify all required NuGet packages are properly installed 2. Check for conflicting package versions 3. Ensure you're targeting the correct iOS architecture (ARM64) ## Additional Resources - Visit the [VisioForge GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples) for code samples and example projects - Browse the [VisioForge API documentation](https://api.visioforge.org/dotnet/api/index.html) for comprehensive SDK reference --- By following this deployment guide, you should be able to successfully create, configure, and deploy VisioForge-powered applications to iOS devices. For specific questions or advanced configuration needs, please contact VisioForge technical support. ---END OF PAGE--- # Local File: .\dotnet\deployment-x\macOS.md --- title: Cross-platform .NET Development Guide for macOS description: Step-by-step guide for developers on deploying .NET SDKs in macOS environments. Covers native app development, architecture support, package deployment, and troubleshooting for both Intel and Apple Silicon platforms. sidebar_label: macOS --- # Apple macOS Deployment Guide [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Introduction VisioForge's powerful .NET SDKs provide comprehensive media processing capabilities for macOS developers. 
Whether you're building video capture applications, media players, video editors, or complex media processing pipelines, our SDKs offer the tools you need to deliver high-quality solutions on Apple's platforms. The VisioForge SDK provides comprehensive support for macOS application development using .NET technologies. You can leverage this SDK to build robust media processing applications that run natively on macOS, including support for both Intel (x64) and Apple Silicon (ARM64) architectures. This guide covers everything you need to know to set up, configure, and deploy applications for macOS and MacCatalyst environments using the VisioForge SDK. Whether you're building traditional macOS applications or cross-platform solutions using frameworks like MAUI or Avalonia, this document will help you navigate the installation and deployment process. ## System Requirements Before starting the installation and deployment process, ensure your development environment meets the following requirements: ### Hardware Requirements - Mac computer with Intel processor (x64) or Apple Silicon (ARM64) - Minimum 8GB RAM (16GB recommended for video processing) - Sufficient disk space for development tools and application assets ### Software Requirements - macOS 10.15 (Catalina) or later (latest version recommended) - macOS Monterey (12.x) - macOS Ventura (13.x) - macOS Sonoma (14.x) - Future macOS releases (with ongoing updates) - Xcode 12 or later with Command Line Tools installed - .NET 6.0 SDK or later - Visual Studio for Mac or JetBrains Rider (recommended IDEs) To install XCode Command Line Tools, run the following in Terminal: ```bash xcode-select --install ``` ## Architecture Support The VisioForge SDK for macOS supports both major processor architectures: ### Intel (x64) Support - Compatible with all Intel-based Mac computers - Uses native x64 libraries for optimal performance - Full feature support across all SDK components ### Apple Silicon (ARM64) Support - Native support 
for M1, M2, and newer Apple Silicon chips
- Optimized ARM64 native libraries for maximum performance
- Hardware acceleration leveraging Apple's Neural Engine where applicable

### Universal Binary Considerations

When targeting both architectures, consider building universal binaries that include both x64 and ARM64 code. This approach ensures your application runs natively on either platform without relying on Rosetta 2 translation. For universal binary builds targeting both Intel and Apple Silicon:

```xml
<PropertyGroup>
  <RuntimeIdentifiers>osx-x64;osx-arm64</RuntimeIdentifiers>
</PropertyGroup>
```

## Core Technologies

VisioForge .NET SDKs leverage several key technologies to deliver high-performance media capabilities on macOS:

### GStreamer Integration

All VisioForge SDKs utilize GStreamer as the underlying framework for video/audio playback and encoding. GStreamer provides:

- Hardware-accelerated media processing
- Broad format compatibility
- Optimized playback pipeline
- Efficient encoding capabilities

The GStreamer components are automatically installed through our redistributable packages, eliminating the need for manual configuration.

## Installation and NuGet Package Deployment

The primary method for deploying VisioForge SDK components to macOS applications is through NuGet packages. These packages include all necessary managed and unmanaged libraries required for your application.

### Essential NuGet Packages

For native macOS applications, add these core packages:

1. **Main SDK Package** (based on your needs):
   - `VisioForge.DotNet.VideoCapture` for camera capture applications
   - `VisioForge.DotNet.VideoEdit` for video editing applications
   - `VisioForge.DotNet.MediaPlayer` for media playback applications
   - `VisioForge.DotNet.MediaBlocks` for advanced media processing pipelines

2. **UI Package**:
   - `VisioForge.DotNet.Core` includes Apple-specific UI controls

3.
**Platform Redistributable**:
   - `VisioForge.CrossPlatform.Core.macOS` for native libraries and dependencies

### macOS Applications

For standard macOS applications targeting the `netX.0-macos` framework (where X represents the .NET version), use the following NuGet package:

- [VisioForge.CrossPlatform.Core.macOS](https://www.nuget.org/packages/VisioForge.CrossPlatform.Core.macOS)

This package contains:

- Native libraries for media processing
- GStreamer components for media playback and encoding
- Interface assemblies for .NET integration
- Both x64 and ARM64 binaries

### Getting Started with Native macOS Projects

To begin developing native macOS applications with VisioForge SDKs:

1. **Create a new macOS project** in your preferred IDE (Visual Studio for Mac or JetBrains Rider)
2. **Add required NuGet packages** (as detailed above)
3. **Configure project settings** for your target architecture

## MacCatalyst and MAUI Applications

### Cross-Platform Development with .NET MAUI

.NET Multi-platform App UI (MAUI) enables developing applications that run seamlessly across macOS, iOS, Android, and Windows from a single codebase. VisioForge provides comprehensive support for MAUI development through specialized packages and controls.

For MacCatalyst applications (including MAUI projects) targeting the `netX.0-maccatalyst` framework, use:

- [VisioForge.CrossPlatform.Core.macCatalyst](https://www.nuget.org/packages/VisioForge.CrossPlatform.Core.macCatalyst)

### MAUI Package Configuration

For MAUI projects targeting macOS (through MacCatalyst), add these packages:

```xml
<ItemGroup>
  <PackageReference Include="VisioForge.DotNet.Core" Version="x.x.x" />
  <PackageReference Include="VisioForge.DotNet.Core.UI.MAUI" Version="x.x.x" />
  <PackageReference Include="VisioForge.CrossPlatform.Core.macCatalyst" Version="x.x.x" />
</ItemGroup>
```

Replace the `x.x.x` placeholders with the latest released versions.

### MAUI Project Setup

1. **Initialize SDK in MauiProgram.cs**:

```csharp
builder
    .UseMauiApp<App>()
    .UseSkiaSharp()
    .ConfigureMauiHandlers(handlers => handlers.AddVisioForgeHandlers());
```

2.
**Add VideoView Control in XAML**:

```xml
<ContentPage xmlns:vf="clr-namespace:VisioForge.Core.UI.MAUI;assembly=VisioForge.Core.UI.MAUI">
    <vf:VideoView x:Name="videoView" />
</ContentPage>
```

MacCatalyst applications require additional configuration to ensure native libraries are properly included in the application bundle. Add the following custom build target to your project file:

```xml
<Target Name="CopyNativeLibrariesToMonoBundle" AfterTargets="Build">
  <PropertyGroup>
    <AppBundleDir>$(OutputPath)$(AssemblyName).app</AppBundleDir>
    <MonoBundleDir>$(AppBundleDir)/Contents/MonoBundle</MonoBundleDir>
  </PropertyGroup>
  <MakeDir Directories="$(MonoBundleDir)" />
  <ItemGroup>
    <NativeLibraries Include="$(OutputPath)**/*.dylib;$(OutputPath)**/*.so" />
  </ItemGroup>
  <Copy SourceFiles="@(NativeLibraries)" DestinationFolder="$(MonoBundleDir)" />
  <Message Text="Copied native libraries to $(MonoBundleDir)" Importance="high" />
</Target>
```

This target performs several crucial tasks:

1. Identifies the application bundle directory
2. Creates the MonoBundle directory if it doesn't exist
3. Copies all `.dylib` and `.so` native libraries to the MonoBundle directory
4. Outputs diagnostic information for troubleshooting

For complete MAUI integration details, see our dedicated [MAUI](../install/maui.md) documentation page.

## UI Framework Options

The VisioForge SDK supports multiple UI frameworks for macOS development:

### Native macOS UI

For traditional macOS applications, the SDK provides `VideoViewGL` controls that integrate with the native AppKit framework. These controls provide high-performance video rendering using OpenGL.

### MAUI

For cross-platform MAUI applications, use the [VisioForge.DotNet.Core.UI.MAUI](https://www.nuget.org/packages/VisioForge.DotNet.Core.UI.MAUI) package, which provides MAUI-compatible video views.

### Avalonia

For Avalonia UI applications, the [VisioForge.DotNet.Core.UI.Avalonia](https://www.nuget.org/packages/VisioForge.DotNet.Core.UI.Avalonia) package offers Avalonia-compatible video controls.

## Development Environment Setup

### JetBrains Rider Integration

JetBrains Rider provides an excellent development experience for macOS and iOS applications using VisioForge SDKs:

1. Create a new project in Rider targeting macOS or iOS
2. Add the required NuGet packages through the Package Manager
3. Configure project settings for your target platform
4.
Add UI controls and implement SDK functionality For detailed Rider setup instructions, see our [Rider integration guide](../install/rider.md). ### Visual Studio for Mac Setup Despite its deprecation, Visual Studio for Mac still works for developing macOS and iOS applications with VisioForge SDKs: 1. Create a new project in Visual Studio for Mac 2. Add NuGet packages through the NuGet Package Manager 3. Configure necessary build settings 4. Add UI controls to your application's interface For detailed Visual Studio for Mac instructions, see our [Visual Studio for Mac guide](../install/visual-studio-mac.md). ## SDK Initialization and Cleanup X-engines in the VisioForge SDK require explicit initialization and cleanup to manage resources properly: ```csharp // Initialize SDK at application startup VisioForge.Core.VisioForgeX.InitSDK(); // Use SDK components... // Clean up resources before application exit VisioForge.Core.VisioForgeX.DestroySDK(); ``` For asynchronous initialization and cleanup, use the async variants: ```csharp // Async initialization await VisioForge.Core.VisioForgeX.InitSDKAsync(); // Async cleanup await VisioForge.Core.VisioForgeX.DestroySDKAsync(); ``` ## Troubleshooting Common Issues ### Native Library Loading Failures If your application fails to load native libraries: 1. Verify all required NuGet packages are properly installed 2. Check the application bundle structure to ensure libraries are in the correct location 3. Use the `dtruss` or `otool` commands to diagnose library loading issues 4. Ensure XCode Command Line Tools are properly installed ### MacCatalyst-Specific Issues For MacCatalyst deployment problems: 1. Verify the CopyNativeLibrariesToMonoBundle target is correctly implemented 2. Check that the MonoBundle directory contains all necessary native libraries 3. Ensure the application has appropriate entitlements for media access ### Performance Optimization For optimal performance: 1. Enable hardware acceleration when available 2. 
Adjust video resolution based on device capabilities 3. Close and dispose of SDK objects when no longer needed ## Additional Resources For code samples, example projects, and more technical resources: - Visit the [VisioForge GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples) for code samples - Join the VisioForge developer community for support and discussions Our samples repository contains comprehensive examples showing: - Video capture from cameras - Media playback implementations - Video editing workflows - Advanced media processing pipelines ## Conclusion VisioForge .NET SDKs provide powerful media capabilities for macOS and iOS developers, enabling the creation of sophisticated multimedia applications. By following this installation and deployment guide, you've established the foundation for building high-performance media applications across Apple's platforms. For any additional questions or support needs, please contact our technical support team or visit our forums for community assistance. --- *This documentation is regularly updated to reflect the latest SDK features and compatibility information.* ---END OF PAGE--- # Local File: .\dotnet\deployment-x\Ubuntu.md --- title: .NET Cross-Platform Deployment Guide for Ubuntu description: Step-by-step guide for deploying .NET multimedia applications on Ubuntu Linux. Learn how to set up dependencies, configure hardware, and optimize performance for cross-platform development. Includes GStreamer setup and troubleshooting tips. 
sidebar_label: Ubuntu --- # Ubuntu Deployment Guide for VisioForge SDK Applications [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Introduction Deploying .NET applications with VisioForge SDKs on Ubuntu Linux offers multiple benefits, including cross-platform compatibility, access to Linux-specific hardware, and the ability to run your multimedia applications on environments ranging from server infrastructure to edge devices. This comprehensive guide will walk you through the complete process of configuring your Ubuntu environment, installing the necessary dependencies, and deploying your VisioForge-powered .NET application. The VisioForge SDK family works on Ubuntu and other Linux distributions that support `GStreamer` libraries. Additional supported platforms include `Nvidia Jetson` devices and `Raspberry Pi`, making it perfect for a wide range of applications from desktop multimedia software to IoT solutions. ## System Requirements Before deploying your application, ensure your Ubuntu environment meets these minimum requirements: - Ubuntu 20.04 LTS or later (22.04 LTS and later recommended) - .NET 7.0 or later runtime - At least 4GB RAM (8GB recommended for video processing) - x86_64 or ARM64 architecture - Internet connection for package installation ## Installation and Setup ### Installing .NET Download the latest [.NET installer](https://dotnet.microsoft.com/download/dotnet) package from the Microsoft website and follow the installation instructions. 
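After installing .NET, it is worth confirming the runtime meets this guide's minimum (.NET 7.0) before deploying. A small sketch of our own that parses the version string, so it also works in CI images:

```shell
#!/bin/sh
# Sketch: check that a .NET version string satisfies the .NET 7.0 minimum.
dotnet_major() { echo "${1%%.*}"; }

check_dotnet() {
  ver="$1"   # e.g. "$(dotnet --version)"
  if [ "$(dotnet_major "$ver")" -ge 7 ]; then
    echo "ok: .NET $ver"
  else
    echo "too old: $ver"
  fi
}

check_dotnet 8.0.100
```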
## GStreamer Installation GStreamer forms the multimedia backbone for VisioForge SDKs on Linux platforms. It provides essential functionality for audio and video capture, processing, and playback. ### Required GStreamer Packages Install the following GStreamer packages using apt. We require v1.22.0 or later, though v1.24.0+ is highly recommended for access to the latest features and optimizations: - `gstreamer1.0-plugins-base`: Essential baseline plugins - `gstreamer1.0-plugins-good`: High-quality, well-tested plugins - `gstreamer1.0-plugins-bad`: Newer plugins of varying quality - `gstreamer1.0-alsa`: ALSA audio support - `gstreamer1.0-gl`: OpenGL rendering support - `gstreamer1.0-pulseaudio`: PulseAudio integration - `libges-1.0-0`: GStreamer Editing Services - `gstreamer1.0-libav`: FFMPEG integration (OPTIONAL but recommended for broader format support) ### Complete Installation Script The following commands will update your package repositories and install all required GStreamer components: ```bash sudo apt update ``` ```bash sudo apt install gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-alsa gstreamer1.0-gl gstreamer1.0-pulseaudio gstreamer1.0-libav libges-1.0-0 ``` ### Raspberry Pi Additional Requirements On Raspberry Pi devices, you additionally need to install the following package: ```bash sudo apt install gstreamer1.0-libcamera ``` ### Verifying GStreamer Installation After installation, verify your GStreamer setup by running: ```bash gst-inspect-1.0 --version ``` This should display the installed GStreamer version. Ensure it meets the minimum requirement (1.22.0+) or ideally shows 1.24.0 or later. ## Required NuGet Packages When deploying your .NET application to Ubuntu, you'll need to include additional platform-specific NuGet packages that provide the necessary native libraries and bindings.
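The Linux-specific packages covered below are referenced like any other NuGet dependency, either with `dotnet add package` or directly in the project file. A minimal project-file sketch (the floating `Version="*"` is a placeholder; pin the latest published version from nuget.org in real projects):

```xml
<!-- Illustrative .csproj fragment; Version="*" is a placeholder --
     pin the current version from nuget.org instead. -->
<ItemGroup>
  <PackageReference Include="VisioForge.CrossPlatform.Core.Linux.x64" Version="*" />
</ItemGroup>
```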
### Additional Core Linux Package The [VisioForge.CrossPlatform.Core.Linux.x64](https://www.nuget.org/packages/VisioForge.CrossPlatform.Core.Linux.x64) package contains essential native libraries and bindings for the .NET Linux platform. This package is mandatory for all VisioForge SDK deployments on Ubuntu. ### Development Environment You can use Rider to develop your project on Linux. Please check the [Rider](../install/rider.md) installation page for more information. ## Application Deployment Follow these steps to deploy your application on Ubuntu: ### Publishing Your Application To create a self-contained deployment that includes all .NET runtime dependencies: ```bash dotnet publish -c Release -r linux-x64 --self-contained true ``` For smaller deployments where the target machine already has .NET installed: ```bash dotnet publish -c Release -r linux-x64 --self-contained false ``` ### Deployment Structure Your deployment folder should contain: - Your application executable - Application DLLs - VisioForge SDK assemblies - Native Linux libraries from the VisioForge NuGet packages ### Setting Execution Permissions Ensure your application executable has the proper permissions: ```bash chmod +x ./YourApplicationName ``` ## Hardware Considerations ### Camera Support Ubuntu supports various camera types: - **USB Webcams**: Most USB webcams work out of the box - **IP Cameras**: Supported via RTSP, HTTP streams - **Professional Cameras**: Many professional cameras with Linux drivers are supported - **Virtual Devices**: v4l2loopback can be used for virtual camera creation To list available camera devices: ```bash v4l2-ctl --list-devices ``` ### Audio Devices Audio capture and playback are supported through: - ALSA (Advanced Linux Sound Architecture) - PulseAudio To list available audio devices: ```bash arecord -L # For recording devices aplay -L # For playback devices ``` ## Troubleshooting ### Permission Issues Camera or audio device access issues can often be resolved
by adding your user to the appropriate groups: ```bash sudo usermod -a -G video,audio $USER ``` Remember to log out and back in for group changes to take effect. ### Performance Optimization For optimal performance on Ubuntu: - Use the latest GStreamer version (1.24.0+) - Enable hardware acceleration where available - For NVIDIA GPUs, install the appropriate CUDA and nvcodec packages - Adjust process priority using `nice` for resource-intensive applications ## Conclusion Deploying VisioForge SDK applications on Ubuntu provides a powerful, flexible environment for multimedia applications. By following this guide, you can ensure that your .NET application leverages the full capabilities of the VisioForge SDK ecosystem on Linux platforms. For specific deployment scenarios or troubleshooting assistance, refer to the comprehensive documentation available on the VisioForge website or contact our technical support team. --- Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples. ---END OF PAGE--- # Local File: .\dotnet\deployment-x\Windows.md --- title: Cross-platform SDK .Net deployment for Windows description: Comprehensive guide for installing and deploying VisioForge SDK for .Net applications on Windows. Learn how to set up development environments, manage dependencies, and troubleshoot common issues for multimedia applications. 
sidebar_label: Windows --- # Windows Installation and Deployment Guide for VisioForge Cross-Platform SDK [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Introduction to VisioForge SDK Installation and Deployment The VisioForge SDK suite provides powerful multimedia capabilities for your .NET applications, supporting video capture, editing, playback, and advanced media processing across multiple platforms. This comprehensive guide covers both installation and deployment for Windows applications. ## Installation SDKs are accessible in two forms: a setup file and NuGet packages. The setup file provides a straightforward installation process, ensuring that all necessary components are correctly configured. On the other hand, NuGet packages offer a flexible and modular approach to incorporating SDKs into your projects, allowing for easy updates and dependency management. We highly recommend utilizing NuGet packages due to their convenience and efficiency in managing project dependencies and updates. When building your application, you have the option to create both x86 and x64 versions. This allows your application to run on a wider range of systems, accommodating different hardware architectures. However, it's important to note that the setup file is exclusively available for the x64 architecture. This means that while you can develop and compile x86 builds, the initial setup and installation process will require an x64 system. 
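To produce both x86 and x64 builds from a single project, the Windows runtime identifiers can be declared in the project file; a minimal sketch:

```xml
<!-- Illustrative .csproj fragment: declaring both Windows runtime
     identifiers lets you publish either architecture from one project. -->
<PropertyGroup>
  <RuntimeIdentifiers>win-x86;win-x64</RuntimeIdentifiers>
</PropertyGroup>
```

Each build can then be produced with `dotnet publish -r win-x64` or `dotnet publish -r win-x86`.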
### IDEs For development, you can use powerful integrated development environments (IDEs) like JetBrains Rider or Visual Studio. Both IDEs offer robust tools and features to streamline the development process on Windows. To ensure a smooth setup, please refer to the respective installation guides. The [Rider installation page](../install/rider.md) provides detailed instructions for setting up JetBrains Rider, while the [Visual Studio installation page](../install/visual-studio.md) offers comprehensive guidance on installing and configuring Visual Studio. These resources will help you get started quickly and effectively, leveraging the full capabilities of these development environments. ## Distribution and Package Management VisioForge SDK components for Windows are distributed as NuGet packages, making integration straightforward with modern .NET development environments. You can add these packages to your project using any of the following tools: - Visual Studio Package Manager - JetBrains Rider NuGet Manager - Visual Studio Code with NuGet extensions - Direct command-line integration using the .NET CLI ## Required Base Packages Every Windows application built with VisioForge SDK requires the appropriate base package according to your application's target architecture. These packages contain the essential components for SDK functionality. ### Core Platform Packages - [VisioForge.CrossPlatform.Core.Windows.x86](https://www.nuget.org/packages/VisioForge.CrossPlatform.Core.Windows.x86) - For 32-bit Windows applications - [VisioForge.CrossPlatform.Core.Windows.x64](https://www.nuget.org/packages/VisioForge.CrossPlatform.Core.Windows.x64) - For 64-bit Windows applications > **Note**: For applications targeting multiple architectures, you should include both packages and implement appropriate runtime selection logic. ## Optional Component Packages Depending on your application's requirements, you may need to include additional packages for specialized functionality. 
These optional components extend the SDK's capabilities in various domains. ### FFMPEG Media Processing (Recommended) These packages provide comprehensive codec support for a wide range of media formats through the FFMPEG library integration: - [VisioForge.CrossPlatform.Libav.Windows.x86](https://www.nuget.org/packages/VisioForge.CrossPlatform.Libav.Windows.x86) - 32-bit FFMPEG support - [VisioForge.CrossPlatform.Libav.Windows.x64](https://www.nuget.org/packages/VisioForge.CrossPlatform.Libav.Windows.x64) - 64-bit FFMPEG support For applications with size constraints, compressed versions of these packages utilizing UPX compression are available: - [VisioForge.CrossPlatform.Libav.Windows.x86.UPX](https://www.nuget.org/packages/VisioForge.CrossPlatform.Libav.Windows.x86.UPX) - Compressed 32-bit FFMPEG support - [VisioForge.CrossPlatform.Libav.Windows.x64.UPX](https://www.nuget.org/packages/VisioForge.CrossPlatform.Libav.Windows.x64.UPX) - Compressed 64-bit FFMPEG support ### Cloud Integration - Amazon Web Services For applications requiring cloud storage integration with AWS S3: - [VisioForge.CrossPlatform.AWS.Windows.x86](https://www.nuget.org/packages/VisioForge.CrossPlatform.AWS.Windows.x86) - 32-bit AWS support - [VisioForge.CrossPlatform.AWS.Windows.x64](https://www.nuget.org/packages/VisioForge.CrossPlatform.AWS.Windows.x64) - 64-bit AWS support When using these packages, the following Media Blocks become available: - `AWSS3SourceBlock` - For retrieving media from S3 buckets - `AWSS3SinkBlock` - For storing media in S3 buckets ### Computer Vision with OpenCV For applications requiring advanced image processing and computer vision capabilities: - [VisioForge.CrossPlatform.OpenCV.Windows.x86](https://www.nuget.org/packages/VisioForge.CrossPlatform.OpenCV.Windows.x86) - 32-bit OpenCV support - [VisioForge.CrossPlatform.OpenCV.Windows.x64](https://www.nuget.org/packages/VisioForge.CrossPlatform.OpenCV.Windows.x64) - 64-bit OpenCV support The OpenCV integration 
provides access to Media Blocks in the `VisioForge.Core.MediaBlocks.OpenCV` namespace, including: - Image transformation: `CVDewarpBlock`, `CVDilateBlock`, `CVErodeBlock` - Edge and feature detection: `CVEdgeDetectBlock`, `CVLaplaceBlock`, `CVSobelBlock` - Face processing: `CVFaceBlurBlock`, `CVFaceDetectBlock` - Motion detection: `CVMotionCellsBlock` - Object recognition: `CVTemplateMatchBlock`, `CVHandDetectBlock` - Image enhancement: `CVEqualizeHistogramBlock`, `CVSmoothBlock` - Tracking and overlay: `CVTrackerBlock`, `CVTextOverlayBlock` ## Specialized Hardware Support Packages VisioForge SDK provides integration with professional camera systems and specialized hardware. Include the appropriate package when working with specific device types. ### Allied Vision Cameras For integrating with professional Allied Vision camera hardware: - [VisioForge.CrossPlatform.AlliedVision.Windows.x64](https://www.nuget.org/packages/VisioForge.CrossPlatform.AlliedVision.Windows.x64) ### Basler Cameras For applications working with Basler industrial cameras: - [VisioForge.CrossPlatform.Basler.Windows.x64](https://www.nuget.org/packages/VisioForge.CrossPlatform.Basler.Windows.x64) ### Teledyne/FLIR Cameras (Spinnaker SDK) For thermal imaging and specialized FLIR cameras: - [VisioForge.CrossPlatform.Spinnaker.Windows.x64](https://www.nuget.org/packages/VisioForge.CrossPlatform.Spinnaker.Windows.x64) ### GenICam Protocol Support (GigE/USB3 Vision) For cameras utilizing the standardized GenICam protocol: - [VisioForge.CrossPlatform.GenICam.Windows.x64](https://www.nuget.org/packages/VisioForge.CrossPlatform.GenICam.Windows.x64) ## Deployment Best Practices When deploying VisioForge-based applications for Windows, consider these recommendations: 1. Choose the appropriate architecture packages (x86 or x64) based on your target platform 2. Include the FFMPEG packages for comprehensive media format support 3. 
Only include specialized hardware packages when needed to minimize deployment size 4. For security-sensitive applications, consider using the UPX compressed versions to obfuscate native libraries 5. Always test your deployment on a clean system to ensure all dependencies are properly resolved ## Troubleshooting Common Issues ### Deployment Issues If you encounter issues after deployment: 1. Verify all required NuGet packages are properly included 2. Check that the architecture (x86/x64) matches your application target 3. Ensure native libraries are being extracted to the correct locations 4. Review Windows security and permission settings that might restrict media functionality ### WinForms RESX Files Build Issue You may encounter the following build error: `Error MSB3821: Couldn't process file Form1.resx due to its being in the Internet or Restricted zone or having the mark of the web on the file. Remove the mark of the web if you want to process these files.` Error MSB3821 occurs when Visual Studio or MSBuild cannot process a `.resx` resource file because it is marked as untrusted. This happens when the file has the "Mark of the Web" (MOTW), a security feature that flags files downloaded from the internet or received from untrusted sources. The MOTW places the file in the Internet or Restricted security zone, preventing it from being processed during a build. #### How to Fix It To resolve this error, you need to remove the MOTW from the affected file: ##### Unblock the File Manually - Right-click on Form1.resx in File Explorer. - Select Properties. - In the General tab, check for an Unblock button or checkbox at the bottom. - Click Unblock, then click OK. ##### Unblock via PowerShell (for multiple files) - Open PowerShell. - Navigate to your project directory. - Run the command: `Get-ChildItem -Path . -Recurse | Unblock-File` ##### Unblock the ZIP Before Extraction - If you downloaded the project as a ZIP file, right-click the ZIP file. - Select Properties.
- Click Unblock, then extract the files. By unblocking the file, you remove the MOTW, allowing Visual Studio to process it normally during the build. For additional assistance, visit the [VisioForge support site](https://support.visioforge.com/) or consult the [API documentation](https://api.visioforge.org/dotnet/api/index.html). --- Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples. ---END OF PAGE--- # Local File: .\dotnet\general\index.md --- title: .Net SDKs - Info, Manuals & Usage Guides description: Discover comprehensive info, manuals & guides for VisioForge .Net SDKs (Video Capture, Media Player, Video Edit). Build powerful .NET multimedia apps. sidebar_label: General Information order: 18 --- # VisioForge .Net SDKs: Information, Manuals, and Usage This section provides essential information, detailed software manuals, and practical usage guides for the suite of VisioForge .Net SDKs. Whether you're working with video capture, media playback, or video editing, find the resources you need below. - [Code samples](code-samples/index.md) - [How to send logs?](sendlogs.md) ## Guides - [Video capture to MPEG-TS in VisioForge SDKs](guides/video-capture-to-mpegts.md) ## SDK Components Explore the core components of VisioForge .Net SDKs: ### Media Processing & Effects - **[Video Effects & Processing](video-effects/index.md)**: Enhance your applications with powerful video effects, overlays, and processing capabilities. Learn how to implement professional-grade visual effects, text/image overlays, and custom video processing. - **[Audio Effects](audio-effects/audio-sample-grabber.md)**: Explore options for applying various audio effects and enhancements within your .NET applications. ### Encoding & Formats - **[Video Encoders](video-encoders/index.md)**: Detailed overview of video encoders (H.264, HEVC, AV1, etc.) - features, performance, and implementation for .NET developers. 
- **[Audio Encoders](audio-encoders/index.md)**: Master audio encoding (AAC, FLAC, MP3, Opus) with guidance on optimal settings, performance tips, and best practices. - **[Output Formats](output-formats/index.md)**: Learn about video and audio container formats (MP4, WebM, AVI, MKV) including examples, codec comparisons, and compatibility. ### Streaming & Connectivity - **[Network Streaming](network-streaming/index.md)**: Implement RTMP, RTSP, HLS, and NDI streaming in .NET. Includes examples for live broadcasting, hardware acceleration, and platform integration. ---END OF PAGE--- # Local File: .\dotnet\general\sendlogs.md --- title: Troubleshooting with Logs for .NET SDK Products description: Learn how to enable, capture and share debug logs for effective troubleshooting and issue resolution in .NET SDK applications. This comprehensive guide includes step-by-step instructions for both demo and production environments. sidebar_label: Sending Logs --- # Troubleshooting with Logs for .NET SDK Products [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) ## Why Logs Matter in SDK Troubleshooting When developing applications that utilize media SDKs, you may encounter technical issues that require detailed investigation. Debug logs provide critical information that helps identify the root cause of problems quickly and efficiently. These logs capture everything from initialization sequences to detailed operation steps, error conditions, and system information. 
Properly collected logs offer several key benefits: - **Faster Issue Resolution**: Technical support can quickly identify the source of problems - **Complete Context**: Logs provide a full picture of what happened before, during, and after an issue - **System Information**: Details about your environment help reproduce and solve problems - **Development Insights**: Understanding logs can help you optimize your implementation ## Log Collection in Demo Applications Our demo applications include built-in debugging capabilities that make it easy to collect logs for troubleshooting. Follow these steps to enable and share logs: ### Step-by-Step Guide for Demo Application Logging 1. **Launch the Demo Application** - Open the relevant demo application for your SDK - Locate the main interface where settings can be configured 2. **Enable Debug Mode** - Find and check the "Debug" checkbox in the application interface - This activates detailed logging of all SDK operations 3. **Reproduce the Issue** - Configure any other required settings for your specific scenario - Press the Start or Play button (depending on which SDK you're using) - Allow the application to run until the issue occurs - After sufficient time to capture the problem, press the Stop button 4. **Collect Log Files** - Navigate to "My Documents\VisioForge" on your system - This folder contains all generated log files - **Important**: Exclude any audio/video recordings from your collection to reduce file size 5. **Share Logs Securely** - Compress the log files into a ZIP archive - Upload to a secure file sharing service like Dropbox, Google Drive, or OneDrive - Share the access link with technical support ## Implementing Logging in Your Custom Applications When you're developing your own applications with our SDKs, you'll need to explicitly enable and configure logging. This section explains how to implement logging with different SDK components. 
### Enabling Debug Logs in Your Code Regardless of which SDK you're using, the basic approach to enabling logs follows a similar pattern: ```csharp // Example for MediaPlayer SDK mediaPlayer.Debug_Mode = true; mediaPlayer.Debug_Dir = "C:\\Logs\\MyApplication"; // Example for Video Capture SDK videoCapture.Debug_Mode = true; videoCapture.Debug_Dir = "C:\\Logs\\MyApplication"; // Example for Video Edit SDK videoEdit.Debug_Mode = true; videoEdit.Debug_Dir = "C:\\Logs\\MyApplication"; ``` ### Detailed Implementation Guide 1. **Set Debug Mode Property** - For any SDK component you're using, set the `Debug_Mode` property to `true` - This must be done before calling initialization or playback methods - Example: `MediaPlayer1.Debug_Mode = true;` 2. **Specify Log Directory** - Set the `Debug_Dir` property to a valid directory path - Ensure the specified directory exists and your application has write permissions - Example: `MediaPlayer1.Debug_Dir = "C:\\LogFiles\\MyApp";` 3. **Configure Additional Parameters** - Set up any other required parameters for your specific use case - These could include video sources, codecs, output settings, etc. 4. **Initialize and Run the Component** - Call the appropriate method to start the component (e.g., `Start()` or `Play()`) - Let the application run until you've reproduced the issue you're troubleshooting 5. 
**Collect and Share Logs** - Locate the log files in both your specified directory and "My Documents\VisioForge" - Compress all log files into a ZIP archive - Share via secure file sharing service ## Advanced Logging Techniques For more complex applications or difficult-to-reproduce issues, consider these advanced logging approaches: ### Conditional Debug Activation You might want to enable debug logging only in certain scenarios or based on user actions: ```csharp // Enable debug mode only when troubleshooting if (troubleshootingMode) { mediaPlayer.Debug_Mode = true; mediaPlayer.Debug_Dir = Path.Combine( Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments), "AppLogs" ); } ``` ### Environment-Specific Logging Different deployment environments may require different logging approaches: ```csharp #if DEBUG // Development environment logging videoCapture.Debug_Mode = true; videoCapture.Debug_Dir = Path.Combine( Environment.GetFolderPath(Environment.SpecialFolder.Desktop), "DevLogs" ); #else // Production environment logging (if permitted by your privacy policy) string appDataPath = Path.Combine( Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData), "YourCompany", "YourApp", "Logs" ); Directory.CreateDirectory(appDataPath); videoCapture.Debug_Mode = true; videoCapture.Debug_Dir = appDataPath; #endif ``` ## Best Practices for Effective Logging To ensure you get the most valuable information from your logs, follow these best practices: ### 1. Clear Initial State Before starting a logging session, consider resetting your application state: - Close and restart the application - Clear any cached data if relevant - Ensure you're capturing from a known starting point ### 2. Capture Complete Sessions When possible, capture the entire session from start to finish: - Enable logging before initializing SDK components - Let logging run through the entire operation - Continue logging until after the issue occurs ### 3. 
Document Reproduction Steps Along with your logs, provide clear steps to reproduce the issue: - Note specific settings used - Document the exact sequence of operations - Include timing information if relevant (e.g., "crash occurs after 30 seconds of playback") ### 4. Manage Log Size Debug logs can grow large, especially for long sessions: - For extended tests, consider breaking logging into multiple sessions - Focus on capturing just the problematic scenario - Always exclude large media files when sharing logs ### 5. Secure Sensitive Information Before sharing logs, be aware of potential sensitive data: - Review logs for any personal or sensitive information - Consider using sanitized test content when possible - Use secure methods to transfer log files ## Interpreting Common Log Messages While advanced log analysis is best left to technical support, understanding some common log patterns can help you identify issues: - **Initialization Errors**: Look for messages containing "Init" or "Initialize" - **Format Issues**: Watch for "format" or "codec" related messages - **Resource Problems**: Messages about "memory", "handles", or "resources" - **Performance Warnings**: Notes about "frame drops", "processing time", or "buffers" ## Conclusion Proper logging is essential for efficient troubleshooting of SDK-based applications. By following the guidelines in this document, you can provide the detailed information needed to quickly resolve any issues you encounter. Remember that detailed logs significantly reduce resolution time and help improve the quality of both your application and our SDKs. For additional code samples and implementation guides, visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples). 
---END OF PAGE--- # Local File: .\dotnet\general\audio-effects\audio-sample-grabber.md --- title: Working with Audio Sample Grabber in .NET SDKs description: Learn how to capture and process audio frames in real-time using the Audio Sample Grabber functionality across Video Capture, Media Player, and Video Edit .NET SDKs. Complete tutorial with code examples for both X-engines and Classic engines. sidebar_label: Audio Sample Grabber Usage --- # Working with Audio Sample Grabber in .NET SDKs [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Introduction to Audio Sample Grabber The Audio Sample Grabber is a powerful feature available across our .NET SDKs that enables developers to access raw audio frames directly from both live sources and media files. This capability opens up a wide range of possibilities for audio processing, analysis, and manipulation in your applications. When working with audio processing, gaining access to individual audio frames is essential for tasks such as: - Real-time audio visualization - Custom audio effects processing - Speech recognition integration - Audio analysis and metrics - Custom audio format conversion - Sound detection algorithms The `OnAudioFrameBuffer` event is the core mechanism that provides access to these raw audio frames. This event fires each time a new audio frame is available, giving you direct access to unmanaged memory containing the decoded audio data. 
## How Audio Sample Grabber Works The Audio Sample Grabber intercepts the audio pipeline during playback or capture, providing you with the raw audio data before it's rendered to the output device. This data is typically in PCM (Pulse Code Modulation) format, which is the standard format for uncompressed digital audio, but can occasionally be in IEEE floating-point format depending on the audio source. Each time the `OnAudioFrameBuffer` event fires, it provides an `AudioFrameBufferEventArgs` object containing critical information about the audio frame: - `Frame.Data`: An `IntPtr` pointing to the unmanaged memory block containing the raw audio data - `Frame.DataSize`: The size of the audio data in bytes - `Frame.Info`: A structure containing detailed information about the audio format, including: - Channel count (mono, stereo, etc.) - Sample rate (typically 44.1kHz, 48kHz, etc.) - Bits per sample (16-bit, 24-bit, etc.) - Audio format type (PCM, IEEE, etc.) - Timestamp information - Block alignment and other format-specific details ## Setting Up Audio Sample Grabber The setup process varies slightly depending on whether you're using our newer X-engines or the Classic engines. Let's explore both approaches: +++ X-engines For X-engines, setting up the Audio Sample Grabber is straightforward. You simply need to create an event handler for the `OnAudioFrameBuffer` event: ```csharp VideoCapture1.OnAudioFrameBuffer += OnAudioFrameBuffer; ``` The X-engines architecture automatically enables audio sample grabbing when you subscribe to this event, with no additional configuration required. 
+++ Classic engines When using Classic engines, you need to explicitly enable the Audio Sample Grabber functionality before creating the event handler: ```csharp VideoCapture1.Audio_Sample_Grabber_Enabled = true; ``` Then, as with X-engines, create your event handler: ```csharp VideoCapture1.OnAudioFrameBuffer += OnAudioFrameBuffer; ``` **Note**: The `Audio_Sample_Grabber_Enabled` property is not required for the VideoEditCore component, which has audio sample grabbing enabled by default. +++ Media Blocks SDK The Media Blocks SDK also supports audio sample grabbing. Use the `AudioSampleGrabberBlock` component to capture audio frames. ```csharp private AudioSampleGrabberBlock _audioSampleGrabberBlock; ``` Then, as with X-engines, create your event handler and specify the audio format: ```csharp _audioSampleGrabberBlock = new AudioSampleGrabberBlock(VisioForge.Core.Types.X.AudioFormatX.S16); _audioSampleGrabberBlock.OnAudioSampleGrabber += OnAudioFrameBuffer; ``` +++ ## Processing Audio Frames Once you've set up the event handler, you can process the audio frames as they arrive. Here's a basic example of how to handle the `OnAudioFrameBuffer` event: ```csharp using VisioForge.Types; using System.Diagnostics; private void OnAudioFrameBuffer(object sender, AudioFrameBufferEventArgs e) { // Log audio frame information Debug.WriteLine($"Audio frame: {e.Frame.DataSize} bytes; Format: {e.Frame.Info}"); // Access to raw audio data through the unmanaged pointer IntPtr rawAudioData = e.Frame.Data; // Get audio format details int channelCount = e.Frame.Info.ChannelCount; int sampleRate = e.Frame.Info.SampleRate; int bitsPerSample = e.Frame.Info.BitsPerSample; // Your custom audio processing code here // ... } ``` ## Working with Audio Data ### Converting Unmanaged Memory to Managed Arrays While `e.Frame.Data` provides a pointer to unmanaged memory, you often need to work with the data in a more convenient form.
The `AudioFrame` class provides a helpful `GetDataArray()` method that returns a copy of the audio data as a byte array: ```csharp private void VideoCapture1_OnAudioFrameBuffer(object sender, AudioFrameBufferEventArgs e) { // Get a managed copy of the audio data byte[] audioData = e.Frame.GetDataArray(); // Now you can work with the data using standard C# array operations // ... } ``` ### Converting PCM Data to Samples For many audio processing tasks, you'll want to convert the raw PCM bytes into actual audio sample values. Here's a helper method to convert a PCM byte array to an array of audio samples (assuming 16-bit samples): ```csharp private short[] ConvertBytesToSamples(byte[] audioData) { short[] samples = new short[audioData.Length / 2]; for (int i = 0; i < samples.Length; i++) { // Combine two bytes into one 16-bit sample samples[i] = (short)(audioData[i * 2] | (audioData[i * 2 + 1] << 8)); } return samples; } ``` ### Handling Multi-Channel Audio When working with stereo or multi-channel audio, the samples are typically interleaved. For a stereo stream, the data is arranged as: [Left0, Right0, Left1, Right1, ...]. You may want to separate these channels for processing: ```csharp private void ProcessStereoAudio(short[] samples, int channelCount) { if (channelCount != 2) return; // Create arrays for each channel int samplesPerChannel = samples.Length / 2; short[] leftChannel = new short[samplesPerChannel]; short[] rightChannel = new short[samplesPerChannel]; // Separate the channels for (int i = 0; i < samplesPerChannel; i++) { leftChannel[i] = samples[i * 2]; rightChannel[i] = samples[i * 2 + 1]; } // Process each channel separately // ... 
}
```

## Common Audio Processing Scenarios

### Audio Level Metering

A common use case for the Audio Sample Grabber is to implement audio level metering:

```csharp
private void CalculateAudioLevel(short[] samples)
{
    double sum = 0;

    // Calculate RMS (Root Mean Square) value
    foreach (short sample in samples)
    {
        sum += sample * sample;
    }

    double rms = Math.Sqrt(sum / samples.Length);

    // Convert to decibels relative to 16-bit full scale (dBFS).
    // Guard against silence: Log10(0) would yield negative infinity.
    double db = rms > 0 ? 20 * Math.Log10(rms / 32768.0) : double.NegativeInfinity;

    // Update UI with the level (you'll need to invoke if on a different thread)
    Debug.WriteLine($"Audio level: {db} dB");
}
```

### Real-time FFT for Spectrum Analysis

For frequency spectrum analysis, you might want to perform an FFT (Fast Fourier Transform) on the audio data:

```csharp
// Note: You'll need a library for FFT calculation
// This is a simplified example
// Requires: using System.Linq; using System.Numerics;
private void PerformFFTAnalysis(short[] samples)
{
    // Typically you would use a library like Math.NET Numerics

    // Convert samples to complex numbers
    Complex[] complex = samples.Select(s => new Complex(s, 0)).ToArray();

    // Perform FFT (pseudocode)
    // Complex[] fftResult = FFT.Forward(complex);

    // Process FFT results
    // ...
}
```

## Performance Considerations

When working with the Audio Sample Grabber, keep these performance considerations in mind:

1. **Minimize Processing Time**: The `OnAudioFrameBuffer` event is called on the audio processing thread. Long-running operations can cause audio glitches.
2. **Consider Thread Safety**: If you need to update UI elements or interact with other components, use proper thread synchronization methods.
3. **Avoid Memory Allocations**: Frequent memory allocations in the event handler can lead to garbage collection pauses. Reuse arrays where possible.
4. **Buffer Copying**: The `GetDataArray()` method creates a copy of the audio data. For very high-performance scenarios, consider working directly with the unmanaged pointer.
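Points 3 and 4 can be combined into a reusable-buffer pattern. The sketch below is illustrative only: the `AudioLevelMeter` class name is ours, and it assumes 16-bit little-endian PCM. Only the pointer and byte count from the event args are needed, so the class itself has no SDK dependencies:

```csharp
using System;
using System.Runtime.InteropServices;

// Reusable-buffer frame processor: the scratch buffer grows only when a larger
// frame arrives, so steady-state frames cause no allocations (and no GC pauses).
public class AudioLevelMeter
{
    private byte[] _buffer = Array.Empty<byte>();

    // Normalized peak level of the last frame, 0..1.
    public double LastPeak { get; private set; }

    // Call from your handler, e.g.:
    //   _meter.ProcessFrame(e.Frame.Data, (int)e.Frame.DataSize);
    public void ProcessFrame(IntPtr data, int sizeInBytes)
    {
        if (_buffer.Length < sizeInBytes)
        {
            _buffer = new byte[sizeInBytes];
        }

        // Copy from the unmanaged pointer instead of calling GetDataArray(),
        // which would allocate a fresh array for every frame.
        Marshal.Copy(data, _buffer, 0, sizeInBytes);

        // Keep the work on the audio thread lightweight: a simple peak scan
        // over 16-bit little-endian samples.
        double peak = 0;
        for (int i = 0; i + 1 < sizeInBytes; i += 2)
        {
            short sample = (short)(_buffer[i] | (_buffer[i + 1] << 8));
            peak = Math.Max(peak, Math.Abs((int)sample));
        }

        LastPeak = peak / 32768.0;
    }
}
```

Hand `LastPeak` off to the UI via a thread-safe mechanism (for example, a timer on the UI thread reading the property) rather than updating controls directly from the audio callback.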
## Conclusion The Audio Sample Grabber provides a powerful way to access and process raw audio data in real-time from both live sources and media files. By leveraging this functionality, you can implement sophisticated audio processing features in your applications, from simple level metering to complex audio analysis and effects processing. Whether you're building a professional audio application, implementing audio visualization, or integrating with speech recognition services, the Audio Sample Grabber gives you the raw data you need to bring your audio processing ideas to life. --- Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples. ---END OF PAGE--- # Local File: .\dotnet\general\audio-encoders\aac.md --- title: AAC Audio Encoder Implementation Guide description: Learn how to implement AAC audio encoding in .NET applications with multiple encoder types, bitrate configurations, and cross-platform support. Includes code examples and best practices for developers. sidebar_label: AAC (M4A) --- # AAC encoder and M4A output [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) The VisioForge SDK provides several AAC encoder implementations, each with unique characteristics and use cases. ## What is M4A Output? M4A is a file format used for storing audio data encoded with the Advanced Audio Coding (AAC) codec. VisioForge .Net SDKs provide robust support for creating high-quality M4A audio files through their dedicated M4AOutput class. This format is widely used for digital audio distribution due to its excellent compression efficiency and sound quality. 
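The "compression efficiency" claim is easy to quantify: an encoded stream's size follows directly from bitrate × duration, while raw PCM size is sample rate × channels × bytes per sample × duration. The plain C# sketch below involves no SDK types (the `AudioSizeEstimator` helper name is ours) and compares the two for a typical 3-minute track:

```csharp
using System;

public static class AudioSizeEstimator
{
    // Approximate encoded size in bytes: bitrate in kbps (1 kbps = 1000 bit/s),
    // duration in seconds. Container overhead is ignored.
    public static long EncodedBytes(int bitrateKbps, int durationSeconds) =>
        (long)bitrateKbps * 1000 / 8 * durationSeconds;

    // Uncompressed PCM size in bytes.
    public static long PcmBytes(int sampleRateHz, int channels, int bytesPerSample, int durationSeconds) =>
        (long)sampleRateHz * channels * bytesPerSample * durationSeconds;

    public static void Main()
    {
        // A 3-minute track at 192 kbps AAC: about 4.3 MB
        Console.WriteLine(EncodedBytes(192, 180));     // 4320000
        // The same track as 16-bit stereo PCM at 44.1 kHz: about 31.8 MB
        Console.WriteLine(PcmBytes(44100, 2, 2, 180)); // 31752000
    }
}
```

At these settings the encoded file is roughly 7× smaller than the raw PCM, before any of the quality trade-offs discussed later in this page come into play.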
## Cross-platform M4A (AAC) output [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] The cross-platform capable SDKs (VideoCaptureCoreX, VideoEditCoreX, MediaBlocksPipeline) allow you to utilize several AAC encoder implementations via `M4AOutput`. This guide focuses on three main approaches using dedicated settings objects: 1. [AVENC AAC Encoder](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.X.AudioEncoders.AVENCAACEncoderSettings.html) - A feature-rich, cross-platform encoder. 2. [VO-AAC Encoder](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.X.AudioEncoders.VOAACEncoderSettings.html) - A streamlined, cross-platform encoder. 3. Media Foundation AAC Encoder - A Windows-specific system encoder, accessible on Windows platforms via `MFAACEncoderSettings`. ### AVENC AAC Encoder The AVENC AAC Encoder offers the most comprehensive configuration options for audio encoding. It provides advanced settings for stereo coding, prediction, and noise shaping. #### Key Features - Multiple coder strategies - Configurable stereo coding - Advanced noise and prediction techniques #### Coder Strategies The AVENC AAC Encoder supports three coder strategies: - `ANMR`: Advanced noise modeling and reduction method - `TwoLoop`: Two-loop searching method for optimization - `Fast`: Default fast search algorithm (recommended for most use cases) #### Sample Configuration ```csharp var aacSettings = new AVENCAACEncoderSettings { Coder = AVENCAACEncoderCoder.Fast, Bitrate = 192, IntensityStereo = true, ForceMS = true, TNS = true }; ``` #### Supported Parameters - **Bitrates**: 0, 32, 64, 96, 128, 160, 192, 224, 256, 320 kbps - **Sample Rates**: 7350 to 96000 Hz - **Channels**: 1 to 6 channels ### VO-AAC Encoder The VO-AAC Encoder is a more streamlined encoder with simpler configuration options. 
#### Key Features - Simplified configuration - Straightforward bitrate and sample rate controls - Limited to stereo audio #### Sample Configuration ```csharp var aacSettings = new VOAACEncoderSettings { Bitrate = 128 }; ``` #### Supported Parameters - **Bitrates**: 32, 64, 96, 128, 160, 192, 224, 256, 320 kbps - **Sample Rates**: 8000 to 96000 Hz - **Channels**: 1-2 channels ### Media Foundation AAC Encoder (Windows Only) This encoder is specific to Windows platforms and offers a limited but performance-optimized encoding solution. #### Key Features - Windows-specific implementation - Predefined bitrate options - Limited sample rate support #### Supported Parameters - **Bitrates**: 0 (Auto), 96, 128, 160, 192, 576, 768, 960, 1152 kbps - **Sample Rates**: 44100, 48000 Hz - **Channels**: 1, 2, 6 channels ### Encoder Availability and Selection Each encoder provides a static `IsAvailable()` method to check if the encoder can be used in the current environment. This is useful for runtime compatibility checks. ```csharp if (AVENCAACEncoderSettings.IsAvailable()) { // Use AVENC AAC Encoder } else if (VOAACEncoderSettings.IsAvailable()) { // Fallback to VO-AAC Encoder } ``` ### Getting Started with M4AOutput The cross-platform implementation uses the [M4AOutput](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.Output.M4AOutput.html) class as the foundation for M4A file creation. 
To begin using this feature, initialize the class with your desired output filename: ```csharp var output = new M4AOutput("output.m4a"); ``` ### Switching Between Encoders The default encoder selection is platform-dependent: - Windows environments: MF AAC - Other platforms: VO-AAC You can override this default selection by explicitly setting the `Audio` property: ```csharp // For VO-AAC encoder output.Audio = new VOAACEncoderSettings(); // For AVENC AAC encoder output.Audio = new AVENCAACEncoderSettings(); // For MF AAC encoder (Windows only) #if NET_WINDOWS output.Audio = new MFAACEncoderSettings(); #endif ``` ### Configuring MP4 Sink Settings Since M4A files are based on the MP4 container format, you can adjust various output parameters through the `Sink` property: ```csharp // Change the output filename output.Sink.Filename = "new_output.m4a"; ``` ### Advanced Audio Processing For workflows requiring specialized audio processing, the M4AOutput class supports custom audio processors: ```csharp // Implement your custom audio processing logic output.CustomAudioProcessor = new MyCustomAudioProcessor(); ``` ### Key Methods for File Management The M4AOutput class provides several methods for handling files and retrieving encoder information: ```csharp // Get current output filename string currentFile = output.GetFilename(); // Update the output filename output.SetFilename("updated_file.m4a"); // Retrieve available audio encoders var audioEncoders = output.GetAudioEncoders(); ``` ### Using M4A Output in Different SDKs Each VisioForge SDK has a slightly different approach to implementing M4A output: #### With Video Capture SDK ```csharp var core = new VideoCaptureCoreX(); core.Outputs_Add(output, true); ``` #### With Video Edit SDK ```csharp var core = new VideoEditCoreX(); core.Output_Format = output; ``` #### With Media Blocks SDK ```csharp var aac = new VOAACEncoderSettings(); var sinkSettings = new MP4SinkSettings("output.m4a"); var m4aOutput = new 
M4AOutputBlock(sinkSettings, aac);
```

### Rate Control Considerations

1. **AVENC AAC Encoder**:
   - Most flexible rate control
   - Supports constant bitrate (CBR)
   - Multiple encoding strategies affect quality and performance
2. **VO-AAC Encoder**:
   - Simple constant bitrate control
   - Recommended for straightforward encoding needs
   - Limited advanced configuration
3. **Media Foundation Encoder**:
   - Limited to predefined bitrates
   - Good for quick Windows-based encoding
   - Auto bitrate option available

### Recommendations

- For advanced audio encoding with maximum control, use the AVENC AAC Encoder
- For simple, cross-platform encoding, use the VO-AAC Encoder
- For Windows-specific, optimized encoding, use the Media Foundation Encoder

### Performance and Quality Considerations

- **Bitrate vs. Quality vs. File Size**: Higher bitrates generally result in better audio quality but also lead to larger file sizes. Experiment with different bitrates to find the optimal balance for your specific content and distribution needs.
- **Sample Rate Matching**: Always try to choose sample rates that match your source audio. This avoids unnecessary resampling, which can potentially degrade audio quality.
- **Encoder Characteristics**:
  - `AVENC AAC Encoder`: Offers the most extensive configuration options, allowing for fine-grained control over quality and performance. Ideal for advanced use cases.
  - `VO-AAC Encoder`: Provides a good balance of simplicity, cross-platform compatibility, and quality. A solid choice for many common scenarios.
  - `Media Foundation AAC Encoder`: Leverages built-in Windows audio processing capabilities. It can be efficient on Windows but offers less configuration flexibility than AVENC.
- **Channel Configuration (Mono vs. Stereo)**:
  - For voice-only content, using mono encoding (1 channel) can significantly reduce file size without a noticeable loss in quality for speech.
Check if your chosen encoder settings (e.g., `AVENCAACEncoderSettings.Channels`) allow explicit channel configuration. - For music and rich audio environments, stereo (2 channels) is generally preferred. - **Content-Specific Bitrate Ranges**: While higher is often better, the "best" bitrate depends on the audio content: - *Speech/Voice:* 64-96 kbps can be adequate. - *General Music:* 128-192 kbps is a common target for good quality. - *High-Fidelity Audio:* 256-320 kbps or higher might be used when pristine quality is critical. These are guidelines; always test with your specific audio. - **Target Audience and Platform**: Consider who will be listening and on what devices. For example, if the audio is primarily for web streaming to mobile devices, extremely high bitrates might lead to buffering issues or unnecessary data consumption. Tailor your encoder choice and settings accordingly. ### Sample Code - Check the [MP4 output](../output-formats/mp4.md) guide for sample code. - Check the [AAC encoder block](../../mediablocks/AudioEncoders/index.md) for sample code. ## Windows-only AAC output [!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="VideoEditCore"] [M4AOutput](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.Output.M4AOutput.html) is the primary class for configuring M4A (AAC) output settings. It implements both `IVideoEditBaseOutput` and `IVideoCaptureBaseOutput` interfaces. 
### Properties | Property | Type | Description | Default Value | |----------|------|-------------|---------------| | Version | AACVersion | Specifies the AAC version (MPEG-2 or MPEG-4) | MPEG4 | | Object | AACObject | Defines the AAC object type | Low | | Output | AACOutput | Sets the AAC output mode | RAW | | Bitrate | int | Specifies the AAC bitrate in kbps | 128 | ### Methods #### `GetInternalTypeVC()` - Returns: `VideoCaptureOutputFormat.M4A` - Purpose: Gets the internal output format for video capture #### `GetInternalTypeVE()` - Returns: `VideoEditOutputFormat.M4A` - Purpose: Gets the internal output format for video editing #### `Save()` - Returns: JSON string representation of the M4AOutput object - Purpose: Serializes the current configuration to JSON #### `Load(string json)` - Parameters: JSON string containing M4AOutput configuration - Returns: New M4AOutput instance - Purpose: Creates a new M4AOutput instance from JSON configuration ### Supporting Enums #### AACVersion Defines the version of AAC to be used: | Value | Description | |-------|-------------| | MPEG4 | MPEG-4 AAC (default) | | MPEG2 | MPEG-2 AAC | #### AACObject Specifies the AAC encoder stream object type: | Value | Description | |-------|-------------| | Undefined | Not to be used | | Main | Main profile | | Low | Low Complexity profile (default) | | SSR | Scalable Sample Rate profile | | LTP | Long Term Prediction profile | #### AACOutput Determines the AAC encoder stream output type: | Value | Description | |-------|-------------| | RAW | Raw AAC stream (default) | | ADTS | Audio Data Transport Stream format | ### Usage Example ```csharp // Create new M4A output configuration var core = new VideoCaptureCore(); core.Mode = VideoCaptureMode.VideoCapture; core.Output_Filename = "output.m4a"; var output = new VisioForge.Core.Types.Output.M4AOutput { Bitrate = 192, Version = AACVersion.MPEG4, Object = AACObject.Low, Output = AACOutput.ADTS }; core.Output_Format = output; // core is an 
instance of VideoCaptureCore or VideoEditCore ``` ### Selecting the Right Bitrate The optimal bitrate depends on your content type and quality requirements: - **64-96 kbps**: Suitable for voice recordings and speech content - **128-192 kbps**: Recommended for general music and audio content - **256-320 kbps**: Ideal for high-fidelity music where quality is paramount ### Choosing the Appropriate Profile - Use `AACObject.Low` for most applications as it provides an excellent balance between quality and encoding efficiency - Reserve `AACObject.Main` for specialized use cases requiring maximum quality - Avoid `AACObject.Undefined` as it isn't a valid encoding option ### Container Format Selection - `AACOutput.ADTS` provides better compatibility with various players and devices - `AACOutput.RAW` is preferable when the AAC stream will be embedded within another container format ---END OF PAGE--- # Local File: .\dotnet\general\audio-encoders\flac.md --- title: FLAC Audio Encoder Integration Guide description: Learn how to implement FLAC lossless audio compression in .NET applications. Configure quality settings, optimize performance, and handle advanced compression parameters for high-quality audio processing. sidebar_label: FLAC --- # FLAC encoder and output [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) The FLAC (Free Lossless Audio Codec) encoder provides high-quality lossless audio compression while preserving the original audio quality. 
## Cross-platform FLAC output [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] ### Features The FLAC encoder supports a wide range of audio configurations: - Sample rates from 1 Hz to 655,350 Hz - Up to 8 audio channels (mono to 7.1 surround) - Lossless compression with adjustable quality settings - Streamable output support - Configurable block sizes and compression parameters ### Quality Settings The encoder provides a quality parameter ranging from 0 to 9: - 0: Fastest compression (lowest CPU usage) - 1-7: Balanced compression settings - 8: Highest compression (higher CPU usage) - 9: Insane compression (extremely CPU intensive) The default quality setting is 5, which offers a good balance between compression ratio and processing speed. ### Basic Settings The cross-platform [FLACEncoderSettings](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.X.AudioEncoders.FLACEncoderSettings.html) class offers advanced configuration options: ```csharp // Create FLAC encoder settings with default quality var flacSettings = new FLACEncoderSettings { // Default compression level Quality = 5, // Audio block size in samples BlockSize = 4608, // Enable streaming support StreamableSubset = true, // Enable stereo processing MidSideStereo = true }; ``` ### Advanced Compression Settings ```csharp // Create FLAC encoder settings with advanced configuration var advancedSettings = new FLACEncoderSettings { // Linear Prediction settings // Maximum LPC order for prediction MaxLPCOrder = 8, // Auto precision for coefficients QlpCoeffPrecision = 0, // Residual coding settings MinResidualPartitionOrder = 3, MaxResidualPartitionOrder = 3, // Search optimization settings // Disable expensive coefficient search ExhaustiveModelSearch = false, // Disable precision search QlpCoeffPrecSearch = false, // Disable escape code search EscapeCoding = false }; ``` ### 
Sample Code Add the FLAC output to the Video Capture SDK core instance: ```csharp // Create a Video Capture SDK core instance var core = new VideoCaptureCoreX(); // Create a FLAC output instance var flacOutput = new FLACOutput("output.flac"); // Set the quality of the FLAC encoder flacOutput.Audio.Quality = 5; // Add the FLAC output core.Outputs_Add(flacOutput, true); ``` Set the output format for the Video Edit SDK core instance: ```csharp // Create a Video Edit SDK core instance var core = new VideoEditCoreX(); // Create a FLAC output instance var flacOutput = new FLACOutput("output.flac"); // Set the quality flacOutput.Audio.Quality = 5; // Set the output format core.Output_Format = flacOutput; ``` Create a Media Blocks FLAC output instance: ```csharp // Create a FLAC encoder settings instance var flacSettings = new FLACEncoderSettings(); // Create a FLAC output instance var flacOutput = new FLACOutputBlock("output.flac", flacSettings); ``` ### FLACOutput class The `FLACOutput` class provides functionality for configuring FLAC (Free Lossless Audio Codec) output in the VisioForge SDKs. 
```csharp // Create a new FLAC output instance var flacOutput = new FLACOutput("output.flac"); // Configure FLAC encoder settings flacOutput.Audio.CompressionLevel = 5; // Example setting ``` #### Filename - Set the output filename during initialization or using the property - Can also be accessed/modified using `GetFilename()` and `SetFilename()` methods ```csharp // Set during initialization var flacOutput = new FLACOutput("audio_output.flac"); ``` ```csharp // Or using the property flacOutput.Filename = "new_output.flac"; ``` #### Audio Settings The `Audio` property provides access to FLAC-specific encoding settings through the `FLACEncoderSettings` class: ```csharp flacOutput.Audio = new FLACEncoderSettings(); // Configure specific FLAC encoding parameters here ``` #### Custom Audio Processing You can set a custom audio processor using the `CustomAudioProcessor` property: ```csharp flacOutput.CustomAudioProcessor = new CustomMediaBlock(); ``` #### Implementation Notes - The class implements multiple interfaces: - `IVideoEditXBaseOutput` - `IVideoCaptureXBaseOutput` - `IOutputAudioProcessor` - Only FLAC audio encoding is supported (no video encoding capabilities) - Default FLAC encoder settings are automatically created during initialization Media Blocks SDK contains a dedicated [FLAC encoder block](../../mediablocks/AudioEncoders/index.md). ### Performance Considerations When configuring the FLAC encoder, consider these performance factors: 1. Higher quality settings (7-9) will significantly increase CPU usage 2. The `ExhaustiveModelSearch` option can greatly impact encoding speed 3. Larger block sizes may improve compression but increase memory usage 4. 
`StreamableSubset` should remain enabled unless you have specific requirements ### Compatibility The encoder supports the following configurations: - Audio channels: 1 to 8 channels - Sample rates: 1 Hz to 655,350 Hz - Bitrate: Variable (lossless compression) ### Error Handling Always check for encoder availability before use: ```csharp if (!FLACEncoderSettings.IsAvailable()) { // Handle unavailable encoder scenario Console.WriteLine("FLAC encoder is not available on this system"); return; } ``` ### Best Practices 1. Start with the default quality setting (5) and adjust based on your needs 2. Enable `MidSideStereo` for stereo content to improve compression 3. Use `SeekPoints` for longer audio files to enable quick seeking 4. Keep `StreamableSubset` enabled unless you have specific requirements 5. Avoid using `ExhaustiveModelSearch` unless compression ratio is critical ## Windows-only FLAC output [!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="VideoEditCore"] The [FLACOutput](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.Output.FLACOutput.html) class provides Windows-only settings for the FLAC encoder. This class implements both `IVideoEditBaseOutput` and `IVideoCaptureBaseOutput` interfaces, making it suitable for both video editing and capture scenarios. ### Properties #### Compression Level - **Property**: `Level` - **Type**: `int` - **Range**: 0-8 - **Default**: 5 - **Description**: Controls the compression level, where 0 provides fastest compression and 8 provides highest compression. #### Block Size - **Property**: `BlockSize` - **Type**: `int` - **Default**: 4608 - **Valid Values**: For subset streams, must be one of: - 192, 256, 512, 576, 1024, 1152, 2048, 2304, 4096, 4608 - 8192, 16384 (only if sample rate > 48kHz) - **Description**: Specifies the block size in samples. The encoder uses the same block size for the entire stream. 
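The subset-stream rule above, where 8192 and 16384 are only legal when the sample rate exceeds 48 kHz, can be captured in a small validation helper. This is illustrative only; `FlacBlockSize` is not part of the SDK:

```csharp
using System;
using System.Linq;

public static class FlacBlockSize
{
    // Block sizes valid for FLAC subset streams at any sample rate.
    private static readonly int[] Common =
        { 192, 256, 512, 576, 1024, 1152, 2048, 2304, 4096, 4608 };

    // These are subset-legal only when the sample rate exceeds 48 kHz.
    private static readonly int[] HighRateOnly = { 8192, 16384 };

    public static bool IsValidSubsetBlockSize(int blockSize, int sampleRateHz)
    {
        if (Common.Contains(blockSize))
            return true;

        return sampleRateHz > 48000 && HighRateOnly.Contains(blockSize);
    }

    public static void Main()
    {
        Console.WriteLine(IsValidSubsetBlockSize(4608, 44100));  // True
        Console.WriteLine(IsValidSubsetBlockSize(16384, 44100)); // False
        Console.WriteLine(IsValidSubsetBlockSize(16384, 96000)); // True
    }
}
```

Validating the block size before assigning it to `BlockSize` avoids producing a non-subset stream that some players cannot decode.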
#### LPC Order - **Property**: `LPCOrder` - **Type**: `int` - **Default**: 8 - **Constraints**: - Must be ≤ 32 - For subset streams at ≤ 48kHz, must be ≤ 12 - **Description**: Specifies the maximum Linear Predictive Coding order. Setting to 0 disables generic linear prediction and uses only fixed predictors, which is faster but typically results in 5-10% larger files. #### Mid-Side Coding Options ##### Mid-Side Coding - **Property**: `MidSideCoding` - **Type**: `bool` - **Default**: `false` - **Description**: Enables mid-side coding for stereo streams. This typically increases compression by a few percent by encoding both stereo pair and mid-side versions of each block and selecting the smallest resulting frame. ##### Adaptive Mid-Side Coding - **Property**: `AdaptiveMidSideCoding` - **Type**: `bool` - **Default**: `false` - **Description**: Enables adaptive mid-side coding for stereo streams. This provides faster encoding than full mid-side coding but with slightly less compression by adaptively switching between independent and mid-side coding. #### Rice Parameters ##### Rice Minimum - **Property**: `RiceMin` - **Type**: `int` - **Default**: 3 - **Description**: Sets the minimum residual partition order. Works in conjunction with RiceMax to control how the residual signal is partitioned. ##### Rice Maximum - **Property**: `RiceMax` - **Type**: `int` - **Default**: 3 - **Description**: Sets the maximum residual partition order. The residual is partitioned into 2^min to 2^max pieces, each with its own Rice parameter. Optimal settings typically depend on block size, with best results when blocksize/(2^n)=128. #### Advanced Options ##### Exhaustive Model Search - **Property**: `ExhaustiveModelSearch` - **Type**: `bool` - **Default**: `false` - **Description**: Enables exhaustive model search for optimal encoding. 
When enabled, the encoder generates subframes for every order and uses the smallest, potentially improving compression by ~0.5% at the cost of significantly increased encoding time. ### Methods #### Constructor ```csharp public FLACOutput() ``` Initializes a new instance with default values: - Level = 5 - RiceMin = 3 - RiceMax = 3 - LPCOrder = 8 - BlockSize = 4608 ### Serialization #### Save() ```csharp public string Save() ``` Serializes the settings to a JSON string. #### Load(string json) ```csharp public static FLACOutput Load(string json) ``` Creates a new FLACOutput instance from a JSON string. ### Usage Example ```csharp var flacSettings = new FLACOutput { Level = 8, // Maximum compression BlockSize = 4608, // Default block size MidSideCoding = true, // Enable mid-side coding for better compression ExhaustiveModelSearch = true // Enable exhaustive search for best compression }; core.Output_Format = flacSettings; // Core is VideoCaptureCore or VideoEditCore ``` ### Best Practices #### Compression Level Selection - Use Level 0-3 for faster encoding with moderate compression - Use Level 4-6 for balanced compression/speed - Use Level 7-8 for maximum compression regardless of speed #### Block Size Considerations - Larger block sizes generally provide better compression - Stick to standard values (4608, 4096, etc.) 
for maximum compatibility - Consider memory constraints when selecting block size #### Mid-Side Coding - Enable for stereo content when compression is priority - Use adaptive mode when encoding speed is important - Disable for mono content as it has no effect #### Rice Parameters - Default values (3,3) are suitable for most use cases - Increase for potentially better compression at the cost of encoding speed - Values beyond 6 rarely provide significant benefits ---END OF PAGE--- # Local File: .\dotnet\general\audio-encoders\index.md --- title: Audio Encoder Integration Guide for .NET SDKs description: Master audio encoding in .NET applications with detailed guidance on implementing AAC, FLAC, MP3, Opus, and other encoders. Learn optimal settings, performance tips, and best practices for professional media development. sidebar_label: Audio Encoders order: 20 --- # Audio Encoders for .NET Development [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Introduction to Audio Encoding in .NET Applications When developing media applications in .NET, choosing the right audio encoder is crucial for ensuring optimal performance, compatibility, and quality. VisioForge's suite of .NET SDKs provides developers with powerful tools for audio encoding across various formats, enabling the creation of professional-grade media applications. Audio encoders are essential components that convert raw audio data into compressed formats suitable for storage, streaming, or playback. Each encoder offers different advantages in terms of compression ratio, audio quality, processing requirements, and platform compatibility. 
This guide will help you navigate the various audio encoding options available in VisioForge's .NET SDKs. ## Available Audio Encoders VisioForge's .NET SDKs include support for the following audio encoders, each designed for specific use cases: ### [AAC Encoder](aac.md) Advanced Audio Coding (AAC) represents the industry standard for high-quality audio compression. It delivers excellent sound quality at lower bit rates compared to older formats like MP3. **Key features:** - Efficient compression with minimal quality loss - Wide device and platform compatibility - Variable bit rate support for optimized file sizes - Ideal for streaming applications and mobile devices - Support for multi-channel audio (up to 48 channels) AAC is particularly well-suited for applications where audio quality is paramount, such as music streaming services, video production tools, and professional media applications. ### [FLAC Encoder](flac.md) Free Lossless Audio Codec (FLAC) provides lossless compression of audio data, preserving the original audio quality while reducing file size. **Key features:** - Lossless compression with no quality degradation - Open-source format with broad support - Typically reduces file sizes by 40-50% compared to uncompressed audio - Fast encoding and decoding performance - Supports metadata tags and seeking FLAC is ideal for archiving audio, professional audio editing applications, and audiophile-grade music playback systems where maintaining perfect audio fidelity is essential. ### [MP3 Encoder](mp3.md) MPEG Audio Layer III (MP3) remains one of the most widely used audio formats due to its universal compatibility and acceptable quality-to-size ratio. 
**Key features:** - Nearly universal compatibility across devices and platforms - Configurable bit rates from 8 to 320 Kbps - Joint stereo mode for improved compression efficiency - Variable bit rate (VBR) encoding for optimized quality - Fast encoding and minimal processing requirements MP3 is best for applications where wide compatibility is more important than achieving the absolute highest audio quality, such as podcasts, basic music applications, and legacy system integration. ### [Opus Encoder](opus.md) Opus is a highly versatile audio codec designed to handle both speech and music with excellent quality at low bit rates. **Key features:** - Superior performance at low bit rates (6-64 Kbps) - Low algorithmic delay for real-time applications - Seamless quality adjustment based on available bandwidth - Excellent for both speech and music content - Open standard with growing adoption Opus excels in real-time communication applications, VoIP systems, live streaming, and scenarios where bandwidth efficiency is critical. ### [Speex Encoder](speex.md) Speex is an audio compression format specifically optimized for speech encoding, making it ideal for voice-centric applications. **Key features:** - Designed specifically for human voice compression - Variable bit rates from 2 to 44 Kbps - Voice activity detection and comfort noise generation - Low latency for real-time applications - Open source with minimal patent concerns Speex is particularly effective for voice chat applications, voice recording tools, and telephony systems where speech clarity is the priority. ### [Vorbis Encoder](vorbis.md) Vorbis is an open-source, patent-free audio compression format that offers quality comparable to AAC at similar bit rates. 
**Key features:**

- Free and open format without licensing restrictions
- Excellent quality-to-size ratio for music
- Variable and average bit rate encoding modes
- Strong support in open-source software ecosystems
- Multi-channel audio support

Vorbis is well-suited for applications where licensing costs are a concern, such as open-source projects, indie game development, and web applications.

### [WavPack Encoder](wavpack.md)

WavPack offers a unique hybrid approach to audio compression, providing both lossless and high-quality lossy compression options.

**Key features:**

- Hybrid mode combining lossy and lossless techniques
- Correction files to restore lossy files to lossless quality
- Fast decoding with minimal CPU requirements
- Support for high-resolution audio up to 32-bit/192kHz
- Robust error correction capabilities

WavPack is excellent for applications requiring flexible quality options, archival purposes, and systems where decoding performance is more critical than encoding speed.

### [Windows Media Audio Encoder](wma.md)

Windows Media Audio (WMA) provides a set of audio codecs developed by Microsoft, offering good integration with Windows platforms.

**Key features:**

- Native integration with Windows environments
- Multiple codec variants (WMA Standard, Pro, Lossless)
- Good performance on Windows devices and Xbox platforms
- Professional variant supports multi-channel surround sound
- Digital rights management capabilities

WMA is particularly useful for Windows-centric applications, enterprise solutions, and scenarios where DRM protection is required.

## Choosing the Right Audio Encoder

Selecting the appropriate audio encoder depends on several factors:

1. **Quality Requirements**: For archiving or professional applications, consider lossless options like FLAC or WavPack. For general-purpose use, AAC or Vorbis provide excellent quality at reasonable sizes.
2. **Platform Compatibility**: If your application needs to work across many devices, MP3 offers the widest compatibility, while AAC is well-supported on modern platforms.
3. **Content Type**: For speech-focused applications, Speex or Opus at lower bitrates excel. For music, AAC, Vorbis, or MP3 at higher bitrates are preferable.
4. **Bandwidth Considerations**: For streaming over limited connections, Opus provides excellent quality at very low bitrates.
5. **Licensing Requirements**: If your project requires open-source or patent-free solutions, focus on FLAC, Vorbis, or Opus.

## Implementation Considerations

When implementing audio encoders in your .NET application:

- **Threading**: Consider encoding audio on background threads to prevent UI freezing during processing.
- **Buffer Management**: Properly manage audio buffers to prevent memory leaks during encoding operations.
- **Error Handling**: Implement robust error handling for encoding failures or corrupt input data.
- **Metadata**: Most formats support metadata tags—use them to enhance the user experience.
- **Preprocessing**: Consider implementing audio normalization or other preprocessing before encoding for optimal results.

## Performance Optimization

To achieve the best performance when using audio encoders:

- Match encoding quality to your application's needs—higher quality settings require more processing power
- Implement caching strategies for frequently accessed audio
- Consider hardware acceleration when available, particularly for real-time encoding
- Batch process audio files when possible rather than encoding on demand
- Monitor memory usage, especially when processing long audio files

## Getting Started

To begin implementing audio encoders in your .NET application using VisioForge SDKs, follow these steps:

1. Install the appropriate VisioForge SDK via NuGet or direct download
2. Reference the SDK in your project
3. Initialize the encoder with your desired configuration settings
4. Process audio through the encoder using the provided API methods
5. Handle the encoded output as needed for your application

Each encoder has specific initialization parameters and optimal settings, which are detailed in their respective documentation pages.

By understanding the strengths and appropriate use cases for each audio encoder, .NET developers can make informed decisions that optimize their media applications for quality, performance, and compatibility.

---END OF PAGE---

# Local File: .\dotnet\general\audio-encoders\mp3.md

---
title: Record, Capture & Edit MP3 Audio in C#
description: Learn how to use C# and the VisioForge .NET SDK to record, capture, and edit MP3 audio. Configure output settings and integrate audio features into your .NET applications with our guide.
sidebar_label: MP3
---

# Mastering MP3 Audio: Record, Capture & Edit in C# and .NET

[!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The VisioForge SDK empowers developers to record, capture, and edit MP3 audio within C# applications. This guide explores how to leverage our .NET SDK for high-quality MP3 audio processing. Whether you need to capture media streams, record MP3 files, or edit audio waveforms, our C# media toolkit provides comprehensive tools built on the LAME library.

MP3, a widely adopted lossy audio compression format, is ideal for audio streaming and efficient storage. You can use the MP3 encoder to add audio capture and recording functionality with various container formats such as MP4, AVI, and MKV. Our SDK works seamlessly with Visual Studio for a smooth development experience.
## Cross-platform MP3 Audio Capture and Recording

[!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"]

The [MP3EncoderSettings](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.X.AudioEncoders.MP3EncoderSettings.html) class provides developers with a streamlined approach to configuring MP3 encoding for C# audio capture projects. This cross-platform solution supports various rate control and quality settings, making it ideal for .NET MP3 recording applications across different operating systems.

### Supported Formats and Specifications for C# MP3 Recording

- Input Format: S16LE (Signed 16-bit Little Endian)
- Sample Rates: 8000, 11025, 12000, 16000, 22050, 24000, 32000, 44100, 48000 Hz
- Channels: Mono (1) or Stereo (2)

### Rate Control Modes

The encoder supports three rate control modes:

1. **CBR (Constant Bit Rate)**
   - Fixed bitrate throughout the entire encoding process
   - Supported bitrates: 8, 16, 24, 32, 40, 48, 56, 64, 80, 96, 112, 128, 160, 192, 224, 256, 320 Kbit/s
   - Best for streaming MP3 and when consistent file size is important
2. **ABR (Average Bit Rate)**
   - Maintains an average bitrate while allowing some variation
   - More efficient than CBR while still maintaining predictable file sizes
   - Useful for streaming services that need approximate file size estimates
3. **Quality-based VBR**
   - Variable Bit Rate based on sound complexity
   - Quality setting ranges from 0 (best) to 10 (worst)
   - Most efficient for storage and best quality-to-size ratio

### C# MP3 Encoding Examples

Create basic MP3 encoder settings with CBR.
```csharp
// Create basic MP3 encoder settings using Constant Bit Rate mode
var mp3Settings = new MP3EncoderSettings
{
    // Set to Constant Bit Rate - provides consistent file size and streaming reliability
    RateControl = MP3EncoderRateControl.CBR,

    // 192 kbps offers good quality for most music content while keeping file size reasonable
    Bitrate = 192,

    // Standard quality offers a good balance between encoding speed and output quality
    EncodingEngineQuality = MP3EncodingQuality.Standard,

    // Keep stereo channels (false) - set to true if you want to convert to mono
    ForceMono = false
};
```

Quality-based VBR configuration for high-quality .NET MP3 editing.

```csharp
// Configure MP3 encoder with Variable Bit Rate for optimal quality-to-size ratio
var vbrSettings = new MP3EncoderSettings
{
    // Quality-based VBR adjusts bitrate dynamically based on audio complexity
    RateControl = MP3EncoderRateControl.Quality,

    // Quality scale: 0 (best) to 10 (worst) - 2.0 provides excellent quality with reasonable file size
    Quality = 2.0f,

    // High quality encoding uses more CPU but produces better results
    EncodingEngineQuality = MP3EncodingQuality.High
};
```

Add the MP3 output to capture C# MP3 audio with the Video Capture SDK. The [MP3Output](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.X.Output.MP3Output.html) class implements multiple interfaces:

- IVideoEditXBaseOutput
- IVideoCaptureXBaseOutput
- IOutputAudioProcessor

```csharp
// Create a Video Capture SDK core instance for recording
var core = new VideoCaptureCoreX();

// Initialize MP3 output with target filename
var mp3Output = new MP3Output("output.mp3");

// Configure audio encoding settings
mp3Output.Audio.RateControl = MP3EncoderRateControl.CBR; // Use Constant Bit Rate for reliable streaming
mp3Output.Audio.Bitrate = 128; // 128 kbps is suitable for general audio recording

// Add the MP3 output to the capture pipeline
core.Outputs_Add(mp3Output, true);
```

Set the output format for the Video Edit SDK core instance:
```csharp
// Initialize Video Edit SDK for processing existing media
var core = new VideoEditCoreX();

// Create MP3 output with target filename
var mp3Output = new MP3Output("output.mp3");

// Configure Variable Bit Rate encoding for better quality-to-size ratio
mp3Output.Audio.RateControl = MP3EncoderRateControl.Quality;
mp3Output.Audio.Quality = 5.0f; // Middle quality setting (0-10 scale) - good balance of quality and size

// Set as the primary output format for the editor
core.Output_Format = mp3Output;
```

### Initialization

To create a new MP3Output instance, you need to provide the output filename:

```csharp
// Initialize MP3 output with destination filename
var mp3Output = new MP3Output("output.mp3");
```

### Audio Settings

The `Audio` property provides access to MP3 encoder settings:

```csharp
// Create default MP3 encoder settings object
mp3Output.Audio = new MP3EncoderSettings();
// Additional configuration can be applied to mp3Output.Audio properties
```

### Custom Audio Processing

You can set a custom audio processor using the `CustomAudioProcessor` property to handle waveform manipulations:

```csharp
// Attach a custom audio processor for advanced audio manipulation
mp3Output.CustomAudioProcessor = new MediaBlock();
// The MediaBlock can be configured for effects, filtering, or other audio processing
```

### Filename Operations

There are multiple ways to work with the output filename:

```csharp
// Retrieve the current output filename
string currentFile = mp3Output.GetFilename();

// Change the output destination
mp3Output.SetFilename("newoutput.mp3");

// Alternative way to set the filename via property
mp3Output.Filename = "another.mp3";
```

### Audio Encoders

The MP3Output class supports MP3 encoding exclusively.
You can verify the available encoders:

```csharp
// Get information about available audio encoders
var audioEncoders = mp3Output.GetAudioEncoders();
// Returns a list of tuples containing encoder names and their setting types
// For MP3Output, this will contain a single entry for MP3
```

### MP3OutputBlock class

The [MP3OutputBlock](../../mediablocks/AudioEncoders/index.md) class provides a more flexible way to configure MP3 encoding.

Create a Media Blocks MP3 output instance:

```csharp
// Create MP3 encoder settings with desired configuration
var mp3Settings = new MP3EncoderSettings();

// Initialize MP3 output block with destination file and settings
var mp3Output = new MP3OutputBlock("output.mp3", mp3Settings);
```

Check if MP3 encoding is available:

```csharp
// Check if MP3 encoding is available on the current system
if (!MP3EncoderSettings.IsAvailable())
{
    // Handle case where MP3 encoding is not available
    // This might occur if LAME or other required libraries are missing
}
```

### Encoding Quality Levels

The encoder supports three quality presets that affect the encoding speed and CPU usage:

- `Fast`: Quickest encoding, lower CPU usage
- `Standard`: Balanced speed and quality (default)
- `High`: Best quality, higher CPU usage

### Common Scenarios

#### High-Quality Music Capture in C#

```csharp
// Configure settings for high-quality music recording
var highQualitySettings = new MP3EncoderSettings
{
    // Use quality-based Variable Bit Rate for optimal audio fidelity
    RateControl = MP3EncoderRateControl.Quality,

    // Highest quality setting (0.0f) for maximum audio fidelity
    Quality = 0.0f,

    // Use high-quality encoding algorithm (more CPU intensive but better results)
    EncodingEngineQuality = MP3EncodingQuality.High
};
```

#### Streaming Audio

```csharp
// Configure settings optimized for audio streaming applications
var streamingSettings = new MP3EncoderSettings
{
    // Use Constant Bit Rate for predictable streaming performance
    RateControl = MP3EncoderRateControl.CBR,

    // 128 kbps provides good quality for most content while being bandwidth-friendly
    Bitrate = 128,

    // Fast encoding reduces CPU usage, important for real-time streaming
    EncodingEngineQuality = MP3EncodingQuality.Fast
};
```

## Windows-only MP3 output

[!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="VideoEditCore"]

The [MP3 file output](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.Output.MP3Output.html) class provides advanced configuration options for MP3 encoding in C# audio and video capture and editing scenarios.

### Key Features

- Flexible channel mode selection
- VBR and CBR encoding support for optimal .NET MP3 recording
- Advanced encoding parameters for professional audio applications
- Quality control settings for perfect C# MP3 editing results

### Basic Configuration

#### CBR_Bitrate

Controls the Constant Bit Rate (CBR) setting for MP3 encoding.

- For MPEG-1 (32, 44.1, 48 kHz): Valid values are 32, 40, 48, 56, 64, 80, 96, 112, 128, 160, 192, 224, 256, 320 kbps
- For MPEG-2 (16, 22.05, 24 kHz): Valid values are 8, 16, 24, 32, 40, 48, 56, 64, 80, 96, 112, 128, 144, 160 kbps
- Default values: 128 kbps (MPEG-1) or 64 kbps (MPEG-2)

#### SampleRate

Specifies the audio sampling frequency in Hz. Common values are:

- 44100 Hz (CD quality, default)
- 48000 Hz (professional audio)
- 32000 Hz (broadcast)
- 22050 Hz (lower quality)
- 16000 Hz (voice)

#### ChannelsMode

Determines how audio channels are encoded. Options include:

1. StandardStereo: Independent channel encoding with dynamic bit allocation
2. JointStereo: Exploits correlation between channels using mid/side encoding
3. DualStereo: Independent encoding with fixed 50/50 bit allocation (ideal for dual language)
4. Mono: Single channel output (downmixes stereo input)

### Variable Bit Rate (VBR) Settings

#### VBR_Mode

Enables Variable Bit Rate encoding when set to true (default). VBR allows the encoder to adjust bitrate based on audio complexity.
#### VBR_MinBitrate

Sets the minimum allowed bitrate for VBR encoding (default: 96 kbps).

#### VBR_MaxBitrate

Sets the maximum allowed bitrate for VBR encoding (default: 192 kbps).

#### VBR_Quality

Controls VBR encoding quality (0-9):

- Lower values (0-4): Higher quality, slower encoding
- Middle values (5-6): Balanced quality and speed
- Higher values (7-9): Lower quality, faster encoding

### Quality and Performance

#### EncodingQuality

Determines the algorithmic quality of encoding (0-9):

- 0-1: Best quality, slowest encoding
- 2: Recommended for high quality
- 5: Default, good balance of speed and quality
- 7: Fast encoding with acceptable quality
- 9: Fastest encoding, lowest quality

### Special Features

#### ForceMono

When enabled, automatically downmixes multi-channel audio to mono.

#### VoiceEncodingMode

Experimental mode optimized for voice content.

#### KeepAllFrequencies

Disables automatic frequency filtering, preserving all frequencies at the cost of efficiency.

#### DisableShortBlocks

Forces use of long blocks only, which may improve quality at very low bitrates but can cause pre-echo artifacts.

### MP3 Frame Flags

#### Copyright

Sets the copyright bit in MP3 frames.

#### Original

Marks the stream as original content.

#### CRCProtected

Enables CRC error detection at the cost of 16 bits per frame.

#### EnableXingVBRTag

Adds VBR information headers for better player compatibility.

#### StrictISOCompliance

Enforces strict ISO MP3 standard compliance.

### Example MP3 Recording and Editing Configurations

Basic settings for C# MP3 capture applications.
```csharp
// Configure basic MP3 output with standard settings
var mp3Output = new MP3Output
{
    // 192 kbps provides good quality for most music content
    CBR_Bitrate = 192,

    // CD-quality sample rate
    SampleRate = 44100,

    // Joint stereo mode provides better compression for most stereo content
    ChannelsMode = MP3ChannelsMode.JointStereo,
};

// Set as the output format for capture or editing
core.Output_Format = mp3Output; // Core is VideoCaptureCore or VideoEditCore
```

VBR configuration.

```csharp
// Configure MP3 output with Variable Bit Rate for better quality/size balance
var mp3Output = new MP3Output
{
    // Enable Variable Bit Rate encoding
    VBR_Mode = true,

    // Set minimum bitrate floor to ensure acceptable quality
    VBR_MinBitrate = 96,

    // Limit maximum bitrate to control file size
    VBR_MaxBitrate = 192,

    // Quality level 6 provides a good balance between quality and file size
    VBR_Quality = 6,
};

// Set as the output format for capture or editing
core.Output_Format = mp3Output; // Core is VideoCaptureCore or VideoEditCore
```

#### Basic Stereo MP3 Encoding

```csharp
// Configure standard stereo MP3 encoding with fixed bitrate
var mp3Output = new MP3Output
{
    // 192 kbps provides good quality for most music while keeping file size reasonable
    CBR_Bitrate = 192,

    // Standard stereo mode encodes left and right channels independently
    ChannelsMode = MP3ChannelsMode.StandardStereo,

    // CD-quality sample rate
    SampleRate = 44100,

    // Disable Variable Bit Rate to ensure consistent file size and playback
    VBR_Mode = false
};
```

#### Voice-Optimized Encoding

```csharp
// Configure MP3 settings optimized for voice recordings
var voiceMP3 = new MP3Output
{
    // Enable voice-optimized encoding algorithms
    VoiceEncodingMode = true,

    // Use mono for voice to reduce file size (most voice doesn't benefit from stereo)
    ChannelsMode = MP3ChannelsMode.Mono,

    // Lower sample rate is sufficient for voice content
    SampleRate = 22050,

    // Enable Variable Bit Rate for better quality/size ratio
    VBR_Mode = true,

    // Better quality setting for voice clarity while keeping file size reasonable
    VBR_Quality = 4
};
```

#### High-Quality Music Encoding

```csharp
// Configure high-quality MP3 settings for music archiving
var highQualityMP3 = new MP3Output
{
    // Enable Variable Bit Rate for optimal quality-to-size ratio
    VBR_Mode = true,

    // Set minimum bitrate to ensure good quality even in simple passages
    VBR_MinBitrate = 128,

    // Allow high bitrate for complex passages to preserve audio detail
    VBR_MaxBitrate = 320,

    // Use high quality setting (2) for excellent audio fidelity
    VBR_Quality = 2,

    // Set encoder algorithm to high quality mode
    EncodingQuality = 2,

    // Joint stereo provides better compression for most music content
    ChannelsMode = MP3ChannelsMode.JointStereo,

    // Professional audio sample rate captures full audible spectrum
    SampleRate = 48000,

    // Add VBR header for better player compatibility and seeking
    EnableXingVBRTag = true
};
```

### Advanced Settings

- **CRC Protection**: Adds error detection capability at the cost of 16 bits per frame
- **Short Blocks**: Can be disabled to potentially increase quality at very low bitrates
- **Frequency Range**: Option to keep all frequencies (disables automatic lowpass filtering)
- **Voice Mode**: Experimental mode optimized for voice content

### Best Practices

1. **Choosing Rate Control for Different Applications**
   - Use CBR for streaming and real-time C# MP3 capturing
   - Use Quality-based VBR for archival and highest quality .NET MP3 recording
   - Use ABR when you need a balance between consistent size and quality
2. **Quality Settings for Different Use Cases**
   - For archival: Use VBR with quality 0-2
   - For general C# audio/video capture: VBR with quality 3-5 or CBR 192-256 kbps
   - For voice recording in .NET: Consider using voice encoding mode with lower bitrates
3. **Channel Mode Selection**
   - Use Joint Stereo for most music content
   - Use Standard Stereo for critical listening and complex stereo mixes
   - Use Mono for voice recordings or when bandwidth is critical
4. **Performance Optimization**
   - Use Fast encoding quality for real-time applications
   - Use Standard quality for general purpose encoding
   - Use High quality only for archival purposes where encoding time is not critical

### Notes on Default Values

The class constructor sets these default values:

- CBR_Bitrate = 192 kbps
- VBR_MinBitrate = 96 kbps
- VBR_MaxBitrate = 192 kbps
- VBR_Quality = 6
- EncodingQuality = 6
- SampleRate = 44100 Hz
- ChannelsMode = MP3ChannelsMode.StandardStereo
- VBR_Mode = true

---END OF PAGE---

# Local File: .\dotnet\general\audio-encoders\opus.md

---
title: Optimizing OPUS Audio Encoding in .NET Applications
description: Comprehensive guide to implementing and optimizing OPUS audio encoding in .NET applications using VisioForge SDKs. Learn how to configure bitrate control modes, adjust complexity settings, and implement proper frame durations for high-quality, bandwidth-efficient audio streaming and storage solutions across various application scenarios from VoIP to music streaming.
sidebar_label: OPUS
---

# Mastering OPUS Audio Encoding in .NET Applications

[!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

## Introduction to OPUS Audio Encoding

OPUS stands as one of the most versatile and efficient audio codecs available for modern software development. VisioForge .NET SDKs include a royalty-free OPUS encoder that transforms audio into the highly adaptable Opus format.
This encoded audio can be encapsulated in various containers including Ogg, Matroska, WebM, or RTP streams, making it ideal for both streaming applications and stored media.

Developed by the Internet Engineering Task Force (IETF), OPUS combines the best elements of the SILK and CELT codecs to deliver exceptional performance across a wide range of audio requirements. The codec excels in both speech and music encoding at bitrates from as low as 6 kbps to as high as 510 kbps, offering developers remarkable flexibility in balancing quality against bandwidth constraints.

## Why Choose OPUS for Your .NET Applications

OPUS has become the preferred choice for many audio applications for several compelling reasons:

- **Low Latency**: With encoding delays as low as 5ms, OPUS is perfect for real-time communication applications
- **Adaptive Bitrate**: Seamlessly switches between speech and music optimization
- **Wide Bitrate Range**: Functions effectively from 6 kbps to 510 kbps
- **Superior Compression**: Offers better quality than MP3, AAC, and other codecs at equivalent bitrates
- **Open Standard**: Royalty-free and open-source, reducing licensing concerns
- **Cross-Platform Support**: Works across all major platforms and browsers

These advantages make OPUS particularly valuable for developers building applications that require efficient audio streaming, VoIP solutions, or any scenario where audio quality and bandwidth efficiency are crucial considerations.

## Implementing OPUS in Cross-Platform .NET Applications

[!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"]

When working with VisioForge's cross-platform X-engines, developers can leverage the [OPUSEncoderSettings](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.X.AudioEncoders.OPUSEncoderSettings.html) class to configure OPUS encoding parameters precisely for their application needs.
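Throughout this page, OPUS parameters are set on an `OPUSEncoderSettings` instance. As a quick orientation, a minimal real-time-leaning configuration might look like the following sketch; it uses only members that appear in the examples later in this guide, and the chosen values (64 Kbps, complexity 5) are illustrative starting points rather than recommendations from the SDK vendor.

```csharp
// Minimal OPUS configuration sketch for a real-time use case.
// Values are illustrative; tune them against your latency and bandwidth budget.
var opus = new OPUSEncoderSettings
{
    RateControl = OPUSRateControl.CBR, // predictable bandwidth for live transmission
    Bitrate = 64,                      // Kbps; adequate for high-quality mono speech
    Complexity = 5                     // moderate CPU load, suitable for real-time encoding
};
```

The sections below explain when each rate control mode and parameter is the right choice.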
### Essential OPUS Encoder Configuration Properties

To achieve optimal results with the OPUS encoder, understanding and configuring these key properties is essential:

- **Bitrate**: Sets the target bitrate in Kbps, determining the balance between quality and file size
- **Rate Control Mode**: Selects between Variable Bitrate (VBR), Constant Bitrate (CBR), or Constrained Variable Bitrate (CVBR)
- **Complexity**: Controls encoding complexity on a scale from 0-10, where higher values produce better quality at the expense of increased CPU usage
- **Frame Duration**: Configures the frame size (2.5, 5, 10, 20, 40, or 60ms), with shorter frames providing lower latency at the cost of encoding efficiency
- **Application Type**: Optimizes for either voice or music content, allowing the encoder to apply specialized techniques
- **Forward Error Correction**: Enables packet loss resilience for streaming applications
- **DTX (Discontinuous Transmission)**: Reduces bandwidth during silence periods

Each of these parameters can significantly impact audio quality, processing requirements, and bandwidth consumption, making them critical considerations for developers optimizing for specific application scenarios.

## Understanding Bitrate Control Modes in Depth

One of the most important decisions when implementing OPUS encoding is selecting the appropriate bitrate control strategy. OPUS offers three primary modes, each with distinct advantages for different use cases.

### Variable Bitrate (VBR)

VBR represents the most efficient approach for quality optimization, allowing the encoder to dynamically adjust bitrate based on audio complexity. This results in higher bitrates for complex passages and lower bitrates for simpler content.

```csharp
// Create an instance of the OPUSEncoderSettings class
var opus = new OPUSEncoderSettings();

// Set rate control mode to VBR
opus.RateControl = OPUSRateControl.VBR;

// Set audio bitrate for the codec (in Kbps)
opus.Bitrate = 128;
```

**Best for**: On-demand audio streaming, podcast distribution, music applications, and any scenario where consistent bandwidth isn't a primary concern.

**Key advantage**: Provides the highest quality-to-size ratio by allocating more bits to complex audio sections.

### Constant Bitrate (CBR)

CBR mode attempts to maintain a consistent bitrate throughout the encoding process. While OPUS is inherently a variable bitrate codec, its CBR implementation keeps fluctuations minimal, typically within 5% of the target.

```csharp
// Create an instance of the OPUSEncoderSettings class
var opus = new OPUSEncoderSettings();

// Set rate control mode to CBR
opus.RateControl = OPUSRateControl.CBR;

// Set audio bitrate for the codec (in Kbps)
opus.Bitrate = 128;
```

**Best for**: Live streaming applications, VoIP systems, videoconferencing, and scenarios where network bandwidth predictability is critical.

**Key advantage**: Maintains consistent bandwidth utilization, making it easier to plan network capacity and ensure reliable transmission.

### Constrained Variable Bitrate (CVBR)

CVBR offers a middle-ground approach, allowing bitrate variations based on content complexity while imposing constraints to prevent extreme fluctuations. This provides many of VBR's quality benefits while keeping bandwidth requirements more predictable.

```csharp
// Create an instance of the OPUSEncoderSettings class
var opus = new OPUSEncoderSettings();

// Set rate control mode to Constrained VBR
opus.RateControl = OPUSRateControl.ConstrainedVBR;

// Set audio bitrate for the codec (in Kbps)
opus.Bitrate = 128;
```

**Best for**: Adaptive streaming applications, mixed-content broadcasting, and scenarios where quality is important but bandwidth constraints still exist.
**Key advantage**: Balances quality optimization with reasonable bandwidth predictability.

## Bitrate Selection Guidelines

Setting an appropriate bitrate involves balancing quality requirements against bandwidth limitations. For OPUS encoding, consider these channel-specific recommendations:

**For Mono Audio:**

- 6-12 kbps: Acceptable for low-bitrate speech
- 16-24 kbps: Good quality speech
- 32-64 kbps: High-quality speech and acceptable music
- 64-128 kbps: High-quality music

**For Stereo Audio:**

- 16-32 kbps: Low-quality stereo
- 48-64 kbps: Good quality stereo speech
- 64-128 kbps: Standard quality stereo music
- 128-256 kbps: High-quality stereo music

While OPUS can technically support bitrates up to 510 kbps, most applications achieve excellent results well below 192 kbps due to the codec's exceptional efficiency.

## Practical Implementation Examples

### Implementing OPUS in Video Capture Applications

The following example demonstrates how to add OPUS output to a Video Capture SDK core instance:

```csharp
// Create a Video Capture SDK core instance
var core = new VideoCaptureCoreX();

// Create an OPUS output instance
var opusOutput = new OPUSOutput("output.opus");

// Set the bitrate and rate control mode
opusOutput.Audio.RateControl = OPUSRateControl.CBR;
opusOutput.Audio.Bitrate = 128;

// Add the OPUS output
core.Outputs_Add(opusOutput, true);
```

### Configuring OPUS for Video Editing Workflows

When working with the Video Edit SDK, you can configure OPUS as your output format:

```csharp
// Create a Video Edit SDK core instance
var core = new VideoEditCoreX();

// Create an OPUS output instance
var opusOutput = new OPUSOutput("output.opus");

// Set the bitrate for high-quality music encoding
opusOutput.Audio.RateControl = OPUSRateControl.VBR;
opusOutput.Audio.Bitrate = 192;

// Set the output format
core.Output_Format = opusOutput;
```

### Creating OPUS Outputs with Media Blocks SDK

The Media Blocks SDK offers flexible options for creating OPUS outputs in
different container formats:

```csharp
// Create an OPUS encoder settings instance with specific configuration
var opusSettings = new OPUSEncoderSettings
{
    Bitrate = 128,
    RateControl = OPUSRateControl.VBR,
    Complexity = 8
};

// Create an Ogg OPUS output instance
var oggOpusOutput = new OGGOpusOutputBlock("output.ogg", opusSettings);

// Alternatively, create a WebM OPUS output
var webmOpusOutput = new WebMOpusOutputBlock("output.webm", opusSettings);
```

## Performance Optimization Tips

To achieve the best results with OPUS encoding in your .NET applications:

1. **Match Complexity to Your Hardware**: For real-time applications on limited hardware, use lower complexity values (3-6). For offline encoding or on powerful systems, higher values (7-10) will yield better quality.
2. **Select Appropriate Frame Duration**: Shorter frames (2.5-10ms) minimize latency for real-time communication, while longer frames (20-60ms) improve compression efficiency for music and stored content.
3. **Consider Input Sample Rate**: OPUS performs optimally with 48kHz input. If your source is at a different sample rate, consider resampling to 48kHz before encoding.
4. **Optimize for Content Type**: Use the Application property to tell the encoder whether you're primarily encoding speech or music for content-specific optimizations.
5. **Enable DTX for Speech**: For voice communications with frequent silence, enabling DTX can significantly reduce bandwidth requirements without noticeable quality impact.

## Conclusion

The OPUS codec offers .NET developers an exceptional tool for creating high-quality, bandwidth-efficient audio applications. With VisioForge's SDKs, implementing OPUS encoding becomes straightforward while still providing the flexibility to fine-tune every aspect of the encoding process.
By understanding the bitrate control modes, selecting appropriate parameters, and following the implementation examples provided, you can leverage OPUS to deliver superior audio experiences in your .NET applications, whether you're building real-time communication tools, media players, or content creation software.

---END OF PAGE---

# Local File: .\dotnet\general\audio-encoders\speex.md

---
title: Speex Audio Encoder Integration for .NET
description: Learn how to implement Speex speech compression in .NET applications. Master audio capture, encoding, and streaming with optimized settings for voice applications. Includes code examples and best practices for developers.
sidebar_label: Speex
---

# Speex Audio Encoder for .NET

[!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

## Introduction to Speex

Speex is a patent-free audio codec designed specifically for speech encoding. Whether you need to capture, edit, or record audio in C#, Speex provides excellent compression while maintaining voice quality across various bitrates. VisioForge integrates this encoder into its .NET SDKs, offering developers flexible configuration options for speech-based applications.

The codec is particularly well-suited for C# developers looking to implement high-quality audio capture and recording features in their applications.
## Core Functionality The Speex encoder in VisioForge SDKs supports: - Multiple frequency bands for different quality levels - Variable and fixed bitrate encoding - Voice activity detection and silence compression - Adjustable complexity and quality settings - Cross-platform compatibility across Windows, macOS, and Linux - Seamless integration with dotnet applications ## Cross-platform Implementation [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] ### Encoder Modes Speex offers four operation modes optimized for different frequency ranges: | Mode | Value | Optimal Sample Rate | |------|-------|-------------------| | Auto | 0 | Automatic selection based on input | | Ultra Wide Band | 1 | 32 kHz | | Wide Band | 2 | 16 kHz | | Narrow Band | 3 | 8 kHz | The encoder automatically adjusts internal parameters based on the selected mode. For most speech applications, Wide Band (mode 2) offers an excellent balance between quality and bandwidth usage. ## Technical Specifications ### Supported Sample Rates Speex works with three standard sampling frequencies: - 8,000 Hz - Best for telephone-quality audio (Narrow Band) - 16,000 Hz - Recommended for most voice applications (Wide Band) - 32,000 Hz - Highest quality speech encoding (Ultra Wide Band) ### Channel Configuration The encoder handles both: - Mono (1 channel) - Ideal for speech recordings - Stereo (2 channels) - For multi-speaker or immersive audio ## Rate Control Methods ### Quality-Based Encoding For consistent perceptual quality, use the `Quality` parameter: ```csharp var settings = new SpeexEncoderSettings { Quality = 8.0f, // Range from 0 (lowest) to 10 (highest) VBR = false // Fixed quality mode }; ``` Higher quality values produce better audio at the expense of increased file size. Most speech applications work well with quality values between 5-8. 
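As a quick illustration of the encoder-mode table above, the mode-to-sample-rate pairing can be sketched as a small helper. Note that `SpeexBand` and `SpeexRates` are hypothetical names for this sketch, not VisioForge SDK types; only the numeric values come from the table.

```csharp
using System;

// Illustrative only: mirrors the mode table above (Auto=0, UWB=1, WB=2, NB=3).
// These are NOT VisioForge SDK types.
public enum SpeexBand { Auto = 0, UltraWideBand = 1, WideBand = 2, NarrowBand = 3 }

public static class SpeexRates
{
    // Optimal input sample rate (Hz) for each band; Auto returns 0,
    // meaning "let the encoder select based on the input".
    public static int OptimalSampleRate(SpeexBand band) => band switch
    {
        SpeexBand.UltraWideBand => 32000,
        SpeexBand.WideBand      => 16000,
        SpeexBand.NarrowBand    => 8000,
        _                       => 0
    };
}
```

For most speech work, Wide Band at 16 kHz remains the recommended pairing, as noted above.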
### Variable Bit Rate (VBR) VBR dynamically adjusts the bitrate based on speech complexity: ```csharp var settings = new SpeexEncoderSettings { VBR = true, Quality = 8.0f // Target quality level }; ``` This approach typically saves bandwidth while maintaining consistent perceived quality, making it ideal for streaming applications. ### Average Bit Rate (ABR) ABR maintains a target bitrate over time while allowing quality fluctuations: ```csharp var settings = new SpeexEncoderSettings { ABR = 15.0f, // Target bitrate in kbps VBR = true // Required for ABR mode }; ``` This option works well when you need predictable file sizes or bandwidth usage. ### Fixed Bitrate Encoding For consistent data rates throughout the encoding process: ```csharp var settings = new SpeexEncoderSettings { Bitrate = 24.6f, // Fixed rate in kbps VBR = false }; ``` Supported bitrates range from 2.15 kbps to 24.6 kbps: - 2.15 kbps - Ultra-compressed speech (limited quality) - 3.95 kbps - Low bandwidth voice - 5.95 kbps - Basic speech clarity - 8.00 kbps - Standard voice quality - 11.0 kbps - Good speech reproduction - 15.0 kbps - Near-transparent speech - 18.2 kbps - High-quality voice - 24.6 kbps - Maximum quality speech ## Voice Optimization Features ### Voice Activity Detection (VAD) VAD identifies the presence of speech in audio signals: ```csharp var settings = new SpeexEncoderSettings { VAD = true, // Enable voice detection DTX = true // Recommended with VAD }; ``` This feature improves bandwidth efficiency by focusing encoding resources on actual speech segments. ### Discontinuous Transmission (DTX) DTX reduces data transmission during silence periods: ```csharp var settings = new SpeexEncoderSettings { DTX = true // Enable silence compression }; ``` For VoIP and real-time communications, enabling DTX can significantly reduce bandwidth requirements. 
### Encoding Complexity The `Complexity` setting balances CPU usage against encoding quality: ```csharp var settings = new SpeexEncoderSettings { Complexity = 3 // Range: 1 (fastest) to 10 (highest quality) }; ``` Lower values prioritize speed and reduce CPU load, while higher values improve audio quality at the cost of performance. ## Implementation Examples ### Checking Encoder Availability Always verify encoder availability before implementing Speex in your C# application: ```csharp if (!SpeexEncoderSettings.IsAvailable()) { throw new InvalidOperationException("Speex encoder not available on this system."); } ``` ### Basic Configuration for Audio Capture Here's how to set up basic Speex encoding for audio capture in dotnet: ```csharp var encoderSettings = new SpeexEncoderSettings { Mode = SpeexEncoderMode.WideBand, SampleRate = 16000, Channels = 1, Quality = 7.0f }; ``` ### Optimized for Voice Recording For voice recording applications in .NET, use these optimized settings: ```csharp var voiceSettings = new SpeexEncoderSettings { Mode = SpeexEncoderMode.WideBand, SampleRate = 16000, Channels = 1, VBR = true, VAD = true, DTX = true, Quality = 6.0f, Complexity = 4 }; ``` ### Highest Quality Audio Capture For maximum quality audio capture in dotnet: ```csharp var highQualitySettings = new SpeexEncoderSettings { Mode = SpeexEncoderMode.UltraWideBand, SampleRate = 32000, Channels = 2, Bitrate = 24.6f, Complexity = 8 }; ``` ## SDK Integration ### Video Capture SDK Integration Learn how to capture audio using Speex in your C# application: ```csharp using VisioForge.Core.Types.Events; using VisioForge.Core.Types.X.AudioEncoders; using VisioForge.Core.Types.X.Output; using VisioForge.Core.Types.X.Sources; // Create a Video Capture SDK core instance var core = new VideoCaptureCoreX(); // Set the audio input device, filter by API var api = AudioCaptureDeviceAPI.DirectSound; var audioInputDevice = (await DeviceEnumerator.Shared.AudioSourcesAsync()).FirstOrDefault(x => x.API == api); if 
(audioInputDevice == null) { MessageBox.Show("No audio input device found."); return; } var audioInput = new AudioCaptureDeviceSourceSettings(api, audioInputDevice, audioInputDevice.GetDefaultFormat()); core.Audio_Source = audioInput; // Configure Speex settings var speexSettings = new SpeexEncoderSettings { Mode = SpeexEncoderMode.WideBand, SampleRate = 16000, Channels = 1, VBR = true, Quality = 7.0f }; var speexOutput = new SpeexOutput("output.spx", speexSettings); // Add the Speex output core.Outputs_Add(speexOutput, true); // Set the audio record mode core.Audio_Record = true; core.Audio_Play = false; // Start the capture await core.StartAsync(); // Stop after 10 seconds await Task.Delay(10000); // Stop the capture await core.StopAsync(); ``` ### Video Edit SDK Integration Edit and process audio files using Speex in dotnet: ```csharp using VisioForge.Core.Types.Events; using VisioForge.Core.Types.X.AudioEncoders; using VisioForge.Core.Types.X.Output; using VisioForge.Core.Types.X.Sources; // Create a Video Edit SDK core instance var core = new VideoEditCoreX(); // Add the audio source file var audioFile = new AudioFileSource(@"c:\samples\!audio.mp3"); core.Input_AddAudioFile(audioFile, null); // Configure Speex settings var speexSettings = new SpeexEncoderSettings { Mode = SpeexEncoderMode.WideBand, SampleRate = 16000, Channels = 1, VBR = true, Quality = 7.0f }; var speexOutput = new SpeexOutput(@"output.spx", speexSettings); // Add the Speex output core.Output_Format = speexOutput; // Catch OnStop event core.OnStop += (s, e) => { // Handle the stop event here MessageBox.Show("Editing complete."); }; core.OnProgress += (s, e) => { // Handle progress updates here Debug.WriteLine($"Progress: {e.Progress}%"); }; core.OnError += (s, e) => { // Handle errors here Debug.WriteLine($"Error: {e.Message}"); }; // Start the editing core.Start(); ``` ### Media Blocks SDK Integration Process audio streams using Speex in your .NET application: ```csharp using 
VisioForge.Core; using VisioForge.Core.MediaBlocks; using VisioForge.Core.MediaBlocks.AudioEncoders; using VisioForge.Core.MediaBlocks.Sinks; using VisioForge.Core.MediaBlocks.Sources; using VisioForge.Core.Types.Events; using VisioForge.Core.Types.X.AudioEncoders; using VisioForge.Core.Types.X.Output; using VisioForge.Core.Types.X.Sources; // Create a new pipeline var pipeline = new MediaBlocksPipeline(); // Add universal source to read audio file var sourceSettings = await UniversalSourceSettings.CreateAsync(@"c:\samples\!audio.mp3", renderVideo: false, renderAudio: true); var source = new UniversalSourceBlock(sourceSettings); // Add Speex output var speexSettings = new SpeexEncoderSettings { Mode = SpeexEncoderMode.NarrowBand, SampleRate = 8000, DTX = true, VAD = true }; var speexOutput = new OGGSpeexOutputBlock("output.spx", speexSettings); // Connect pipeline.Connect(source.AudioOutput, speexOutput.Input); // Add OnStop event handler pipeline.OnStop += (sender, e) => { // Do something when the pipeline stops MessageBox.Show("Conversion complete"); }; // Start await pipeline.StartAsync(); ``` ## Performance Optimization When implementing Speex encoding, consider these optimization strategies: 1. **Match sample rate to content** - Use Narrow Band (8 kHz) for telephone audio, Wide Band (16 kHz) for most voice applications, and Ultra Wide Band (32 kHz) only when maximum quality is required 2. **Enable VBR with VAD/DTX** for speech content - This combination provides optimal bandwidth efficiency for typical voice recordings 3. **Adjust complexity based on platform** - Mobile applications may benefit from lower complexity values (2-4), while desktop applications can use higher values (5-8) 4. **Use ABR for streaming** - Average Bit Rate provides predictable bandwidth usage while maintaining quality flexibility 5. 
**Test different quality settings** - Often a quality setting of 5-7 provides excellent results without excessive file size ## Use Cases Speex encoding excels in these developer scenarios: - VoIP applications and internet telephony - Voice chat features in games and collaboration tools - Podcast creation and distribution - Speech recognition preprocessing - Voice note applications - Audio archiving of speech content ## Installation and Setup To get started with Speex in your dotnet application, check the main installation guide [here](../../install/index.md). ## Common Use Cases ### Audio Capture and Recording For capture and recording applications, use these optimized settings: ```csharp var captureSettings = new SpeexEncoderSettings { Mode = SpeexEncoderMode.WideBand, SampleRate = 16000, Channels = 1, VBR = true, VAD = true, DTX = true, Quality = 6.0f, Complexity = 3 }; ``` ### Voice Over IP Applications For VoIP applications, prioritize low latency and bandwidth efficiency: ```csharp var voipSettings = new SpeexEncoderSettings { Mode = SpeexEncoderMode.NarrowBand, SampleRate = 8000, Channels = 1, VBR = true, VAD = true, DTX = true, Quality = 5.0f, Complexity = 2 }; ``` ## Licensing and Community Speex is released under a BSD-style license, making it free for both commercial and non-commercial use. The codec is maintained by the Xiph.Org open-source community, although active development has shifted to its successor, Opus, which Xiph now recommends for new projects. ## Frequently Asked Questions ### What is the best bitrate for voice recording? For most voice applications, a bitrate between 8-15 kbps provides excellent quality while maintaining reasonable file sizes. Use VBR mode for optimal results. ### How does Speex compare to other codecs? Speex offers superior speech quality compared to many other codecs at similar bitrates, especially for voice content. It's particularly effective for low-bitrate applications. ### Can I use Speex for music encoding? While Speex can encode music, it's specifically optimized for speech. 
For music content, consider using other codecs like AAC or MP3. ## Conclusion The VisioForge implementation of Speex provides .NET developers with a powerful tool for capturing, editing, and recording audio in C# applications. Whether you're building a new voice capture application or enhancing an existing one, Speex delivers exceptional results with minimal resource usage. The codec's flexibility and performance make it an excellent choice for any .NET developer working with audio processing. ---END OF PAGE--- # Local File: .\dotnet\general\audio-encoders\vorbis.md --- title: Vorbis Audio Encoding Guide for .NET Development description: Master Vorbis audio encoding in .NET applications with practical implementation strategies, quality optimization techniques, and cross-platform considerations. Learn to balance audio quality with file size for streaming and multimedia projects. sidebar_label: Vorbis --- # Vorbis Audio Encoding for .NET Developers [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Introduction to Vorbis in VisioForge SDK The VisioForge SDK suite offers powerful Vorbis audio encoding capabilities that enable developers to implement high-quality audio compression in their .NET applications. Vorbis, an open-source audio codec, delivers exceptional audio fidelity with efficient compression ratios, making it ideal for streaming applications, multimedia content, and web audio. This guide will help you navigate the various Vorbis implementation options available in the VisioForge SDK ecosystem, providing practical code examples and optimization strategies for different use cases. 
## Cross-Platform Vorbis Encoder [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] VisioForge's Vorbis implementations work across multiple platforms, giving you flexibility in deployment environments. The cross-platform components are specifically designed to function consistently across different operating systems. ### Implementation Options The SDK provides three distinct approaches to Vorbis encoding, each tailored to specific development scenarios: #### 1. WebM Container with Vorbis Audio The [WebM output](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.Output.WebMOutput.html) implementation encapsulates Vorbis audio within the WebM container format. This option is particularly well-suited for web-based applications and HTML5 video projects where broad browser compatibility is required. **Availability:** Windows platforms only #### 2. OGG Vorbis Dedicated Output For audio-focused applications, the [OGG Vorbis output](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.Output.OGGVorbisOutput.html) provides a specialized encoder designed specifically for the OGG container format. This implementation offers more detailed control over audio encoding parameters. **Availability:** Windows platforms only #### 3. Flexible VorbisEncoderSettings The [VorbisEncoderSettings](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.X.AudioEncoders.VorbisEncoderSettings.html) implementation provides the most versatile approach, supporting multiple container formats and offering extensive configuration options. This is the recommended choice for cross-platform development projects. **Availability:** All supported platforms ### Rate Control Strategies Choosing the appropriate rate control mode is crucial for balancing audio quality against file size requirements. 
Vorbis encoding in VisioForge supports two primary approaches: #### Quality-Based Variable Bit Rate (VBR) Quality-based VBR is the recommended approach for most applications, as it dynamically adjusts bitrate to maintain consistent perceptual quality throughout the audio stream. +++ WebMOutput WebMOutput implements a simplified quality-based approach with an easy-to-understand scale: ```cs // Create and configure WebM output with high-quality Vorbis audio var webmOutput = new WebMOutput(); // Quality range: 20 (lowest) to 100 (highest) // Values 70-80 provide excellent quality for most content webmOutput.Audio_Quality = 80; // Higher values produce better audio quality with larger files // Lower values prioritize file size over audio fidelity ``` Key considerations: - Quality setting directly impacts perceived audio quality and file size - Values around 70-80 work well for most professional content - Lower settings (40-60) may be suitable for voice-only recordings +++ OGGVorbisOutput OGGVorbisOutput offers more explicit quality mode selection: ```cs // Initialize OGG Vorbis output for quality-focused encoding var oggOutput = new OGGVorbisOutput(); // Set the encoding mode to quality-based VBR oggOutput.Mode = VorbisMode.Quality; // Configure quality level (range: 20-100) // 80: High quality for music and complex audio // 60: Good quality for general purpose use // 40: Acceptable quality for voice recordings oggOutput.Quality = 80; ``` This implementation gives you direct control over the quality-to-size tradeoff, making it ideal for applications with varying content types. 
+++ VorbisEncoderSettings VorbisEncoderSettings uses the native Vorbis quality scale: ```cs // Create Vorbis encoder with quality-based rate control var vorbisEncoder = new VorbisEncoderSettings(); // Set rate control mode to quality-based VBR vorbisEncoder.RateControl = VorbisEncoderRateControl.Quality; // Configure quality level using Vorbis scale (-1 to 10) // -1: Very low quality (~45 kbps) // 3: Good quality (~112 kbps) // 5: Very good quality (~160 kbps) // 8: Excellent quality (~224 kbps) // 10: Highest quality (~320 kbps) vorbisEncoder.Quality = 5; ``` The VorbisEncoderSettings implementation provides the most precise quality control, using the established Vorbis quality scale that audio engineers are familiar with. +++ #### Bitrate-Constrained Encoding For scenarios with specific bandwidth limitations or target file sizes, bitrate-constrained encoding offers more predictable output sizes. +++ WebMOutput WebMOutput does not support explicit bitrate control for Vorbis audio. Developers should use the quality parameter instead and test to determine the resulting bitrates. +++ OGGVorbisOutput OGGVorbisOutput provides comprehensive bitrate management tools: ```cs // Set up OGG output with specific bitrate constraints var oggOutput = new OGGVorbisOutput(); // Enable bitrate-controlled encoding mode oggOutput.Mode = VorbisMode.Bitrate; // Configure bitrate parameters (all values in Kbps) oggOutput.MinBitRate = 96; // Minimum bitrate floor oggOutput.AvgBitRate = 160; // Target average bitrate oggOutput.MaxBitRate = 240; // Maximum bitrate ceiling // These settings create a controlled VBR encode that // averages 160 Kbps but can fluctuate between limits ``` This approach is ideal for streaming applications where bandwidth prediction is important. 
+++ VorbisEncoderSettings VorbisEncoderSettings offers the most detailed bitrate control options: ```cs // Initialize Vorbis encoder with bitrate constraints var vorbisEncoder = new VorbisEncoderSettings(); // Set rate control mode to bitrate-based vorbisEncoder.RateControl = VorbisEncoderRateControl.Bitrate; // Configure bitrate parameters (all values in Kbps) vorbisEncoder.Bitrate = 192; // Target average bitrate vorbisEncoder.MinBitrate = 128; // Minimum allowed bitrate vorbisEncoder.MaxBitrate = 256; // Maximum allowed bitrate // These settings are ideal for applications requiring // predictable file sizes or streaming bandwidth ``` The flexible bitrate controls allow for precise audio encoding tailored to specific delivery requirements. +++ Check the [VorbisEncoderBlock](../../mediablocks/AudioEncoders/index.md) and [OGGSinkBlock](../../mediablocks/Sinks/index.md) for more information. ### Best Practices for Developers To achieve optimal results with Vorbis encoding in your .NET applications, consider these developer-focused recommendations: #### Choosing the Right Encoding Mode 1. **Default choice: Quality-based VBR** - Produces consistent perceived quality across varying content - Automatically optimizes bitrate based on audio complexity - Simplifies configuration with a single quality parameter 2. **When to use Bitrate-constrained mode:** - Streaming applications with bandwidth limitations - Storage-constrained environments with fixed size allocations - Content delivery networks with predictable bandwidth requirements #### Recommended Settings for Common Use Cases
| Content Type | Recommended Settings |
|-------------|----------------------|
| Music (high quality) | WebM: Audio_Quality = 80<br>OGG: Quality = 80<br>VorbisEncoder: Quality = 6 |
| Voice recordings | WebM: Audio_Quality = 60<br>OGG: Quality = 60<br>VorbisEncoder: Quality = 3 |
| Mixed content | WebM: Audio_Quality = 70<br>OGG: Quality = 70<br>VorbisEncoder: Quality = 4 |
| Streaming audio | OGG: Mode = Bitrate, AvgBitRate = 128<br>VorbisEncoder: RateControl = Bitrate, Bitrate = 128 |
## Windows-only output [!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="VideoEditCore"] The `OGGVorbisOutput` class provides configuration and functionality for encoding audio using the Vorbis codec. ### Class Details ```csharp public sealed class OGGVorbisOutput : IVideoEditBaseOutput, IVideoCaptureBaseOutput ``` The class implements two interfaces: - `IVideoEditBaseOutput`: Enables use in video editing scenarios - `IVideoCaptureBaseOutput`: Enables use in video capture scenarios ### Bitrate Controls When operating in Bitrate mode, these properties control the output bitrate constraints: #### AvgBitRate - Type: `int` - Default Value: 128 (Kbps) - Description: Specifies the target average bitrate for the encoded audio stream. This value represents the general quality level and file size trade-off. #### MaxBitRate - Type: `int` - Default Value: 192 (Kbps) - Description: Defines the maximum allowed bitrate during encoding. Useful for ensuring the encoded audio doesn't exceed bandwidth constraints. #### MinBitRate - Type: `int` - Default Value: 64 (Kbps) - Description: Sets the minimum allowed bitrate during encoding. Helps maintain a baseline quality level even during simple audio passages. ### Quality Controls #### Quality - Type: `int` - Default Value: 80 - Valid Range: 10-100 - Description: When operating in Quality mode, this value determines the encoding quality. Higher values result in better audio quality but larger file sizes. 
#### Mode - Type: `VorbisMode` (enum) - Default Value: `VorbisMode.Bitrate` - Options: - `VorbisMode.Quality`: Encoding focuses on maintaining a consistent quality level - `VorbisMode.Bitrate`: Encoding focuses on maintaining specified bitrate constraints ### Constructor ```csharp public OGGVorbisOutput() ``` Initializes a new instance with default values: - MinBitRate: 64 kbps - AvgBitRate: 128 kbps - MaxBitRate: 192 kbps - Quality: 80 - Mode: VorbisMode.Bitrate ### Serialization Methods #### Save() ```csharp public string Save() ``` Serializes the current configuration to a JSON string, allowing settings to be saved and restored later. #### Load(string json) ```csharp public static OGGVorbisOutput Load(string json) ``` Creates a new instance with settings deserialized from the provided JSON string. ### Usage Examples #### Basic Usage with Default Settings ```csharp var oggOutput = new OGGVorbisOutput(); // Ready to use with default settings (Bitrate mode, 128kbps average) ``` #### Quality-Based Encoding ```csharp var oggOutput = new OGGVorbisOutput { Mode = VorbisMode.Quality, Quality = 90 // High quality setting }; ``` #### Constrained Bitrate Encoding ```csharp var oggOutput = new OGGVorbisOutput { Mode = VorbisMode.Bitrate, MinBitRate = 96, // Minimum 96kbps AvgBitRate = 160, // Target 160kbps MaxBitRate = 240 // Maximum 240kbps }; ``` #### Saving and Loading Configuration ```csharp // Save configuration var oggOutput = new OGGVorbisOutput(); string savedConfig = oggOutput.Save(); ``` ```csharp // Load configuration var loadedOutput = OGGVorbisOutput.Load(savedConfig); ``` #### Apply settings to core instances ```csharp var core = new VideoCaptureCore(); core.Output_Filename = "output.ogg"; core.Output_Format = oggOutput; ``` ```csharp var core = new VideoEditCore(); core.Output_Filename = "output.ogg"; core.Output_Format = oggOutput; ``` ## Performance Considerations When implementing Vorbis encoding in production environments: - Encoding quality directly 
impacts CPU usage; higher quality settings require more processing power - The VorbisEncoderSettings implementation offers the best balance of flexibility and performance - Pre-configured profiles can help standardize output quality across different content types - Consider multi-threaded encoding for batch processing applications ## Conclusion Vorbis encoding provides an excellent open-source solution for high-quality audio compression in .NET applications. By understanding the different implementation options and configuration strategies available in the VisioForge SDK, developers can effectively balance audio quality, file size, and performance requirements for their specific use cases. Whether you're building a streaming application, a media processing tool, or integrating audio capabilities into a larger software ecosystem, the Vorbis encoders in VisioForge's .NET SDKs offer the flexibility and performance needed for professional audio processing. ---END OF PAGE--- # Local File: .\dotnet\general\audio-encoders\wav.md --- title: WAV Audio Format Integration in .NET Applications description: Learn how to implement WAV audio processing in .NET applications with step-by-step examples. Discover best practices for sample rates, channel configuration, and format selection. Includes cross-platform implementation guides and code samples. sidebar_label: WAV --- # Implementing WAV Audio in .NET Applications [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## What is WAV Format? WAV (Waveform Audio File Format) functions as an uncompressed audio container format rather than a codec. 
It stores raw PCM (Pulse-Code Modulation) audio data in its native form. When working with VisioForge SDKs, the WAV output functionality allows developers to create high-quality audio files with configurable PCM settings. Since WAV preserves audio without compression, it maintains original sound quality at the cost of larger file sizes compared to compressed formats like MP3 or AAC. ## How WAV Files Work The WAV format stores audio samples in their raw form. When your application outputs to WAV format, it performs three key operations: 1. Organizing raw PCM audio data into the WAV container structure 2. Defining interpretation parameters (sample rate, bit depth, and channel count) 3. Generating appropriate WAV headers and metadata This uncompressed nature means file sizes are predictable and directly calculated from the audio parameters: ```text File size (bytes) = Sample Rate × Bit Depth × Channels × Duration / 8 ``` For example, a one-minute stereo WAV file sampled at 44.1kHz with 16-bit samples consumes approximately 10.1 MB: ```text 44100 × 16 × 2 × 60 / 8 = 10,584,000 bytes ``` ## Cross-Platform WAV Implementation [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] ### Key Features - Flexible audio format configuration (default: S16LE) - Adjustable sample rates ranging from 8kHz to 192kHz - Support for both mono and stereo channel configurations - Consistent audio quality across different platforms ### Configuration Parameters #### Audio Format Options The WAV encoder supports multiple audio formats through the `AudioFormatX` enum, with S16LE (16-bit Little-Endian) serving as the default format for maximum compatibility. 
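The file-size formula given earlier is easy to fold into a small helper for estimating storage requirements. This is a hypothetical utility for illustration, not part of the VisioForge SDK:

```csharp
using System;

// Hypothetical helper: estimates uncompressed WAV payload size from the
// formula above: sampleRate × bitDepth × channels × seconds / 8.
static class WavMath
{
    public static long FileSizeBytes(int sampleRate, int bitDepth, int channels, double seconds)
        => (long)(sampleRate * (long)bitDepth * channels * seconds / 8);
}

class Demo
{
    static void Main()
    {
        // One minute of 44.1 kHz / 16-bit stereo PCM
        Console.WriteLine(WavMath.FileSizeBytes(44100, 16, 2, 60)); // 10584000 bytes (~10.1 MB)
    }
}
```

The estimate covers only PCM payload; the actual file is slightly larger due to the WAV header (44 bytes for a canonical header).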
#### Sample Rate Selection - Available range: 8,000 Hz to 192,000 Hz - Default setting: 48,000 Hz - Increment values: 8,000 Hz steps #### Channel Configuration - Available options: 1 (mono) or 2 (stereo) - Default setting: 2 (stereo) ### Implementation Examples #### Basic Implementation ```csharp // Initialize WAV encoder with default settings var wavEncoder = new WAVEncoderSettings(); ``` ```csharp // Initialize with custom configuration var customWavEncoder = new WAVEncoderSettings( format: AudioFormatX.S16LE, sampleRate: 44100, channels: 2 ); ``` #### Integration with Video Capture SDK ```csharp // Initialize Video Capture SDK core var core = new VideoCaptureCoreX(); // Create WAV output with file path var wavOutput = new WAVOutput("output.wav"); // Add output to the capture pipeline core.Outputs_Add(wavOutput, true); ``` #### Integration with Video Edit SDK ```csharp // Initialize Video Edit SDK core var core = new VideoEditCoreX(); // Create WAV output instance var wavOutput = new WAVOutput("output.wav"); // Configure core to use WAV output core.Output_Format = wavOutput; ``` #### Media Blocks Pipeline Configuration ```csharp // Initialize WAV encoder settings var wavSettings = new WAVEncoderSettings(); // Create encoder block var wavOutput = new WAVEncoderBlock(wavSettings); // Add File Sink block for output var fileSink = new FileSinkBlock("output.wav"); // Connect encoder to file sink in pipeline pipeline.Connect(wavOutput.Output, fileSink.Input); // pipeline is MediaBlocksPipeline ``` #### Verifying Encoder Availability ```csharp if (WAVEncoderSettings.IsAvailable()) { // Encoder is available, proceed with encoding var encoder = new WAVEncoderSettings(); // Configure and use encoder } else { // Handle unavailability Console.WriteLine("WAV encoder is not available on this system"); } ``` #### Advanced Configuration ```csharp var wavEncoder = new WAVEncoderSettings { Format = AudioFormatX.S16LE, SampleRate = 96000, Channels = 1 // Configure for mono audio }; 
``` #### Creating an Encoder Block ```csharp var settings = new WAVEncoderSettings(); MediaBlock encoderBlock = settings.CreateBlock(); // Integrate the encoder block into your media pipeline ``` #### Retrieving Supported Parameters ```csharp // Get list of supported audio formats IEnumerable formats = WAVEncoderSettings.GetFormatList(); // Get available sample rates var settings = new WAVEncoderSettings(); int[] sampleRates = settings.GetSupportedSampleRates(); // Returns array ranging from 8000 to 192000 in 8000 Hz increments // Get supported channel configurations int[] channels = settings.GetSupportedChannelCounts(); // Returns [1, 2] for mono and stereo options ``` ## Windows-Specific WAV Implementation [!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="VideoEditCore"] ### Enumerating Available Audio Codecs ```csharp // core is an instance of VideoCaptureCore or VideoEditCore foreach (var codec in core.Audio_Codecs) { cbAudioCodecs.Items.Add(codec); } ``` ### Configuring Audio Settings ```csharp // Initialize ACM output for WAV var acmOutput = new ACMOutput(); // Configure audio parameters acmOutput.Channels = 2; acmOutput.BPS = 16; acmOutput.SampleRate = 44100; acmOutput.Name = "PCM"; // codec name // Set as output format core.Output_Format = acmOutput; ``` ### Specifying Output File ```csharp // Set output file path core.Output_Filename = "output.wav"; ``` ### Starting Processing ```csharp // Begin capture or conversion operation await core.StartAsync(); ``` ## Best Practices for WAV Implementation ### Sample Rate Selection Guidelines The sample rate significantly impacts audio quality and file size: - 8kHz: Suitable for basic voice recordings and telephony applications - 16kHz: Improved voice quality for speech recognition systems - 44.1kHz: Standard for CD-quality audio and music production - 48kHz: Professional audio standard used in video production - 96kHz+: High-resolution audio for professional sound 
engineering For most applications, 44.1kHz or 48kHz provides excellent quality without excessive file sizes. ### Channel Configuration Strategy Your channel selection should align with content requirements: - **Mono (1 channel)**: Ideal for voice recordings, podcasts, or when storage space is limited - **Stereo (2 channels)**: Essential for music, spatial audio, or any content where directional sound matters ### Format Selection Considerations When selecting audio formats: - S16LE (16-bit Little-Endian) offers the best compatibility across platforms - Higher bit depths (24-bit, 32-bit) provide greater dynamic range for professional audio work - Consider your target system's requirements and hardware capabilities ## Technical Limitations and Considerations ### File Size Implications WAV files grow linearly with recording duration, which can present challenges: - A 10-minute stereo recording at 44.1kHz/16-bit requires approximately 100MB - For mobile or web applications, consider implementing size limits or compression options - When streaming is required, compressed formats may be more appropriate ### Performance Factors WAV processing has specific performance characteristics: - Lower CPU usage during encoding compared to compressed formats - Higher disk I/O requirements due to larger data volumes - Memory buffer considerations for long recordings ## Conclusion The WAV format provides developers with a reliable, high-quality audio output option within VisioForge .NET SDKs. Its uncompressed nature ensures pristine audio quality, making it ideal for applications where audio fidelity is paramount. By leveraging the configuration options and implementation approaches outlined above, developers can effectively integrate WAV audio functionality into their .NET applications while maintaining optimal performance and quality. 
For most professional audio applications, WAV remains the format of choice during production and editing stages, even if compressed formats are used for final distribution. The flexibility and cross-platform compatibility of the VisioForge SDK's WAV implementation make it a valuable tool in any developer's audio processing toolkit. ---END OF PAGE--- # Local File: .\dotnet\general\audio-encoders\wavpack.md --- title: WavPack Audio Encoder Integration for .NET description: Master WavPack audio compression in .NET applications with detailed guidance on compression modes, quality settings, and real-world implementation examples. Learn to optimize audio encoding for your specific needs. sidebar_label: WavPack --- # WavPack Audio Encoder for .NET Applications [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] ## Introduction to WavPack WavPack is a powerful audio codec that offers both lossless and hybrid lossy compression capabilities, making it highly versatile for different application requirements. The VisioForge.Core library provides a robust implementation of this codec for .NET developers seeking high-quality audio compression solutions. With support for various quality levels, correction modes, and stereo encoding options, the WavPack encoder can handle multiple channel configurations while delivering excellent compression across a wide range of bitrates and sample rates. 
## Getting Started with WavPack ### Basic Configuration To begin using the WavPack encoder, you'll need to create an instance of the `WavPackEncoderSettings` class with your desired parameters: ```csharp var encoder = new WavPackEncoderSettings { Mode = WavPackEncoderMode.Normal, JointStereoMode = WavPackEncoderJSMode.Auto, CorrectionMode = WavPackEncoderCorrectionMode.Off, MD5 = false }; ``` This simple configuration uses balanced compression settings and automatic stereo encoding mode selection, suitable for most general use cases. ### Compression Modes Explained WavPack offers four distinct compression modes that balance processing speed against compression efficiency: ```csharp public enum WavPackEncoderMode { Fast = 1, // Prioritizes encoding speed Normal = 2, // Balanced compression (default) High = 3, // Higher compression ratio VeryHigh = 4 // Maximum compression } ``` For applications where file size is critical, you can implement higher compression settings: ```csharp var encoder = new WavPackEncoderSettings { Mode = WavPackEncoderMode.High, ExtraProcessing = 1 // Enables advanced filters for better compression }; ``` ## Quality Control Options ### Bitrate-Based Encoding The most straightforward method for controlling output quality is to specify a target bitrate: ```csharp var encoder = new WavPackEncoderSettings { Bitrate = 192000 // 192 kbps }; ``` Key specifications for bitrate control: - Valid range: 24,000 to 9,600,000 bits/second - Setting values below 24,000 disables lossy encoding - Enables the lossy encoding mode automatically ### Bits Per Sample Control For more precise quality control, especially when maintaining consistent quality across different sample rates is important: ```csharp var encoder = new WavPackEncoderSettings { BitsPerSample = 16.0 // Equivalent to 16-bit quality }; ``` Important notes: - Values below 2.0 disable lossy encoding - This approach maintains more consistent quality regardless of sample rate variations ## Advanced 
Encoding Features ### Stereo Encoding Options WavPack provides three methods for encoding stereo content, each with different characteristics: ```csharp var encoder = new WavPackEncoderSettings { JointStereoMode = WavPackEncoderJSMode.Auto }; ``` Available stereo encoding modes: - `Auto`: Intelligently selects the optimal encoding method based on content - `LeftRight`: Uses traditional left/right channel separation - `MidSide`: Implements mid/side encoding which often yields better compression for stereo material ### Hybrid Correction Mode One of WavPack's unique features is its hybrid mode, which generates a correction file alongside the main compressed file: ```csharp var encoder = new WavPackEncoderSettings { CorrectionMode = WavPackEncoderCorrectionMode.Optimized, Bitrate = 192000 // Required when using correction modes }; ``` The available correction options: - `Off`: Standard operation with no correction file - `On`: Generates a standard correction file - `Optimized`: Creates an optimization-focused correction file Note that correction modes only function when lossy encoding is active, making them ideal for applications where initial file size is important but future lossless restoration might be needed. 
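To relate the two lossy quality controls described above, note that a bits-per-sample target corresponds roughly to a bitrate of `bitsPerSample × sampleRate × channels`. The helper below is an illustrative sketch (not part of the SDK API) showing how a per-sample quality target can be translated into a `Bitrate` value that also enables a correction file:

```csharp
// Illustrative helper, not an SDK API: estimate the bitrate implied by a
// bits-per-sample target for a given stream layout.
static int EstimateBitrate(double bitsPerSample, int sampleRate, int channels)
{
    return (int)(bitsPerSample * sampleRate * channels);
}

// 4.0 bits per sample at 44.1 kHz stereo ≈ 352,800 bits/s, well within
// the encoder's supported 24,000–9,600,000 bits/s range.
int bitrate = EstimateBitrate(4.0, 44100, 2);

var encoder = new WavPackEncoderSettings
{
    Bitrate = bitrate,                                   // Lossy mode with an explicit rate
    CorrectionMode = WavPackEncoderCorrectionMode.On     // Keep a correction file for restoration
};
```

This kind of conversion is useful when you want the hybrid mode's fixed-bitrate behavior but prefer to reason about quality in per-sample terms.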
## Technical Specifications The WavPack encoder supports: - Sample rates from 6,000 Hz to 192,000 Hz - 1 to 8 audio channels - Optional MD5 hash storage of raw samples for verification - Additional processing options for quality enhancement Before implementation, you can verify encoder availability in your environment: ```csharp if (WavPackEncoderSettings.IsAvailable()) { // Configure and use the encoder var encoder = new WavPackEncoderSettings { Mode = WavPackEncoderMode.Normal, Bitrate = 192000, MD5 = true }; } ``` ## Implementation Examples ### Video Capture SDK Integration ```csharp // Initialize the Video Capture SDK core var core = new VideoCaptureCoreX(); // Create a WavPack output instance var wavPackOutput = new WavPackOutput("output.wv"); // Add the WavPack output to the capture pipeline core.Outputs_Add(wavPackOutput, true); ``` ### Video Edit SDK Integration ```csharp // Initialize the Video Edit SDK core var core = new VideoEditCoreX(); // Create a WavPack output instance var wavPackOutput = new WavPackOutput("output.wv"); // Set the output format core.Output_Format = wavPackOutput; ``` ### Media Blocks SDK Integration ```csharp // Configure WavPack encoder settings var wavPackSettings = new WavPackEncoderSettings(); // Create the encoder block var wavPackOutput = new WavPackEncoderBlock(wavPackSettings); // Create a file output destination var fileSink = new FileSinkBlock("output.wv"); // Connect the encoder to the file sink in the pipeline pipeline.Connect(wavPackOutput.Output, fileSink.Input); // pipeline is MediaBlocksPipeline ``` ## Optimization Strategies ### Performance vs. 
Quality For optimal encoder performance and quality balance: +++ Default - Use `Normal` mode for everyday encoding tasks - Enable `ExtraProcessing` only when encoding time isn't critical - Maintain `JointStereoMode` as `Auto` for most content types +++ Archival - Implement `High` or `VeryHigh` mode for archival purposes - Enable MD5 hash generation for content verification - Consider lossless encoding for critical audio preservation +++ Streaming - Use `Fast` mode for real-time encoding scenarios - Select an appropriate bitrate based on bandwidth constraints - Disable additional processing features to minimize latency +++ ## Best Practices When implementing WavPack in your applications: 1. **Balance quality and performance** by selecting the appropriate compression mode based on your use case 2. **Leverage hybrid mode** when distributing lossy files that may need lossless restoration later 3. **Consider format compatibility** with your target platforms and playback environments 4. **Test thoroughly** across different audio content types to ensure optimal settings ## Conclusion The WavPack encoder provides a versatile solution for audio compression in .NET applications. Whether you need archival-grade lossless compression or efficient lossy compression with future upgrade potential, the implementation in VisioForge's SDKs offers the flexibility and performance required by professional audio applications. By understanding the various configuration options and implementation strategies outlined in this guide, you can effectively integrate WavPack encoding into your software development projects and deliver high-quality audio processing capabilities to your users. ---END OF PAGE--- # Local File: .\dotnet\general\audio-encoders\wma.md --- title: Windows Media Audio Encoder Integration Guide description: Learn how to implement WMA audio encoding in .NET applications with cross-platform and Windows-specific approaches. 
Includes code examples, bitrate controls, and best practices for audio encoding implementation. sidebar_label: Windows Media Audio --- # Windows Media Audio Encoder [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) Windows Media Audio (WMA) is a popular audio codec developed by Microsoft for efficient audio compression. This documentation covers the WMA encoder implementations available in the VisioForge .Net SDKs. ## Overview The VisioForge SDK provides two distinct approaches for WMA encoding: the platform-specific [WMAOutput](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.Output.WMAOutput.html) for Windows environments and the cross-platform [WMAEncoderSettings](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.X.AudioEncoders.WMAEncoderSettings.html). Let's explore both implementations in detail to understand their capabilities and use cases. ## Cross-platform WMA output [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] The `WMAEncoderSettings` class provides a cross-platform solution for WMA encoding. This implementation is built on the SDK's cross-platform engine and offers consistent behavior across different operating systems. ### Key Features The encoder supports the following audio configurations: - Sample rates: 44.1 kHz and 48 kHz - Bitrates: 128, 192, 256, and 320 Kbps - Channel configurations: Mono (1) and Stereo (2) ### Rate Control The WMA encoder implements constant bitrate (CBR) encoding, allowing you to specify a fixed bitrate from the supported values.
This ensures consistent audio quality and predictable file sizes throughout the encoded content. ### Usage Example Add the WMA output to the Video Capture SDK core instance: ```csharp // Create a Video Capture SDK core instance var core = new VideoCaptureCoreX(); // Create a WMA output var wmaOutput = new WMAOutput("output.wma"); wmaOutput.Audio.SampleRate = 48000; wmaOutput.Audio.Channels = 2; wmaOutput.Audio.Bitrate = 320; // Add the WMA output core.Outputs_Add(wmaOutput, true); ``` Set the output format for the Video Edit SDK core instance: ```csharp // Create a Video Edit SDK core instance var core = new VideoEditCoreX(); // Create a WMA output var wmaOutput = new WMAOutput("output.wma"); wmaOutput.Audio.SampleRate = 48000; wmaOutput.Audio.Channels = 2; wmaOutput.Audio.Bitrate = 320; // Set the WMA output format core.Output_Format = wmaOutput; ``` Create a Media Blocks WMA output instance: ```csharp // Create a WMA encoder settings instance var wmaSettings = new WMAEncoderSettings(); // Create a WMA output instance var wmaOutput = new WMAEncoderBlock(wmaSettings); // Create an ASF output instance var asfOutput = new ASFSinkBlock(new ASFSinkSettings("output.wma")); // Connect the WMA encoder to the ASF output pipeline.Connect(wmaOutput.Output, asfOutput.Input); // pipeline is MediaBlocksPipeline ``` Check if WMA encoding is available: ```csharp if (!WMAEncoderSettings.IsAvailable()) { // Handle error } ``` ## Windows-only WMA output [!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="VideoEditCore"] The `WMAOutput` class provides a comprehensive Windows-specific implementation with advanced features and configuration options. This implementation leverages the Windows Media Format SDK for optimal performance on Windows systems.
### Key Features The Windows-specific implementation offers: - Multiple profile support (internal, external, and custom) - Language and localization settings - Quality-based encoding - Advanced bitrate control with peak bitrate settings - Buffer size configuration ### Rate Control The Windows implementation supports three stream modes through the WMVStreamMode enumeration: - CBR (Constant Bitrate) - VBR (Variable Bitrate) - Quality-based VBR ### Usage Example Here's how to set up the Windows-specific WMA encoder: Use an internal profile for simple configuration ```csharp var wmaOutput = new WMAOutput { // Use an internal profile for simple configuration Mode = WMVMode.InternalProfile, Internal_Profile_Name = "Windows Media Audio 9 High (192K)" }; core.Output_Format = wmaOutput; // Core is VideoCaptureCore or VideoEditCore ``` Or configure custom settings ```csharp var wmaOutput = new WMAOutput { Mode = WMVMode.CustomSettings, Custom_Audio_StreamPresent = true, Custom_Audio_Quality = 98, // High quality setting Custom_Audio_PeakBitrate = 320, // Maximum bitrate in Kbps Custom_Audio_PeakBufferSize = 3 // Buffer size for streaming }; core.Output_Format = wmaOutput; // Core is VideoCaptureCore or VideoEditCore ``` ### Profile Management The Windows implementation supports three profile modes: 1. Internal Profiles: - Pre-configured profiles for common use cases - Access through `Internal_Profile_Name` 2. External Profiles: - Load profiles from external files - Configure using `External_Profile_FileName` or `External_Profile_Text` 3. Custom Profiles: - Fine-grained control over encoding parameters - Configure through Custom_* properties ## Best Practices When implementing WMA encoding in your application: 1. For Windows applications requiring advanced features: - Use WMAOutput for access to Windows-specific optimizations - Consider saving configurations to JSON for reuse - Implement proper error handling for profile loading 2. 
For cross-platform applications: - Stick to WMAEncoderSettings for consistent behavior - Verify supported rates before setting configuration - Use the highest supported sample rate and bitrate for best quality This documentation provides a foundation for implementing WMA encoding in your applications. The choice between cross-platform and Windows-specific implementations should be based on your application's requirements for platform support, encoding features, and quality control. ---END OF PAGE--- # Local File: .\dotnet\general\code-samples\3rd-party-video-effects.md --- title: Integrating 3rd-party Video Processing Filters in .NET description: Learn how to implement and leverage third-party video processing filters in .NET applications. This practical guide provides code examples, best practices, and troubleshooting tips for developers working with DirectShow filters across multiple video SDK platforms. sidebar_label: 3rd-Party Video Effects --- # Integrating 3rd-party Video Processing Filters in .NET [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) ## Introduction Third-party video processing filters provide powerful capabilities for manipulating video streams in .NET applications. These filters can be seamlessly integrated into various SDK platforms including Video Capture SDK .Net, Media Player SDK .Net, and Video Edit SDK .Net to enhance your applications with advanced video processing features. This guide explores how to implement, configure, and optimize third-party DirectShow filters within your .NET projects, providing you with the knowledge needed to create sophisticated video processing applications. 
## Understanding DirectShow Filters DirectShow filters are COM-based components that process media data within the DirectShow framework. They can perform various operations including: - Video effects and transitions - Color correction and grading - Frame rate conversion - Resolution changes - Noise reduction - Special effects processing Before using third-party filters, it's important to understand how they operate within the DirectShow pipeline and how they interact with our SDK components. ## Prerequisites To successfully implement third-party video processing filters in your .NET applications, you'll need: 1. The appropriate SDK (.NET Video Capture, Media Player, or Video Edit) 2. Third-party DirectShow filters of your choice 3. Administrative access for filter registration 4. Basic understanding of DirectShow architecture ## Filter Registration Process DirectShow filters must be properly registered on the system before they can be used in your applications. This is typically done using the Windows registration utility: ```cmd regsvr32.exe path\to\your\filter.dll ``` Alternative COM registration methods can also be used, particularly in scenarios where: - You need to register filters during application installation - You're working in environments with limited user permissions - You require silent registration as part of a deployment process ### Registration Troubleshooting If filter registration fails, verify: 1. You have administrator privileges 2. The filter DLL is compatible with your system architecture (x86/x64) 3. All dependencies of the filter are available on the system 4. 
The filter is properly implemented as a COM object ## Implementation Guide ### Enumerating Available DirectShow Filters Before adding filters to your processing chain, you may want to discover what filters are available on the system: ```cs // List all available DirectShow filters foreach (var directShowFilter in VideoCapture1.DirectShow_Filters) { Console.WriteLine($"Filter Name: {directShowFilter.Name}"); Console.WriteLine($"Filter CLSID: {directShowFilter.CLSID}"); Console.WriteLine($"Filter Path: {directShowFilter.Path}"); Console.WriteLine("----------------------------"); } ``` This code snippet allows you to inspect all registered DirectShow filters, helping you identify the correct filters to use in your application. ### Managing the Filter Chain Before adding new filters, you may want to clear any existing filters from the processing chain: ```cs // Remove all currently applied filters VideoCapture1.Video_Filters_Clear(); ``` This ensures you're starting with a clean processing pipeline and prevents unexpected interactions between filters. ### Adding Filters to Your Application To add a third-party filter to your video processing pipeline: ```cs // Create and add a custom filter CustomProcessingFilter myFilter = new CustomProcessingFilter("My Effect Filter"); // Configure filter parameters if needed myFilter.SetParameter("intensity", 0.75); myFilter.SetParameter("hue", 120); // Add the filter to the processing chain VideoCapture1.Video_Filters_Add(myFilter); ``` You can add multiple filters in sequence to create complex processing chains. The order of filters matters, as each filter processes the output of the previous one. ## Advanced Filter Configuration ### Filter Parameters Most third-party filters expose configurable parameters. 
These can be adjusted using filter-specific methods or through the DirectShow interface: ```cs // Using the IPropertyBag interface for configuration var propertyBag = (IPropertyBag)myFilter.GetPropertyBag(); object value = 0.5f; propertyBag.Write("Saturation", ref value); ``` ### Filter Ordering The sequence of filters in your processing chain significantly impacts the final result: ```cs // Example of a multi-filter processing chain VideoCapture1.Video_Filters_Add(new CustomProcessingFilter("Noise Reduction")); VideoCapture1.Video_Filters_Add(new CustomProcessingFilter("Color Enhancement")); VideoCapture1.Video_Filters_Add(new CustomProcessingFilter("Sharpening")); ``` Experiment with different filter arrangements to achieve the desired effect. For example, applying noise reduction before sharpening usually produces better results than the reverse order. ## Performance Considerations Third-party filters can impact application performance. Consider these optimization strategies: 1. Only enable filters when necessary 2. Use lower complexity filters for real-time processing 3. Consider the resolution and frame rate when applying multiple filters 4. Test performance with your target hardware configurations 5. 
Use profile-guided optimization when available ## Common Issues and Solutions ### Thread Safety When working with filters in multi-threaded applications, ensure proper synchronization: ```cs private readonly object _filterLock = new object(); public void UpdateFilter(CustomProcessingFilter filter) { lock (_filterLock) { // Update filter parameters filter.UpdateParameters(); } } ``` ## Required Components To successfully deploy applications that use third-party video processing filters, ensure you include: - SDK redistributables for your chosen platform - Any dependencies required by the third-party filters - Proper installation and registration scripts for the filters ## Conclusion Third-party video processing filters offer powerful capabilities for enhancing your .NET video applications. By following the guidelines in this document, you can successfully integrate these filters into your projects, creating sophisticated video processing solutions. Remember to test thoroughly with your target environment configurations to ensure optimal performance and compatibility. --- For more code samples and implementation details, visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples). ---END OF PAGE--- # Local File: .\dotnet\general\code-samples\asf-wmv-files-indexing.md --- title: ASF/WMV File Indexing in .NET - Complete Guide description: Learn how to implement robust indexing for ASF, WMV, and WMA files in .NET applications. This comprehensive tutorial with code examples shows developers how to solve seeking issues and optimize media file performance. 
sidebar_label: ASF and WMV Files Indexing --- # Complete Guide to ASF and WMV File Indexing in .NET [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) When working with Windows Media files in your .NET applications, you'll likely encounter challenges with seeking functionality, especially with files lacking proper index structures. This guide explains how to implement efficient indexing for ASF, WMV, and WMA files to ensure smooth playback and navigation capabilities in your applications. ## Understanding the Indexing Problem ASF (Advanced Systems Format) is Microsoft's container format designed for streaming media. WMV (Windows Media Video) and WMA (Windows Media Audio) are built on this format. While these formats are widely used, many files lack proper indexing structures, which creates several problems: - Choppy or unpredictable seeking behavior - Inability to jump to specific timestamps - Inconsistent playback when navigating through the file - Performance issues during random access operations Proper indexing creates a map of the file's content, allowing your application to quickly locate and access specific points in the media stream. ## Benefits of Implementing Media File Indexing Adding indexing capabilities to your .NET application provides several advantages: 1. **Improved User Experience**: Allows users to navigate media files with precise seeking 2. **Enhanced Performance**: Reduces processing overhead when jumping to specific points in media 3. **Broader File Compatibility**: Handle a wider range of ASF, WMV, and WMA files regardless of their original indexing 4. 
**Professional Media Handling**: Implement media player features expected in professional applications ## Implementation with the ASFIndexer Class The `VisioForge.Core.DirectShow.ASFIndexer` class provides a straightforward way to add indexing capabilities to your application. This class handles the complexity of analyzing and mapping media files, creating the necessary index structures for smooth seeking operations. ### Setting Up the ASFIndexer Before diving into code, ensure you have the proper references to the SDK in your project. Once set up, you can create an instance of the ASFIndexer class and configure it with appropriate event handlers. ### Core Code Implementation Here's a complete C# example showing how to implement ASF/WMV file indexing: ```cs using System; using System.Diagnostics; using System.Windows.Forms; using VisioForge.Core.DirectShow; namespace MediaIndexingExample { public class ASFIndexingManager { private ASFIndexer _indexer; public ASFIndexingManager() { // Initialize the indexer _indexer = new ASFIndexer(); // Set up event handlers _indexer.OnStop += Indexer_OnStop; _indexer.OnError += Indexer_OnError; _indexer.OnProgress += Indexer_OnProgress; } public void StartIndexing(string filePath) { try { // Begin the indexing process with optimized settings _indexer.Start( filePath, // Path to the media file WMIndexerType.FrameNumbers, // Index by frame numbers 4000, // Index density (higher = more precise seeking) WMIndexType.NearestDataUnit // Seek to nearest data unit for accuracy ); Debug.WriteLine($"Started indexing process for {filePath}"); } catch (Exception ex) { Debug.WriteLine($"Failed to start indexing: {ex.Message}"); throw; } } private void Indexer_OnStop(object sender, EventArgs e) { // Indexing has completed successfully MessageBox.Show("Indexing process has completed successfully."); // Additional post-indexing operations can be added here // Such as updating UI, releasing resources, or processing the indexed file } private void 
Indexer_OnError(object sender, ErrorsEventArgs e) { // Handle any errors that occurred during indexing MessageBox.Show($"An error occurred during the indexing process: {e.Message}"); // Log the error for troubleshooting Debug.WriteLine($"Indexing error: {e.Message}"); // Implement additional error recovery if needed } private void Indexer_OnProgress(object sender, ProgressEventArgs e) { // Update progress information Debug.WriteLine($"Indexing progress: {e.Progress}%"); // You can update a progress bar or other UI element here // UpdateProgressBar(e.Progress); } } } ``` ## Advanced Configuration Options The ASFIndexer provides several configuration options to customize the indexing process according to your specific requirements: ### Indexer Types The `WMIndexerType` enum offers two primary indexing approaches: - **FrameNumbers**: Indexes based on video frame numbers, ideal for precise video seeking - **TimeOffsets**: Indexes based on time positions, which can be more appropriate for audio files ### Index Density Settings The density parameter (set to 4000 in our example) controls the granularity of the index. Higher values create more detailed indexes for more precise seeking, but require more processing time and increase the resulting file size. ### Index Type Options The `WMIndexType` enum provides options for how seeking should be performed: - **NearestDataUnit**: Seeks to the nearest data unit, providing the most accurate seeking - **NearestCleanPoint**: Seeks to the nearest clean point, which may be faster but less precise - **Nearest**: Seeks to the nearest indexed point with standard precision ## Error Handling and Progress Monitoring Proper error handling and progress monitoring are essential for a robust indexing implementation. The ASFIndexer provides three key events: 1. **OnStop**: Triggered when indexing completes successfully 2. **OnError**: Triggered when an error occurs during indexing 3. 
**OnProgress**: Provides regular updates on indexing progress These events allow you to create a responsive UI that keeps users informed about the indexing process. ## Best Practices for ASF/WMV Indexing To ensure optimal performance and reliability: 1. **Pre-screen Files**: Check if files already have proper indexes before starting the indexing process 2. **Background Processing**: Perform indexing operations in a background thread to avoid UI freezing 3. **User Feedback**: Provide clear progress indicators during long indexing operations 4. **Caching**: Consider caching index information for frequently accessed files 5. **Error Recovery**: Implement graceful error handling for corrupted or unindexable files ## System Requirements and Dependencies To implement ASF/WMV indexing in your .NET application, ensure you have: - .NET Framework 4.5 or higher (compatible with .NET Core and .NET 5+) - Required redistributable components from the SDK - Sufficient system permissions to access and modify media files ## Conclusion Proper indexing of ASF, WMV, and WMA files significantly enhances the media handling capabilities of your .NET applications. By implementing the techniques outlined in this guide, you can provide users with smooth, professional-grade media navigation experiences. Remember that indexing is a processor-intensive operation that should ideally be performed only once per file, with the results cached or saved for future use. This approach ensures optimal performance while still providing all the benefits of properly indexed media files. --- For more code samples and advanced media processing techniques, check out our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples). ---END OF PAGE--- # Local File: .\dotnet\general\code-samples\custom-filter-interface.md --- title: Working with Custom DirectShow Filter Interfaces in .NET description: Learn how to implement and use custom DirectShow filter interfaces in .NET applications. 
This guide provides step-by-step examples for accessing and manipulating DirectShow components through the IBaseFilter interface in your multimedia applications. sidebar_label: Custom Filter Interface Usage --- # Working with Custom DirectShow Filter Interfaces in .NET [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) *Note: The API shown in this guide is the same across all our SDK products, including Video Capture SDK .Net, Video Edit SDK .Net, and Media Player SDK .Net.* DirectShow is a powerful multimedia framework that allows developers to perform complex operations on media streams. One of its key strengths is the ability to work with custom filter interfaces, giving you precise control over media processing. This guide will walk you through implementing and utilizing custom DirectShow filter interfaces in your .NET applications. ## Understanding DirectShow Filters DirectShow uses a filter-based architecture where each filter performs a specific operation on the media stream. These filters are connected in a graph, creating a pipeline for media processing. ### Key DirectShow Components - **Filter**: A component that processes media data - **Pin**: Connection points between filters - **Filter Graph**: The complete pipeline of connected filters - **IBaseFilter**: The fundamental interface that all DirectShow filters implement ## Getting Started with Custom Filter Interfaces To work with DirectShow filters in .NET, you'll need to: 1. Add the proper references to your project 2. Access the filter through appropriate events 3. Cast the filter to the interface you need 4. 
Implement your custom logic

### Required Project References

To access DirectShow functionality, include the appropriate SDK package in your project. For example (the package name below is illustrative; use the package that matches your SDK product):

```xml
<PackageReference Include="VisioForge.DotNet.Core" Version="*" />
```

You can also add the `VisioForge.Core` assembly reference directly to your project.

## Implementing Custom Filter Interface Access

Our SDK provides several events that give you access to filters as they're added to the filter graph. Here's how to use them effectively:

### Accessing Filters in Video Capture SDK

The Video Capture SDK offers the `OnFilterAdded` event, which fires whenever a filter is added to the graph. This event provides access to each filter through its event arguments.

```cs
// Subscribe to the OnFilterAdded event
videoCaptureCore.OnFilterAdded += VideoCaptureCore_OnFilterAdded;

// Event handler implementation
private void VideoCaptureCore_OnFilterAdded(object sender, FilterAddedEventArgs eventArgs)
{
    // Access the DirectShow filter interface
    IBaseFilter baseFilter = eventArgs.Filter as IBaseFilter;

    // Now you can work with the filter through the IBaseFilter interface
    if (baseFilter != null)
    {
        // Custom filter manipulation code goes here
    }
}
```

## Working with IBaseFilter Interface

The `IBaseFilter` interface is the foundation of DirectShow filters.
Here's what you can do with it: ### Retrieving Filter Information ```cs private void GetFilterInfo(IBaseFilter filter) { FilterInfo filterInfo = new FilterInfo(); int hr = filter.QueryFilterInfo(out filterInfo); if (hr >= 0) { Console.WriteLine($"Filter Name: {filterInfo.achName}"); // Don't forget to release the reference to the filter graph if (filterInfo.pGraph != null) { Marshal.ReleaseComObject(filterInfo.pGraph); } } } ``` ### Enumerating Filter Pins ```cs private void EnumerateFilterPins(IBaseFilter filter) { IEnumPins enumPins; int hr = filter.EnumPins(out enumPins); if (hr >= 0 && enumPins != null) { IPin[] pins = new IPin[1]; int fetched; while (enumPins.Next(1, pins, out fetched) == 0 && fetched > 0) { PinInfo pinInfo = new PinInfo(); pins[0].QueryPinInfo(out pinInfo); Console.WriteLine($"Pin Name: {pinInfo.name}, Direction: {pinInfo.dir}"); // Release pin and info if (pinInfo.filter != null) Marshal.ReleaseComObject(pinInfo.filter); Marshal.ReleaseComObject(pins[0]); } Marshal.ReleaseComObject(enumPins); } } ``` ## Identifying the Right Filter When working with the `OnFilterAdded` event, remember that it can be called multiple times as various filters are added to the graph. 
To work with a specific filter, you'll need to identify it correctly: ```cs private void VideoCaptureCore_OnFilterAdded(object sender, FilterAddedEventArgs eventArgs) { IBaseFilter baseFilter = eventArgs.Filter as IBaseFilter; if (baseFilter != null) { FilterInfo filterInfo = new FilterInfo(); baseFilter.QueryFilterInfo(out filterInfo); // Check if this is the filter we're looking for if (filterInfo.achName == "Video Capture Device") { // This is our target filter, perform specific operations ConfigureVideoCaptureFilter(baseFilter); } // Release the filter graph reference if (filterInfo.pGraph != null) { Marshal.ReleaseComObject(filterInfo.pGraph); } } } ``` ## Advanced Filter Configuration Once you have access to the filter interface, you can perform advanced configurations: ### Setting Filter Properties ```cs private void SetFilterProperty(IBaseFilter filter, Guid propertySet, int propertyId, object propertyValue) { IKsPropertySet propertySetInterface = filter as IKsPropertySet; if (propertySetInterface != null) { // Convert property value to byte array byte[] propertyData = ConvertToByteArray(propertyValue); // Set the property int hr = propertySetInterface.Set( propertySet, propertyId, IntPtr.Zero, 0, propertyData, propertyData.Length ); Marshal.ReleaseComObject(propertySetInterface); } } ``` ### Retrieving Filter Properties ```cs private object GetFilterProperty(IBaseFilter filter, Guid propertySet, int propertyId, Type propertyType) { IKsPropertySet propertySetInterface = filter as IKsPropertySet; object result = null; if (propertySetInterface != null) { int dataSize = Marshal.SizeOf(propertyType); byte[] propertyData = new byte[dataSize]; int returnedDataSize; // Get the property int hr = propertySetInterface.Get( propertySet, propertyId, IntPtr.Zero, 0, propertyData, propertyData.Length, out returnedDataSize ); if (hr >= 0) { result = ConvertFromByteArray(propertyData, propertyType); } Marshal.ReleaseComObject(propertySetInterface); } return result; } ``` 
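The `Get`/`Set` examples above rely on `ConvertToByteArray` and `ConvertFromByteArray`, which are placeholders used in this guide rather than SDK APIs. A minimal sketch of such helpers, assuming the property values are blittable structures:

```csharp
using System;
using System.Runtime.InteropServices;

public static class PropertyMarshalHelpers
{
    // Serialize a blittable struct to a byte array for IKsPropertySet.Set
    public static byte[] ConvertToByteArray(object value)
    {
        int size = Marshal.SizeOf(value);
        byte[] data = new byte[size];
        IntPtr buffer = Marshal.AllocHGlobal(size);
        try
        {
            Marshal.StructureToPtr(value, buffer, false);
            Marshal.Copy(buffer, data, 0, size);
        }
        finally
        {
            Marshal.FreeHGlobal(buffer);
        }

        return data;
    }

    // Rebuild a struct from the bytes returned by IKsPropertySet.Get
    public static object ConvertFromByteArray(byte[] data, Type type)
    {
        IntPtr buffer = Marshal.AllocHGlobal(data.Length);
        try
        {
            Marshal.Copy(data, 0, buffer, data.Length);
            return Marshal.PtrToStructure(buffer, type);
        }
        finally
        {
            Marshal.FreeHGlobal(buffer);
        }
    }
}
```

Both helpers assume the property type contains no reference fields; for anything more complex, define an explicit marshaling layout for the structure.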
## Common Use Cases for Custom Filter Interfaces

### Video Processing Filters

When working with video, you might need to access specific properties of camera devices:

```cs
private void ConfigureVideoCaptureFilter(IBaseFilter captureFilter)
{
    // Access and set camera properties
    IAMCameraControl cameraControl = captureFilter as IAMCameraControl;
    if (cameraControl != null)
    {
        // Set exposure
        cameraControl.Set(CameraControlProperty.Exposure, 0, CameraControlFlags.Manual);

        // Set focus
        cameraControl.Set(CameraControlProperty.Focus, 0, CameraControlFlags.Manual);

        Marshal.ReleaseComObject(cameraControl);
    }
}
```

### Audio Processing Filters

For audio processing, you might want to adjust volume or audio quality settings:

```cs
private void ConfigureAudioFilter(IBaseFilter audioFilter)
{
    // Access volume interface
    IBasicAudio basicAudio = audioFilter as IBasicAudio;
    if (basicAudio != null)
    {
        // Volume is expressed in hundredths of a decibel:
        // 0 is full volume, -10000 is silence
        basicAudio.put_Volume(-2000); // -20 dB attenuation
        Marshal.ReleaseComObject(basicAudio);
    }
}
```

## Handling Resources Properly

When working with DirectShow interfaces, it's crucial to properly release COM objects to prevent memory leaks:

```cs
private void ReleaseComObject(object comObject)
{
    if (comObject != null)
    {
        Marshal.ReleaseComObject(comObject);
    }
}
```

## Complete Example

Here's a more complete example that demonstrates finding and configuring a video capture filter:

```cs
using System;
using System.Runtime.InteropServices;
using VisioForge.Core.DirectShow;

public class CustomFilterExample
{
    private VideoCaptureCore captureCore;

    public void Initialize()
    {
        captureCore = new VideoCaptureCore();
        captureCore.OnFilterAdded += CaptureCore_OnFilterAdded;

        // Configure source
        // ...
// Start capture captureCore.Start(); } private void CaptureCore_OnFilterAdded(object sender, FilterAddedEventArgs eventArgs) { IBaseFilter baseFilter = eventArgs.Filter as IBaseFilter; if (baseFilter != null) { // Get filter information FilterInfo filterInfo = new FilterInfo(); baseFilter.QueryFilterInfo(out filterInfo); Console.WriteLine($"Filter added: {filterInfo.achName}"); // Check if this is the video capture filter if (filterInfo.achName.Contains("Video Capture")) { ConfigureVideoCaptureFilter(baseFilter); } // Release filter graph reference if (filterInfo.pGraph != null) { Marshal.ReleaseComObject(filterInfo.pGraph); } } } private void ConfigureVideoCaptureFilter(IBaseFilter captureFilter) { // Your filter configuration code here } public void Cleanup() { if (captureCore != null) { captureCore.Stop(); captureCore.OnFilterAdded -= CaptureCore_OnFilterAdded; captureCore.Dispose(); captureCore = null; } } } ``` ## Required System Components To use DirectShow functionality in your application, ensure your end-users have the following components installed: - DirectX Runtime (included with Windows) - SDK redistributable components ## Conclusion Working with custom DirectShow filter interfaces gives you powerful capabilities for media processing in your .NET applications. By following the patterns described in this guide, you can access and manipulate the underlying DirectShow components to achieve precise control over your multimedia applications. For additional assistance with implementing these techniques, please contact our support team. Visit our GitHub repository for more code samples and implementation examples. ---END OF PAGE--- # Local File: .\dotnet\general\code-samples\custom-video-effects.md --- title: Creating Custom Video Effects in C# Applications description: Learn how to implement custom video effects in C# applications using OnVideoFrameBitmap and OnVideoFrameBuffer events. 
Discover practical code examples for real-time video processing including text overlays, grayscale conversion, brightness adjustments, and timestamp watermarks. sidebar_label: Custom Video Effects with Frame Events --- # Creating Custom Real-time Video Effects in C# Applications [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) ## Introduction to Video Frame Processing When developing video applications, you often need to apply custom effects or overlays to video streams in real-time. The .NET SDK provides two powerful events for this purpose: `OnVideoFrameBitmap` and `OnVideoFrameBuffer`. These events give you direct access to each video frame, allowing you to modify pixels before they're rendered or encoded. ## Implementation Methods There are two primary approaches to implementing custom video effects: 1. **Using OnVideoFrameBitmap**: Process frames as Bitmap objects with GDI+ - easier to use but with moderate performance 2. **Using OnVideoFrameBuffer**: Manipulate raw RGB24 image buffer directly - offers better performance but requires more low-level code ## Code Examples for Custom Video Effects ### Text Overlay Implementation Adding text overlays to video is useful for watermarking, displaying information, or creating captions. 
This example demonstrates how to add simple text to your video frames:

```cs
private void VideoCapture1_OnVideoFrameBitmap(object sender, VideoFrameBitmapEventArgs e)
{
    Graphics grf = Graphics.FromImage(e.Frame);
    grf.DrawString("Hello!", new Font(FontFamily.GenericSansSerif, 20),
        new SolidBrush(Color.White), 20, 20);
    grf.Dispose();

    e.UpdateData = true;
}
```

### Grayscale Effect Implementation

Converting video to grayscale is a fundamental image processing technique. This example shows how to access and modify individual pixel values. Note that GDI+ stores 24-bit pixel data in B, G, R order in memory, so the luminance weights (0.3R + 0.59G + 0.11B) must be applied accordingly:

```cs
private void VideoCapture1_OnVideoFrameBitmap(object sender, VideoFrameBitmapEventArgs e)
{
    Bitmap bmp = e.Frame;
    Rectangle rect = new Rectangle(0, 0, bmp.Width, bmp.Height);
    System.Drawing.Imaging.BitmapData bmpData = bmp.LockBits(rect,
        System.Drawing.Imaging.ImageLockMode.ReadWrite, bmp.PixelFormat);

    IntPtr ptr = bmpData.Scan0;
    int bytes = Math.Abs(bmpData.Stride) * bmp.Height;
    byte[] rgbValues = new byte[bytes];
    System.Runtime.InteropServices.Marshal.Copy(ptr, rgbValues, 0, bytes);

    // Apply the standard luminance formula (0.3R + 0.59G + 0.11B).
    // Bytes are stored B, G, R, so blue gets the 0.11 weight and red gets 0.3.
    // This simple loop assumes the stride contains no row padding.
    for (int i = 0; i < rgbValues.Length; i += 3)
    {
        int gray = (int)(rgbValues[i] * 0.11 + rgbValues[i + 1] * 0.59 + rgbValues[i + 2] * 0.3);
        rgbValues[i] = (byte)gray;
        rgbValues[i + 1] = (byte)gray;
        rgbValues[i + 2] = (byte)gray;
    }

    System.Runtime.InteropServices.Marshal.Copy(rgbValues, 0, ptr, bytes);
    bmp.UnlockBits(bmpData);

    e.UpdateData = true;
}
```

### Brightness Adjustment Implementation

This example demonstrates how to adjust the brightness of video frames, a common requirement in video processing applications:

```cs
private void VideoCapture1_OnVideoFrameBitmap(object sender, VideoFrameBitmapEventArgs e)
{
    float brightness = 1.2f; // Values > 1 increase brightness, < 1 decrease it
    Bitmap bmp = e.Frame;
    Rectangle rect = new Rectangle(0, 0, bmp.Width, bmp.Height);
    System.Drawing.Imaging.BitmapData bmpData = bmp.LockBits(rect,
System.Drawing.Imaging.ImageLockMode.ReadWrite, bmp.PixelFormat); IntPtr ptr = bmpData.Scan0; int bytes = Math.Abs(bmpData.Stride) * bmp.Height; byte[] rgbValues = new byte[bytes]; System.Runtime.InteropServices.Marshal.Copy(ptr, rgbValues, 0, bytes); // Apply brightness adjustment to each color channel for (int i = 0; i < rgbValues.Length; i++) { int newValue = (int)(rgbValues[i] * brightness); rgbValues[i] = (byte)Math.Min(255, Math.Max(0, newValue)); } System.Runtime.InteropServices.Marshal.Copy(rgbValues, 0, ptr, bytes); bmp.UnlockBits(bmpData); e.UpdateData = true; } ``` ### Timestamp Overlay Implementation Adding timestamps to video frames is essential for surveillance and logging applications. This example shows how to create a professional-looking timestamp with a semi-transparent background: ```cs private void VideoCapture1_OnVideoFrameBitmap(object sender, VideoFrameBitmapEventArgs e) { Graphics grf = Graphics.FromImage(e.Frame); // Create a semi-transparent background for better readability Rectangle textBackground = new Rectangle(10, e.Frame.Height - 50, 250, 40); grf.FillRectangle(new SolidBrush(Color.FromArgb(128, 0, 0, 0)), textBackground); // Display current date and time string dateTime = DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss"); grf.DrawString(dateTime, new Font(FontFamily.GenericSansSerif, 16), new SolidBrush(Color.White), 15, e.Frame.Height - 45); grf.Dispose(); e.UpdateData = true; } ``` ## Performance Optimization Tips ### Working with Raw Buffer Data For high-performance applications, processing raw buffer data offers significant speed advantages: ```cs // OnVideoFrameBuffer event example (pseudo-code) private void VideoCapture1_OnVideoFrameBuffer(object sender, VideoFrameBufferEventArgs e) { // e.Buffer contains raw RGB24 data // Each pixel uses 3 bytes in RGB order // Process directly for maximum performance } ``` ### Best Practices for Frame Processing * **Memory Management**: Always dispose Graphics objects and unlock bitmapped data * 
**Performance Considerations**: For real-time processing, keep operations lightweight * **Buffer Processing**: We strongly recommend processing RAW data in the OnVideoFrameBuffer event for optimal performance * **External Libraries**: Consider using Intel IPP or other optimized image-processing libraries for complex operations --- ## Additional Resources Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to access more code samples and complete project examples. ---END OF PAGE--- # Local File: .\dotnet\general\code-samples\draw-multitext-onvideoframebuffer.md --- title: Implementing Dynamic Text Overlays on Video Frames description: Learn how to create, position, and update multiple text overlays on video frames using the OnVideoFrameBuffer event in .NET. This detailed guide with code examples shows you how to customize text properties, handle dynamic updates, and optimize performance. sidebar_label: Draw Multiple Text Overlays Using OnVideoFrameBuffer Event --- # Implementing Dynamic Text Overlays on Video Frames in .NET [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) ## Introduction Adding text overlays to video content has become essential for various applications, from adding watermarks and timestamps to creating informative annotations and captions. While many SDKs offer built-in text overlay capabilities, these functions might not always provide the level of customization or flexibility required for advanced projects. This guide demonstrates how to implement custom text overlays using the `OnVideoFrameBuffer` event. 
This approach gives you full control over the text appearance, position, and behavior, allowing for more sophisticated overlay implementations than what's possible with standard API methods. ## Why Use Custom Text Overlays? Standard text overlay APIs often have limitations in areas such as: - Number of concurrent text elements - Font customization options - Dynamic text updates - Animation capabilities - Precise positioning control - Alpha channel management By leveraging the `OnVideoFrameBuffer` event and working directly with bitmap data, you can overcome these limitations and implement exactly what your application needs. ## Understanding the Approach The technique demonstrated in this article involves: 1. Creating a transparent bitmap with the same dimensions as the video frame 2. Drawing text elements onto this bitmap using GDI+ (System.Drawing) 3. Converting the bitmap to a memory buffer 4. Overlaying this buffer onto the video frame data 5. Optionally updating text elements dynamically This provides a powerful method for text overlay creation while maintaining good performance. 
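Conceptually, the overlay in step 4 is a per-pixel alpha blend: each ARGB pixel of the text bitmap is composited over the corresponding RGB24 frame pixel, so fully transparent pixels leave the frame untouched. A simplified managed sketch of that operation (for illustration only; the SDK performs this on unmanaged buffers with optimized code):

```csharp
// Conceptual alpha blend of a 32-bpp ARGB overlay onto a 24-bpp frame.
// GDI+ stores Format32bppArgb as B, G, R, A and Format24bppRgb as B, G, R.
// Assumes equal dimensions and no row padding in either buffer.
static void BlendArgbOverBgr(byte[] overlayArgb, byte[] frameBgr, int width, int height)
{
    for (int p = 0; p < width * height; p++)
    {
        int s = p * 4; // source pixel offset (B, G, R, A)
        int d = p * 3; // destination pixel offset (B, G, R)
        int a = overlayArgb[s + 3];

        for (int c = 0; c < 3; c++)
        {
            // result = overlay * alpha + frame * (1 - alpha), in 0..255 fixed point
            frameBgr[d + c] = (byte)((overlayArgb[s + c] * a + frameBgr[d + c] * (255 - a)) / 255);
        }
    }
}
```

This is why the overlay bitmap is created with `PixelFormat.Format32bppArgb`: without the alpha channel, the blend would overwrite the whole frame instead of only the drawn text.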
## Basic Implementation The following code sample shows a straightforward implementation for drawing multiple text overlays on video frames: ```cs // Image private Bitmap logoImage = null; // Image RGB32 buffer private IntPtr logoImageBuffer = IntPtr.Zero; private int logoImageBufferSize = 0; private string text1 = "Hello World"; private string text2 = "Hey-hey"; private string text3 = "Ocean of pancakes"; private void SDK_OnVideoFrameBuffer(Object sender, VideoFrameBufferEventArgs e) { // draw text to image if (logoImage == null) { logoImage = new Bitmap(e.Frame.Width, e.Frame.Height, PixelFormat.Format32bppArgb); using (var grf = Graphics.FromImage(logoImage)) { // antialiasing mode grf.TextRenderingHint = TextRenderingHint.AntiAlias; // drawing mode grf.InterpolationMode = InterpolationMode.HighQualityBicubic; // smoothing mode grf.SmoothingMode = SmoothingMode.HighQuality; // text 1 var brush1 = new SolidBrush(Color.Blue); var font1 = new Font("Arial", 30, FontStyle.Regular); grf.DrawString(text1, font1, brush1, 100, 100); // text 2 var brush2 = new SolidBrush(Color.Red); var font2 = new Font("Times New Roman", 35, FontStyle.Strikeout); grf.DrawString(text2, font2, brush2, e.Frame.Width / 2, e.Frame.Height / 2); // text 3 var brush3 = new SolidBrush(Color.Green); var font3 = new Font("Verdana", 40, FontStyle.Italic); grf.DrawString(text3, font3, brush3, 200, 200); } } // create image buffer if not allocated or have zero size if (logoImageBuffer == IntPtr.Zero || logoImageBufferSize == 0) { if (logoImageBuffer == IntPtr.Zero) { logoImageBufferSize = ImageHelper.GetStrideRGB32(logoImage.Width) * logoImage.Height; logoImageBuffer = Marshal.AllocCoTaskMem(logoImageBufferSize); } else { logoImageBufferSize = ImageHelper.GetStrideRGB32(logoImage.Width) * logoImage.Height; Marshal.FreeCoTaskMem(logoImageBuffer); logoImageBuffer = Marshal.AllocCoTaskMem(logoImageBufferSize); } ImageHelper.BitmapToIntPtr(logoImage, logoImageBuffer, logoImage.Width, logoImage.Height, 
PixelFormat.Format32bppArgb); } // Draw image FastImageProcessing.Draw_RGB32OnRGB24(logoImageBuffer, logoImage.Width, logoImage.Height, e.Frame.Data, e.Frame.Width, e.Frame.Height, 0, 0); e.UpdateData = true; } ``` ### Key Components Explained 1. **Bitmap Creation**: We create a 32-bit bitmap (with alpha channel) matching the video frame dimensions 2. **Graphics Settings**: We configure anti-aliasing, interpolation, and smoothing for high-quality text rendering 3. **Text Configuration**: Each text element gets its own font, color, and position 4. **Memory Management**: We allocate unmanaged memory for the bitmap buffer 5. **Bitmap to Buffer Conversion**: We convert the bitmap to a memory buffer using `ImageHelper.BitmapToIntPtr` 6. **Buffer Overlay**: We draw the RGBA buffer onto the video frame using `FastImageProcessing.Draw_RGB32OnRGB24` 7. **Frame Update Flag**: We set `e.UpdateData = true` to inform the SDK that the frame data has been modified ## Advanced Implementation with Dynamic Updates For more interactive applications, you might need to update text overlays dynamically. 
The following implementation supports on-the-fly updates of text content, fonts, and colors: ```cs // Image Bitmap logoImage = null; // Image RGB32 buffer IntPtr logoImageBuffer = IntPtr.Zero; int logoImageBufferSize = 0; // text settings string text1 = "Hello World"; Font font1 = new Font("Arial", 30, FontStyle.Regular); SolidBrush brush1 = new SolidBrush(Color.Blue); string text2 = "Hey-hey"; Font font2 = new Font("Times New Roman", 35, FontStyle.Strikeout); SolidBrush brush2 = new SolidBrush(Color.Red); string text3 = "Ocean of pancakes"; Font font3 = new Font("Verdana", 40, FontStyle.Italic); SolidBrush brush3 = new SolidBrush(Color.Green); // update flag bool textUpdate = false; object textLock = new object(); // Update text overlay, index is [1..3] void UpdateText(int index, string text, Font font, SolidBrush brush) { lock (textLock) { textUpdate = true; } switch (index) { case 1: text1 = text; font1 = font; brush1 = brush; break; case 2: text2 = text; font2 = font; brush2 = brush; break; case 3: text3 = text; font3 = font; brush3 = brush; break; default: return; } } private void SDK_OnVideoFrameBuffer(Object sender, VideoFrameBufferEventArgs e) { lock (textLock) { if (textUpdate) { logoImage.Dispose(); logoImage = null; } // draw text to image if (logoImage == null) { logoImage = new Bitmap(e.Frame.Width, e.Frame.Height, PixelFormat.Format32bppArgb); using (var grf = Graphics.FromImage(logoImage)) { // antialiasing mode grf.TextRenderingHint = TextRenderingHint.AntiAlias; // drawing mode grf.InterpolationMode = InterpolationMode.HighQualityBicubic; // smoothing mode grf.SmoothingMode = SmoothingMode.HighQuality; // text 1 grf.DrawString(text1, font1, brush1, 100, 100); // text 2 grf.DrawString(text2, font2, brush2, e.Frame.Width / 2, e.Frame.Height / 2); // text 3 grf.DrawString(text3, font3, brush3, 200, 200); } } // create image buffer if not allocated or have zero size if (logoImageBuffer == IntPtr.Zero || logoImageBufferSize == 0) { if (logoImageBuffer 
== IntPtr.Zero) { logoImageBufferSize = ImageHelper.GetStrideRGB32(e.Frame.Width) * e.Frame.Height; logoImageBuffer = Marshal.AllocCoTaskMem(logoImageBufferSize); } else { logoImageBufferSize = ImageHelper.GetStrideRGB32(e.Frame.Width) * e.Frame.Height; Marshal.FreeCoTaskMem(logoImageBuffer); logoImageBuffer = Marshal.AllocCoTaskMem(logoImageBufferSize); } ImageHelper.BitmapToIntPtr(logoImage, logoImageBuffer, logoImage.Width, logoImage.Height, PixelFormat.Format32bppArgb); } if (textUpdate) { textUpdate = false; ImageHelper.BitmapToIntPtr(logoImage, logoImageBuffer, logoImage.Width, logoImage.Height, PixelFormat.Format32bppArgb); } // Draw image FastImageProcessing.Draw_RGB32OnRGB24(logoImageBuffer, logoImage.Width, logoImage.Height, e.Frame.Data, e.Frame.Width, e.Frame.Height, 0, 0); e.UpdateData = true; } } private void btUpdateText1_Click(object sender, EventArgs e) { UpdateText(1, "Hello world", new Font("Arial", 48, FontStyle.Underline), new SolidBrush(Color.Aquamarine)); } ``` ### New Features in the Advanced Implementation 1. **Thread Safety**: We use a lock object to prevent concurrent access to shared resources 2. **Update Mechanism**: The `UpdateText` method provides a clean interface for changing text properties 3. **Text Property Storage**: Each text element has its own variables for content, font, and color 4. **Change Detection**: We use a flag (`textUpdate`) to indicate when text properties have changed 5. **Resource Management**: We dispose of the old bitmap when text properties change 6. **Buffer Update**: We update the memory buffer when text properties change 7. **UI Integration**: A sample button click handler demonstrates how to trigger text updates ## Performance Optimization Tips When implementing text overlays with this method, consider these performance optimizations: 1. **Minimize Bitmap Recreations**: Only recreate the bitmap when necessary (text changes, resolution changes) 2. 
**Cache Font Objects**: Font creation is expensive; create fonts once and reuse them 3. **Use Memory Efficiently**: Free unmanaged memory when it's no longer needed 4. **Optimize Drawing Operations**: Use hardware acceleration when available 5. **Consider Update Frequency**: For frequent updates, consider double-buffering techniques 6. **Profile Your Code**: Use performance profiling tools to identify bottlenecks ## Advanced Features to Consider This basic implementation can be extended with additional features: 1. **Text Animation**: Implement text movement, fading, or other animations 2. **Text Formatting**: Add support for rich text formatting (bold, italic, etc.) 3. **Text Effects**: Implement shadows, outlines, or glow effects 4. **Text Alignment**: Add support for different text alignment options 5. **Multi-Line Text**: Implement proper handling of multi-line text with wrapping 6. **Localization**: Add support for different languages and text directions 7. **Performance Monitoring**: Add diagnostics to monitor rendering performance ## Memory Management Considerations When working with unmanaged memory, it's crucial to handle resource cleanup properly: 1. Implement the `IDisposable` pattern in your class 2. Free unmanaged memory in the `Dispose` method 3. Consider using `SafeHandle` or similar constructs for safer resource management 4. Set buffer pointers to `IntPtr.Zero` after freeing them 5. 
Use structured exception handling around memory operations ## Cleanup Example ```cs protected override void Dispose(bool disposing) { if (disposing) { // Dispose managed resources if (logoImage != null) { logoImage.Dispose(); logoImage = null; } } // Free unmanaged resources if (logoImageBuffer != IntPtr.Zero) { Marshal.FreeCoTaskMem(logoImageBuffer); logoImageBuffer = IntPtr.Zero; logoImageBufferSize = 0; } base.Dispose(disposing); } ``` ## Required Dependencies - SDK redistributable components ## Conclusion Implementing custom text overlays using the `OnVideoFrameBuffer` event provides a powerful and flexible solution for applications that require advanced text display capabilities. While it requires more code than using built-in API methods, the additional flexibility and control make it worthwhile for sophisticated video applications. By following the patterns demonstrated in this guide, you can create dynamic, high-quality text overlays that can be updated in real-time, providing a rich user experience in your video applications. --- Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples. ---END OF PAGE--- # Local File: .\dotnet\general\code-samples\draw-video-picturebox.md --- title: Drawing Video on PictureBox in .NET Applications description: Learn step-by-step implementation of video rendering on PictureBox controls in WinForms applications. This tutorial covers frame handling, memory management, efficient rendering techniques, and best practices for smooth video display in desktop applications. 
sidebar_label: Drawing Video on PictureBox --- # Drawing Video on PictureBox in .NET Applications [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) ## Introduction to Video Rendering in WinForms Displaying video content in desktop applications is a common requirement for many software developers working with multimedia. Whether you're building applications for video surveillance, media players, video editing tools, or any software that processes video streams, understanding how to effectively render video is crucial. The PictureBox control is one of the most straightforward ways to display video frames in Windows Forms applications. While it wasn't specifically designed for video playback, with proper implementation, it can provide smooth video rendering with minimal resource consumption. This guide focuses on implementing video rendering on PictureBox controls in .NET WinForms applications. We'll cover the entire process from setup to implementation, addressing common pitfalls and optimization techniques. ## Why Use PictureBox for Video Display? Before diving into implementation details, let's examine the advantages of using PictureBox for video display: - **Simplicity**: PictureBox is a straightforward control that most .NET developers are already familiar with. - **Flexibility**: It allows customization of how images are displayed through its SizeMode property. - **Integration**: It integrates seamlessly with other WinForms controls. - **Low overhead**: For many applications, it provides sufficient performance without requiring more complex DirectX or OpenGL implementations. 
However, it's important to note that PictureBox wasn't designed specifically for high-performance video playback. For applications requiring professional-grade video performance or hardware acceleration, more specialized rendering approaches might be necessary. ## Prerequisites To implement video rendering on a PictureBox, you'll need: - Basic knowledge of C# and .NET WinForms development - Visual Studio or another IDE for .NET development - A video source (from Video Capture SDK, Video Edit SDK, or Media Player SDK) - Understanding of event-driven programming ## Setting Up Your Environment ### Configuring the PictureBox Control 1. Add a PictureBox control to your form through the designer or programmatically. 2. Configure the basic properties for optimal video display: ```cs // Configure PictureBox for video display pictureBox1.BackColor = Color.Black; pictureBox1.SizeMode = PictureBoxSizeMode.StretchImage; ``` The `BackColor` property set to `Black` provides a clean background for video display, especially during initialization or when the video has black borders. The `SizeMode` property determines how the video frame fits within the control: - `StretchImage`: Stretches the image to fill the PictureBox (may distort aspect ratio) - `Zoom`: Maintains aspect ratio while filling the control - `CenterImage`: Centers the image without scaling - `Normal`: Displays the image at its original size For most video applications, `StretchImage` or `Zoom` work best, depending on whether maintaining aspect ratio is important. ## Implementation Steps ### Step 1: Prepare Your Class with Required Variables Add a boolean class member to track when an image is being applied to the PictureBox. 
This guards against overlapping updates when frames arrive in quick succession. Because the frame event may fire on a background thread, declare the field `volatile` so every thread sees its current value:

```cs
private volatile bool applyingPictureBoxImage = false;
```

### Step 2: Initialize Video Settings in the Start Handler

When starting your video capture or playback, ensure the flag is properly initialized:

```cs
private void btnStart_Click(object sender, EventArgs e)
{
    // Reset the flag before starting capture/playback
    applyingPictureBoxImage = false;

    // Your video initialization code here
    // videoCapture1.Start(); or similar SDK call
}
```

### Step 3: Implement the Frame Handler

The core of video rendering is the frame handler. This event fires each time a new video frame is available. Here's how to implement it efficiently:

```cs
private void VideoCapture1_OnVideoFrameBitmap(object sender, VideoFrameBitmapEventArgs e)
{
    // Skip this frame if the previous one is still being applied
    if (applyingPictureBoxImage)
    {
        return;
    }

    applyingPictureBoxImage = true;

    try
    {
        // Store the current image for proper disposal
        var currentImage = pictureBox1.Image;

        // Create a new bitmap from the frame
        pictureBox1.Image = new Bitmap(e.Frame);

        // Dispose of the previous image to prevent memory leaks
        currentImage?.Dispose();
    }
    catch (Exception ex)
    {
        // Consider logging the exception
        Console.WriteLine($"Error updating frame: {ex.Message}");
    }
    finally
    {
        // Ensure the flag is reset even if an exception occurs
        applyingPictureBoxImage = false;
    }
}
```

This implementation includes several important concepts:

1. **Thread safety**: The `applyingPictureBoxImage` flag skips frames that arrive while an update is still in progress.
2. **Memory management**: Disposing of the previous image prevents memory leaks.
3. **Exception handling**: Catching exceptions prevents application crashes during rendering.
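Note that a check-then-set on a boolean flag is not an atomic operation: two threads can both observe `false` and proceed. Where frame events may fire concurrently, a small guard built on `Interlocked.Exchange` closes that window. This is a sketch; `FrameGuard` is a hypothetical helper, not an SDK type:

```cs
using System;
using System.Threading;

public class FrameGuard
{
    // 0 = free, 1 = busy; updated atomically instead of a plain bool
    private int applyingPictureBoxImage;

    // Returns true if the caller acquired the guard and should render this frame;
    // returns false if another thread is already applying a frame.
    public bool TryEnter() => Interlocked.Exchange(ref applyingPictureBoxImage, 1) == 0;

    // Release the guard; call from the handler's finally block.
    public void Exit() => Interlocked.Exchange(ref applyingPictureBoxImage, 0);
}
```

In the handler, replace the flag check with `if (!guard.TryEnter()) return;` and call `guard.Exit()` in the `finally` block.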
### Step 4: Implement Cleanup When Stopping Video

When stopping video capture or playback, clean up resources properly:

```cs
private void btnStop_Click(object sender, EventArgs e)
{
    // Your video stop code here
    // videoCapture1.Stop(); or similar SDK call

    // Wait (with a timeout) for any in-progress frame update to finish.
    // Keep this short: sleeping on the UI thread blocks the message pump,
    // so a frame update queued via BeginInvoke could never complete.
    var waitedMs = 0;
    while (applyingPictureBoxImage && waitedMs < 1000)
    {
        Thread.Sleep(50); // requires using System.Threading;
        waitedMs += 50;
    }

    // Clean up resources
    if (pictureBox1.Image != null)
    {
        pictureBox1.Image.Dispose();
        pictureBox1.Image = null;
    }
}
```

This cleanup process:

1. Waits, with a timeout, for any in-progress frame update to complete
2. Properly disposes of the image
3. Sets the PictureBox image to null for visual cleanup

## Advanced Implementation Considerations

### Handling High Frame Rates

For high-frame-rate video sources, you might want to implement frame skipping to maintain application responsiveness:

```cs
private DateTime lastFrameTime = DateTime.MinValue;
private TimeSpan frameInterval = TimeSpan.FromMilliseconds(33); // about 30 fps

private void VideoCapture1_OnVideoFrameBitmap(object sender, VideoFrameBitmapEventArgs e)
{
    // Skip frames if they're coming too quickly
    if (DateTime.Now - lastFrameTime < frameInterval)
    {
        return;
    }

    if (applyingPictureBoxImage)
    {
        return;
    }

    applyingPictureBoxImage = true;
    lastFrameTime = DateTime.Now;

    // Frame processing code as before...
}
```

### Cross-Thread Invocation

When handling video frames from background threads, you'll need to use cross-thread invocation:

```cs
private void VideoCapture1_OnVideoFrameBitmap(object sender, VideoFrameBitmapEventArgs e)
{
    if (applyingPictureBoxImage)
    {
        return;
    }

    applyingPictureBoxImage = true;

    if (pictureBox1.InvokeRequired)
    {
        // Note: if the SDK reuses the frame object, clone it here,
        // before the delegate is queued, rather than inside it.
        pictureBox1.BeginInvoke(new Action(() =>
        {
            var currentImage = pictureBox1.Image;
            pictureBox1.Image = new Bitmap(e.Frame);
            currentImage?.Dispose();
            applyingPictureBoxImage = false;
        }));
    }
    else
    {
        // Direct update code as before...
    }
}
```

## Performance Optimization Tips

### Reduce Bitmap Creation Overhead

Creating a new Bitmap for each frame can be expensive. Consider reusing a Bitmap object. Note one pitfall: after the first frame, `pictureBox1.Image` and the reusable bitmap are the same object, so unconditionally disposing the "old" image would destroy the bitmap currently on screen:

```cs
private Bitmap displayBitmap;

private void VideoCapture1_OnVideoFrameBitmap(object sender, VideoFrameBitmapEventArgs e)
{
    if (applyingPictureBoxImage)
    {
        return;
    }

    applyingPictureBoxImage = true;

    try
    {
        // (Re)create the reusable bitmap only when the frame size changes
        if (displayBitmap == null ||
            displayBitmap.Width != e.Frame.Width ||
            displayBitmap.Height != e.Frame.Height)
        {
            displayBitmap?.Dispose();
            displayBitmap = new Bitmap(e.Frame.Width, e.Frame.Height);
        }

        // Copy the frame into the reusable bitmap
        using (Graphics g = Graphics.FromImage(displayBitmap))
        {
            g.DrawImage(e.Frame, 0, 0, e.Frame.Width, e.Frame.Height);
        }

        var oldImage = pictureBox1.Image;
        if (!ReferenceEquals(oldImage, displayBitmap))
        {
            // First frame (or a previous non-reused image): swap and dispose
            pictureBox1.Image = displayBitmap;
            oldImage?.Dispose();
        }
        else
        {
            // Same bitmap instance: just repaint with the new pixel data
            pictureBox1.Invalidate();
        }
    }
    finally
    {
        applyingPictureBoxImage = false;
    }
}
```

### Consider Using Double Buffering

For smoother display, enable double buffering on your form:

```cs
// In your form constructor
this.DoubleBuffered = true;
```

## Troubleshooting Common Issues

### Memory Leaks

If your application experiences increasing memory usage, check:

- Proper disposal of old Bitmap objects
- References to frames that might prevent garbage collection
- Whether frames are being skipped when necessary

### Flickering Display

If video display flickers:

- Ensure double buffering is enabled
- Check if multiple threads are updating the PictureBox simultaneously
- Consider implementing a more sophisticated frame synchronization mechanism

### High CPU Usage

If rendering causes high CPU usage:

- Implement frame skipping as shown above
- Consider reducing the frame rate of the source if possible
- Optimize bitmap handling to reduce GC pressure

## Required Dependencies

To implement this solution, you'll need:

- .NET Framework or .NET Core/5+
- SDK redist files for the specific video SDK you're using

## Conclusion

Implementing video rendering on a PictureBox
control provides a straightforward way to display video in Windows Forms applications. By following the patterns outlined in this guide, you can achieve smooth video display while avoiding common pitfalls like memory leaks, thread safety issues, and performance bottlenecks. Remember that while PictureBox is suitable for many applications, high-performance video applications might benefit from more specialized rendering approaches using DirectX or OpenGL. --- For more code samples, visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) repository. ---END OF PAGE--- # Local File: .\dotnet\general\code-samples\exclude-filters.md --- title: Excluding DirectShow Filters in .NET Applications description: Learn how to identify problematic DirectShow filters and exclude them from your multimedia processing pipeline. Comprehensive guide with code examples for .NET developers working with video capture, editing, and playback applications. sidebar_label: Excluding DirectShow Filters --- # Excluding DirectShow Filters in .NET Applications [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) ## Introduction When developing multimedia applications in .NET, you'll frequently interact with DirectShow — Microsoft's framework for multimedia streaming. DirectShow uses a filter-based architecture where individual components (filters) process media data. However, not all filters are created equal. Some can cause performance issues, compatibility problems, or simply don't meet your application's specific needs. This guide explores how to effectively identify and exclude problematic DirectShow filters from your application's processing pipeline. 
## Understanding DirectShow Filters DirectShow filters are COM objects that perform specific operations on media data, such as: - **Source filters**: Read media from files, capture devices, or network streams - **Transform filters**: Process or convert media data (decoders, encoders, effects) - **Renderer filters**: Display video or play audio When DirectShow builds a filter graph, it automatically selects filters based on merit (priority) and compatibility. This automatic selection sometimes includes third-party filters that may: - Reduce performance - Cause stability issues - Introduce compatibility problems - Override preferred processing methods ## Common Issues with DirectShow Filters ### Decoder Conflicts Multiple decoders installed on a system can compete to handle the same media formats. For example: - NVIDIA's video decoder might conflict with Intel's hardware decoder - Third-party codec packs might introduce low-quality decoders - Legacy decoders might be selected over newer, more efficient ones ### Performance Bottlenecks Some filters can significantly impact performance: - Non-optimized video processing filters - Filters without hardware acceleration support - Debugging filters that add logging overhead ### Compatibility Problems Not all filters work well together: - Version mismatches between filters - Filters with different pixel format expectations - Non-standard implementation of interfaces ## When to Exclude DirectShow Filters Consider excluding DirectShow filters when: 1. You notice unexplained performance issues during media playback or processing 2. Your application crashes when handling specific media formats 3. Media quality is unexpectedly poor 4. You want to enforce consistent behavior across different user systems 5. You're implementing a custom processing pipeline with specific requirements ## Implementing Filter Exclusion Our .NET SDKs provide a straightforward API for managing DirectShow filter exclusions. 
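Blacklisting requires the exact registered filter name, so it helps to enumerate what is actually installed on the target machine. Below is a minimal sketch using the open-source DirectShowLib package (a third-party NuGet dependency assumed here, not part of the VisioForge SDKs) to print the names in the standard "DirectShow Filters" category:

```csharp
using System;
using DirectShowLib; // third-party NuGet package, assumed here

public static class InstalledFilterLister
{
    public static void PrintFilterNames()
    {
        // Enumerate the legacy "DirectShow Filters" registry category
        foreach (DsDevice device in DsDevice.GetDevicesOfCat(FilterCategory.LegacyAmFilterCategory))
        {
            // These are the exact, case-sensitive names to pass
            // to the blacklist API
            Console.WriteLine(device.Name);
            device.Dispose();
        }
    }
}
```

Run this once on a problem machine and copy the names verbatim into your blacklist calls.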
### Clearing the Blacklist

Before setting up your exclusion list, you may want to clear any previously blacklisted filters:

```csharp
// Clear any existing blacklisted filters
videoProcessor.DirectShow_Filters_Blacklist_Clear();
```

This ensures you're starting with a clean slate and your exclusion list contains only the filters you explicitly specify.

### Adding Filters to the Blacklist

To exclude specific filters, use the `DirectShow_Filters_Blacklist_Add` method with the exact filter name:

```csharp
// Exclude specific filters by name
videoProcessor.DirectShow_Filters_Blacklist_Add("NVIDIA NVENC Encoder");
videoProcessor.DirectShow_Filters_Blacklist_Add("Intel® Hardware H.264 Encoder");
videoProcessor.DirectShow_Filters_Blacklist_Add("Fraunhofer IIS MPEG Audio Layer 3 Decoder");
```

### Complete Code Example

Here's a more complete example demonstrating filter exclusion in a video capture application:

```csharp
using System;
using VisioForge.Core.VideoCapture;

public class FilterExclusionExample
{
    private VideoCaptureCore captureCore;

    public void SetupFilterExclusions()
    {
        captureCore = new VideoCaptureCore();

        // Clear any existing blacklisted filters
        captureCore.DirectShow_Filters_Blacklist_Clear();

        // Add problematic filters to the blacklist
        captureCore.DirectShow_Filters_Blacklist_Add("SampleGrabber");
        captureCore.DirectShow_Filters_Blacklist_Add("Overlay Mixer");
        captureCore.DirectShow_Filters_Blacklist_Add("VirtualDub H.264 Decoder");

        Console.WriteLine("DirectShow filters successfully excluded.");
    }

    // Additional application logic...
}
```

## Best Practices for Filter Exclusion

### Identify Before Excluding

Before blacklisting filters, identify which ones are causing issues: 1. Use DirectShow diagnostic tools like GraphEdit or GraphStudio 2. Enable logging in your application to track which filters are being used 3. 
Test with different filter configurations to isolate problematic components

### Be Specific with Filter Names

Use exact, case-sensitive filter names when excluding:

```csharp
// Correct - uses exact filter name
videoProcessor.DirectShow_Filters_Blacklist_Add("ffdshow Video Decoder");

// Incorrect - may exclude unintended filters or none at all
videoProcessor.DirectShow_Filters_Blacklist_Add("ffdshow");
```

### Consider Alternative Approaches

Filter exclusion is not always the best solution:

- **Merit adjustment**: The SDK allows adjusting filter merit instead of complete exclusion
- **Explicit graph building**: Build the filter graph manually with preferred filters
- **Alternative frameworks**: Consider Media Foundation for newer applications

## Troubleshooting

### Filter Still Being Used Despite Blacklisting

If a filter continues to be used despite being blacklisted:

1. Verify you're using the exact filter name (case-sensitive)
2. Ensure the blacklist is set before building the filter graph
3. Check if the filter is being inserted through an alternative method

### Performance Issues After Blacklisting

If performance degrades after blacklisting certain filters:

1. The blacklisted filter might have been providing hardware acceleration
2. The replacement filter might be less efficient
3. The filter graph might be more complex without the excluded filter

### Application Crashes After Filter Exclusion

If your application becomes unstable after filter exclusion:

1. Some filters might be required for proper operation
2. The alternative filter path might have compatibility issues
3. The filter graph might be incomplete without certain filters

## Conclusion

Excluding problematic DirectShow filters provides a powerful tool for optimizing and stabilizing your multimedia applications. By carefully identifying and blacklisting problematic filters, you can ensure consistent behavior, better performance, and higher quality media processing across different user systems.
Remember to test thoroughly after implementing filter exclusions, as the DirectShow filter graph may behave differently when certain components are unavailable. --- Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples and implementation examples. ---END OF PAGE--- # Local File: .\dotnet\general\code-samples\image-onvideoframebuffer.md --- title: Drawing Images with OnVideoFrameBuffer in .NET description: Learn how to implement image drawing using the OnVideoFrameBuffer event in .NET applications. This step-by-step guide with C# code samples shows you how to efficiently overlay images on video frames in real-time for video processing applications. sidebar_label: Drawing Images with OnVideoFrameBuffer --- # Drawing Images with OnVideoFrameBuffer in .NET [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) ## Introduction The `OnVideoFrameBuffer` event provides a powerful way to manipulate video frames in real-time. This guide demonstrates how to overlay images on video frames using this event in .NET applications. This technique is useful for adding watermarks, logos, or other visual elements to video content during processing or playback. ## Understanding the Process When working with video frames in .NET, you need to: 1. Load your image (logo, watermark, etc.) into memory 2. Convert the image to a compatible buffer format 3. Listen for the `OnVideoFrameBuffer` event 4. Draw the image onto each video frame as it's processed 5. 
Update the frame data to display the changes ## Code Implementation Let's walk through the implementation step by step: ### Step 1: Load Your Image First, load the image file you want to overlay on the video: ```cs // Bitmap loading from file private Bitmap logoImage = new Bitmap(@"logo24.jpg"); // You can also use PNG with alpha channel for transparency //private Bitmap logoImage = new Bitmap(@"logo32.png"); ``` ### Step 2: Prepare Memory Buffers Initialize pointers for the image buffer: ```cs // Logo RGB24/RGB32 buffer private IntPtr logoImageBuffer = IntPtr.Zero; private int logoImageBufferSize = 0; ``` ### Step 3: Implement the OnVideoFrameBuffer Event Handler The full event handler implementation: ```cs private void VideoCapture1_OnVideoFrameBuffer(Object sender, VideoFrameBufferEventArgs e) { // Create logo buffer if not allocated or have zero size if (logoImageBuffer == IntPtr.Zero || logoImageBufferSize == 0) { if (logoImageBuffer == IntPtr.Zero) { if (logoImage.PixelFormat == PixelFormat.Format32bppArgb) { logoImageBufferSize = ImageHelper.GetStrideRGB32(logoImage.Width) * logoImage.Height; logoImageBuffer = Marshal.AllocCoTaskMem(logoImageBufferSize); } else { logoImageBufferSize = ImageHelper.GetStrideRGB24(logoImage.Width) * logoImage.Height; logoImageBuffer = Marshal.AllocCoTaskMem(logoImageBufferSize); } } else { if (logoImage.PixelFormat == PixelFormat.Format32bppArgb) { logoImageBufferSize = ImageHelper.GetStrideRGB32(logoImage.Width) * logoImage.Height; Marshal.FreeCoTaskMem(logoImageBuffer); logoImageBuffer = Marshal.AllocCoTaskMem(logoImageBufferSize); } else { logoImageBufferSize = ImageHelper.GetStrideRGB24(logoImage.Width) * logoImage.Height; Marshal.FreeCoTaskMem(logoImageBuffer); logoImageBuffer = Marshal.AllocCoTaskMem(logoImageBufferSize); } } if (logoImage.PixelFormat == PixelFormat.Format32bppArgb) { ImageHelper.BitmapToIntPtr(logoImage, logoImageBuffer, logoImage.Width, logoImage.Height, PixelFormat.Format32bppArgb); } else { 
ImageHelper.BitmapToIntPtr(logoImage, logoImageBuffer, logoImage.Width, logoImage.Height, PixelFormat.Format24bppRgb); } } // Draw image if (logoImage.PixelFormat == PixelFormat.Format32bppArgb) { FastImageProcessing.Draw_RGB32OnRGB24(logoImageBuffer, logoImage.Width, logoImage.Height, e.Frame.Data, e.Frame.Width, e.Frame.Height, 0, 0); } else { FastImageProcessing.Draw_RGB24OnRGB24(logoImageBuffer, logoImage.Width, logoImage.Height, e.Frame.Data, e.Frame.Width, e.Frame.Height, 0, 0); } e.UpdateData = true; } ``` ## Detailed Explanation ### Memory Management The code handles both 24-bit and 32-bit image formats. Here's what happens: 1. **Buffer Initialization Check**: The code first checks if the logo buffer needs to be created or recreated. 2. **Format Detection**: It determines whether to use RGB24 or RGB32 format based on the loaded image: - RGB24: Standard 24-bit color (8 bits each for R, G, B) - RGB32: 32-bit color with alpha channel for transparency (8 bits each for R, G, B, A) 3. **Memory Allocation**: Allocates unmanaged memory using `Marshal.AllocCoTaskMem()` to store the image data. 4. **Image Conversion**: Converts the Bitmap to raw pixel data in the allocated buffer using `ImageHelper.BitmapToIntPtr()`. ### Drawing Process Once the buffer is prepared, drawing takes place: 1. **Format-Specific Drawing**: The code selects the appropriate drawing method based on the image format: - `FastImageProcessing.Draw_RGB32OnRGB24()` for 32-bit images with transparency - `FastImageProcessing.Draw_RGB24OnRGB24()` for standard 24-bit images 2. **Position Parameters**: The `0, 0` parameters specify where to draw the image (top-left corner in this example). 3. **Frame Update**: Setting `e.UpdateData = true` ensures the modified frame data is used for display or further processing. ## Best Practices for Image Overlay For optimal performance when overlaying images on video frames: 1. 
**Memory Management**: Always free allocated memory when it's no longer needed to prevent memory leaks. 2. **Buffer Reuse**: Create the buffer once and reuse it for subsequent frames rather than recreating it for each frame. 3. **Image Size Considerations**: Use appropriately sized images; overlaying large images can impact performance. 4. **Format Selection**: - Use PNG (RGB32) when you need transparency - Use JPG (RGB24) when transparency isn't required (more efficient) 5. **Position Calculation**: For dynamic positioning, calculate coordinates based on frame dimensions: ```cs // Example: Position logo at bottom-right corner with 10px padding int xPos = e.Frame.Width - logoImage.Width - 10; int yPos = e.Frame.Height - logoImage.Height - 10; ``` ## Error Handling When implementing this functionality, consider adding error handling: ```cs try { // Your existing implementation } catch (OutOfMemoryException ex) { // Handle memory allocation failures Console.WriteLine("Failed to allocate memory: " + ex.Message); } catch (Exception ex) { // Handle other exceptions Console.WriteLine("Error during frame processing: " + ex.Message); } finally { // Optional cleanup code } ``` ## Performance Optimization For high-performance applications, consider these optimizations: 1. **Buffer Pre-allocation**: Initialize buffers during application startup rather than during video processing. 2. **Conditional Processing**: Only process frames that need the overlay (e.g., skip processing for certain frames). 3. **Parallel Processing**: For complex operations, consider using parallel processing techniques. ## Conclusion The `OnVideoFrameBuffer` event provides a powerful way to manipulate video frames in real-time. By following this guide, you can efficiently overlay images on video content for watermarking, branding, or visual enhancement purposes. 
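One housekeeping detail the event handler above leaves implicit (see best practice 1) is releasing the unmanaged buffer when video processing ends. A minimal sketch, assuming the same field names as the sample; `MainForm` and `ReleaseLogoBuffer` are hypothetical names:

```cs
using System;
using System.Drawing;
using System.Runtime.InteropServices;

public partial class MainForm
{
    // Fields matching the sample handler above
    private Bitmap logoImage;
    private IntPtr logoImageBuffer = IntPtr.Zero;
    private int logoImageBufferSize = 0;

    // Call after the video pipeline has stopped raising frame events,
    // e.g. from your Stop button handler or FormClosing.
    private void ReleaseLogoBuffer()
    {
        if (logoImageBuffer != IntPtr.Zero)
        {
            // Must match the allocator used in the handler (AllocCoTaskMem)
            Marshal.FreeCoTaskMem(logoImageBuffer);
            logoImageBuffer = IntPtr.Zero;
            logoImageBufferSize = 0;
        }

        logoImage?.Dispose();
        logoImage = null;
    }
}
```

Freeing only after the pipeline has stopped matters: releasing the buffer while the frame event can still fire would hand a dangling pointer to the drawing functions.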
The technique demonstrated here works across multiple SDK products and can be adapted for various video processing scenarios in your .NET applications. --- Looking for more code samples? Visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples) for additional examples and resources. ---END OF PAGE--- # Local File: .\dotnet\general\code-samples\index.md --- title: Essential .NET SDK Code Samples for Developers description: Practical implementation examples for DirectShow filters, audio/video processing, rendering techniques, and media manipulation in .NET applications - designed to accelerate your development workflow. sidebar_label: Code Samples order: -4 --- # .NET SDK Code Samples: Practical Implementation Guide In this guide, you'll find a collection of practical code samples and implementation techniques for working with our .NET SDKs. These examples address common development scenarios and demonstrate how to leverage our libraries effectively for media processing applications. ## DirectShow Filter Implementation DirectShow provides a powerful framework for handling multimedia streams. Our SDKs simplify working with these components through well-designed interfaces and helper methods. ### Media Indexing and Format Handling - [ASF and WMV Files Indexing](asf-wmv-files-indexing.md) - Learn techniques for properly indexing Windows Media formats to enable seeking and efficient playback position control. This sample demonstrates how to establish accurate navigation points within media files and handle large ASF/WMV content effectively. ### Custom Filter Integration - [Custom DirectShow Filter Interface Usage](custom-filter-interface.md) - This tutorial walks through the process of implementing and connecting custom DirectShow filters within your application. You'll learn how to create filter interfaces that integrate seamlessly with the existing DirectShow architecture while adding your own specialized functionality. 
### Third-Party Integration - [Integrating Third-Party Video Processing Filters](3rd-party-video-effects.md) - Discover how to incorporate external video processing components into your DirectShow filter graph. This example demonstrates proper filter registration, connection methods, and parameter configuration for third-party video effects and transformations. ### Filter Management - [Manual DirectShow Filter Uninstallation](uninstall-directshow-filter.md) - This guide explains the registry entries, COM object registration, and system directories involved in completely removing DirectShow filters when standard uninstallation isn't sufficient or available. - [Excluding Specific DirectShow Filters](exclude-filters.md) - Learn techniques for selectively bypassing certain DirectShow filters in your filter graph construction. This sample shows how to exclude specific decoders, encoders, or processing filters while maintaining proper media handling. ## Audio and Video Processing Techniques Manipulating audio and video streams is a core requirement for many media applications. These samples demonstrate different approaches to accessing and modifying media data. ### Real-time Video Effects - [Custom Video Effects Using Frame Events](custom-video-effects.md) - Learn two powerful approaches for implementing real-time video effects through the OnVideoFrameBitmap and OnVideoFrameBuffer events. This comprehensive sample demonstrates how to access video frames, apply effects, and optimize performance. ### Advanced Overlay Techniques - [Multi-text Overlay Drawing](draw-multitext-onvideoframebuffer.md) - This sample demonstrates techniques for rendering multiple text elements on video frames with precise positioning and style control. You'll learn how to handle text formatting, alpha blending, and performance optimization. - [Text Overlay Implementation](text-onvideoframebuffer.md) - A focused tutorial on adding dynamic text annotations to video content. 
This example covers font selection, positioning, and real-time updates of overlay text. - [Image Overlay Integration](image-onvideoframebuffer.md) - Learn how to composite images onto video frames with proper scaling, alpha blending, and positioning. This example shows techniques for watermarking, logo placement, and dynamic image overlays. ### Video Transformation - [Manual Zoom Effect Implementation](zoom-onvideoframebuffer.md) - This detailed example demonstrates how to implement a custom zoom functionality by directly manipulating video frame buffers. You'll learn techniques for region selection, scaling algorithms, and smooth transitions between zoom levels. ### Bitmap-Based Frame Processing - [OnVideoFrameBitmap Event Usage](onvideoframebitmap-usage.md) - This guide explores the bitmap-based approach to video frame processing, which offers simplified access to frame data through GDI+ compatible objects. Learn how this differs from buffer-based processing and when to choose each approach. ## Video Rendering Solutions Displaying video content with flexibility and performance requires understanding various rendering techniques. These samples demonstrate different approaches for visual presentation. ### Windows Forms Integration - [PictureBox Video Rendering](draw-video-picturebox.md) - This sample demonstrates how to properly render video content within a standard Windows Forms PictureBox control. You'll learn about frame timing, aspect ratio preservation, and performance considerations. ### Multi-Display Functionality - [Multiple Renderer Zoom Configuration](zoom-video-multiple-renderer.md) - Learn techniques for independently controlling zoom levels across multiple video renderers. This sample is essential for applications requiring synchronized but visually distinct video outputs. - [WPF Multi-screen Video Output](multiple-screens-wpf.md) - This example shows how to implement multiple independent video display surfaces within a WPF application. 
You'll learn proper control initialization, resource management, and synchronization techniques. ### Renderer Selection and Customization - [Video Renderer Selection (WinForms)](select-video-renderer-winforms.md) - This tutorial explains how to choose and configure the most appropriate video renderer for your Windows Forms application. You'll understand the tradeoffs between EVR, VMR9, and other renderer types. ### User Interaction - [Mouse Wheel Event Integration](mouse-wheel-usage.md) - Learn how to handle mouse wheel events for interactive video displays. This sample demonstrates zoom control, timeline scrubbing, and other wheel-based interactions. - [Custom Image Video View](video-view-set-custom-image.md) - This guide shows how to replace the standard video frame with a custom image for scenarios like connection loss, buffering states, or application-specific messaging. ## Media Information and Visualization These samples demonstrate how to extract information from media files and create useful visualizations. ### File Analysis - [Media File Information Extraction](read-file-info.md) - Learn techniques for reading detailed metadata, stream properties, and format information from media files. This example shows how to access duration, bitrate, codec information, and other essential media properties. ### Audio Visualization - [VU Meter and Waveform Visualization](vu-meters.md) - This comprehensive sample demonstrates how to create real-time audio visualizations including volume unit meters and waveform displays. You'll learn about audio level analysis, drawing techniques, and synchronization with playback. ## Performance Optimization Each sample in this collection is designed with performance considerations in mind. You'll find techniques for efficient buffer handling, memory management, and processing optimizations that help you build responsive media applications, even when working with high-resolution content or applying complex effects. 
## Cross-Platform Considerations While focusing on .NET implementations, many of the concepts demonstrated in these samples apply to other platforms as well. Where appropriate, we've noted platform-specific considerations and alternative approaches for cross-platform development scenarios. ## Getting Started To use these examples effectively, we recommend reviewing the appropriate SDK documentation for your specific product version. Each sample includes the necessary references and initialization code, but may require configuration based on your development environment and target platform. These code samples serve as building blocks for your media applications, providing proven implementation patterns that you can adapt and extend for your specific requirements. ---END OF PAGE--- # Local File: .\dotnet\general\code-samples\mouse-wheel-usage.md --- title: Implementing Mouse Wheel Events in .NET SDKs description: Learn how to implement mouse wheel events in .NET applications for video processing. This comprehensive guide includes code examples, best practices, troubleshooting tips, and performance optimization techniques for developers. sidebar_label: Mouse Wheel Event Usage --- # Implementing Mouse Wheel Events in .NET SDKs [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) ## Introduction to Mouse Wheel Events Mouse wheel events provide an intuitive way for users to interact with video content in multimedia applications. Whether you're developing a video player, editor, or capture application, implementing proper mouse wheel event handling enhances user experience by allowing smooth zooming, scrolling, or timeline navigation. 
In .NET applications, the `MouseWheel` event is triggered when the user rotates the mouse wheel. This event provides crucial information about the direction and intensity of the wheel movement through the `MouseEventArgs` parameter. ## Why Implement Mouse Wheel Events? Mouse wheel functionality offers several benefits to your video applications: - **Improved User Experience**: Enables intuitive zoom functionality in video viewers - **Enhanced Navigation**: Allows quick timeline scrubbing in video editors - **Volume Control**: Provides convenient volume adjustment in media players - **Efficient UI Interaction**: Reduces reliance on on-screen controls ## Basic Implementation ### Setting Up Event Handlers To implement mouse wheel functionality in your .NET application, you need to set up three key event handlers: 1. `MouseEnter`: Ensures the control gains focus when the mouse enters 2. `MouseLeave`: Releases focus when the mouse leaves 3. `MouseWheel`: Handles the actual wheel rotation event Here's a basic implementation: ```cs private void VideoView1_MouseEnter(object sender, EventArgs e) { if (!VideoView1.Focused) { VideoView1.Focus(); } } private void VideoView1_MouseLeave(object sender, EventArgs e) { if (VideoView1.Focused) { VideoView1.Parent.Focus(); } } private void VideoView1_MouseWheel(object sender, MouseEventArgs e) { mmLog.Text += "Delta: " + e.Delta + Environment.NewLine; } ``` The `MouseWheel` event handler receives a `MouseEventArgs` parameter that includes the `Delta` property. This value indicates the direction and distance the wheel has rotated: - **Positive Delta**: The wheel rotated forward (away from the user) - **Negative Delta**: The wheel rotated backward (toward the user) - **Delta Magnitude**: Indicates the intensity of the rotation ## Advanced Implementation Techniques ### Implementing Zoom Functionality One common use of the mouse wheel in video applications is to zoom in and out. 
Here's how you might implement zoom functionality: ```cs private void VideoView1_MouseWheel(object sender, MouseEventArgs e) { // Determine zoom direction based on delta if (e.Delta > 0) { // Zoom in code ZoomIn(0.1); // Increase zoom by 10% } else { // Zoom out code ZoomOut(0.1); // Decrease zoom by 10% } } private void ZoomIn(double factor) { // Implementation depends on your SDK's specific API VideoView1.Zoom = Math.Min(VideoView1.Zoom + factor, 3.0); // Max zoom of 300% } private void ZoomOut(double factor) { // Implementation depends on your SDK's specific API VideoView1.Zoom = Math.Max(VideoView1.Zoom - factor, 0.5); // Min zoom of 50% } ``` ### Timeline Navigation For video editing applications, the mouse wheel can be used to navigate through the timeline: ```cs private void TimelineControl_MouseWheel(object sender, MouseEventArgs e) { // Calculate how much to move based on delta and timeline length double moveFactor = e.Delta / 120.0; // Normalize to increments of 1.0 double moveAmount = moveFactor * 5.0; // 5 seconds per wheel "click" // Move position double newPosition = TimelineControl.CurrentPosition + moveAmount; // Ensure we stay within bounds newPosition = Math.Max(0, Math.Min(newPosition, TimelineControl.Duration)); // Apply the new position TimelineControl.CurrentPosition = newPosition; } ``` ### Volume Control Another common use case is controlling volume in media player applications: ```cs private void VideoView1_MouseWheel(object sender, MouseEventArgs e) { // Calculate volume change based on delta float volumeChange = e.Delta / 120.0f * 0.05f; // 5% per wheel "click" // Apply volume change float newVolume = VideoView1.Volume + volumeChange; // Ensure volume stays within 0-1 range newVolume = Math.Max(0.0f, Math.Min(newVolume, 1.0f)); // Set the new volume VideoView1.Volume = newVolume; // Optional: Display volume indicator ShowVolumeIndicator(newVolume); } ``` ## Handling Focus Management Proper focus management is crucial for mouse wheel 
events to work correctly. The example code shows a basic implementation, but in more complex applications, you may need a more sophisticated approach:

```cs
private Control _previouslyFocused;

private void VideoView1_MouseEnter(object sender, EventArgs e)
{
    // Store the previously focused control (assumes this handler lives in the containing Form)
    _previouslyFocused = this.ActiveControl;

    // Focus our control
    VideoView1.Focus();

    // Optional: Visual indication that the control has focus
    VideoView1.BorderStyle = BorderStyle.FixedSingle;
}

private void VideoView1_MouseLeave(object sender, EventArgs e)
{
    // Return focus to previous control if appropriate
    if (_previouslyFocused != null && _previouslyFocused.CanFocus)
    {
        _previouslyFocused.Focus();
    }
    else
    {
        // If no previous control, focus the parent
        VideoView1.Parent.Focus();
    }

    // Reset visual indication
    VideoView1.BorderStyle = BorderStyle.None;
}
```

## Performance Considerations

When implementing mouse wheel events, consider these performance tips:

1. **Debounce Wheel Events**: Mouse wheels can generate many events in quick succession
2. **Optimize Calculations**: Avoid complex calculations in the wheel event handler
3.
**Use Animation**: For smooth zooming, consider using animation rather than abrupt changes

Here's an example of debouncing wheel events:

```cs
private DateTime _lastWheelEvent = DateTime.MinValue;
private const int DebounceMs = 50;

private void VideoView1_MouseWheel(object sender, MouseEventArgs e)
{
    // Check if enough time has passed since the last event
    TimeSpan elapsed = DateTime.Now - _lastWheelEvent;
    if (elapsed.TotalMilliseconds < DebounceMs)
    {
        return; // Ignore event if it's too soon
    }

    // Process the wheel event
    ProcessWheelEvent(e.Delta);

    // Update the last event time
    _lastWheelEvent = DateTime.Now;
}
```

## Cross-Platform Considerations

If you're developing cross-platform .NET applications, be aware that mouse wheel behavior can vary:

- **Windows**: Typically 120 units per "click"
- **macOS**: May have different sensitivity settings
- **Linux**: Can vary based on distribution and configuration

Your code should account for these differences:

```cs
// Requires: using System.Runtime.InteropServices;
private void VideoView1_MouseWheel(object sender, MouseEventArgs e)
{
    // Normalize delta based on platform
    double normalizedDelta;

    if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows))
    {
        normalizedDelta = e.Delta / 120.0;
    }
    else if (RuntimeInformation.IsOSPlatform(OSPlatform.OSX))
    {
        normalizedDelta = e.Delta / 100.0;
    }
    else
    {
        normalizedDelta = e.Delta / 120.0; // Default for Linux and others
    }

    // Use normalized delta for calculations
    ApplyZoom(normalizedDelta);
}
```

## Troubleshooting Common Issues

### Mouse Wheel Events Not Firing

If your mouse wheel events aren't firing, check:

1. **Focus Issues**: Ensure the control has focus when the mouse is over it
2. **Event Registration**: Verify the event handler is properly registered
3. **Control Properties**: Some controls need specific properties set to receive wheel events

### Inconsistent Behavior

If wheel events behave inconsistently:

1. **Delta Normalization**: Ensure you're properly normalizing delta values
2.
**User Settings**: Account for user-specific mouse settings 3. **Hardware Variations**: Different mouse hardware can produce different delta values ## Conclusion Mouse wheel event handling is an essential aspect of creating intuitive and user-friendly video applications. By implementing the techniques outlined in this guide, you can enhance your .NET video applications with smooth, intuitive controls that improve the overall user experience. The implementation can vary depending on your specific requirements, but the core principles remain the same: handle focus properly, normalize wheel delta values, and apply appropriate changes based on user input. --- Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples. ---END OF PAGE--- # Local File: .\dotnet\general\code-samples\multiple-screens-wpf.md --- title: Multiple Output Video Screens in WPF Applications description: Learn how to implement multiple video output screens in WPF applications using C# and the Image control. This guide covers event handling, memory management, rendering optimizations, and practical implementation techniques for creating high-performance multi-display video applications. sidebar_label: Multiple Output Video Screens for WPF Controls --- # Implementing Multiple Video Output Screens in WPF Applications [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) When developing WPF applications that require handling multiple video feeds simultaneously, developers often face challenges with performance, synchronization, and resource management. 
This guide provides a comprehensive approach to implementing multiple video output screens in your WPF applications using C# and the Image control. ## Getting Started with Multiple Video Screens Check the installation guide for WPF [here](../../install/index.md). To begin implementing multiple video outputs in your WPF application, you'll need to: 1. Add the appropriate Video View control to your application 2. Set up event handling for video frame processing 3. Configure your rendering pipeline for optimal performance ### Setting Up Your WPF Project First, place the `VisioForge.Core.UI.WPF.VideoView` control on your WPF window. It's recommended to give this control a descriptive name, such as `videoView`, for clarity in your code. This control will serve as your primary video display element. ### Handling Video Frames The key to creating multiple output screens is proper event handling. You'll need to subscribe to the "OnVideoFrameBuffer" event for your SDK control. This event provides access to the raw video frame data that you can then distribute to multiple display elements. ## Implementing the Video Frame Handler Below is a sample implementation of the video frame handler that captures incoming frames and renders them to a video view: ```cs private void VideoCapture1_OnVideoFrameBuffer(object sender, VideoFrameBufferEventArgs e) { videoView.RenderFrame(e); } ``` This simple handler receives video frames through the `VideoFrameBufferEventArgs` parameter and passes them to the `RenderFrame` method of your video view control. 
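If you render frames into a plain WPF `Image` control instead of the `VideoView`, you need the frame's buffer geometry to copy pixels correctly. Below is a minimal sketch of the stride and buffer-size math for a BGRA32 frame; the `WriteableBitmap` wiring in the comments is an assumption about your event args, not the SDK's exact API:

```csharp
using System;

// Stride = bytes per row; BGRA32 frames use 4 bytes per pixel.
// These helper names are ours, not SDK API.
static int StrideFor(int width, int bitsPerPixel) =>
    (width * bitsPerPixel + 7) / 8;

static int BufferSizeFor(int width, int height, int bitsPerPixel) =>
    StrideFor(width, bitsPerPixel) * height;

// Inside an OnVideoFrameBuffer handler you would then do roughly:
//   var wb = new WriteableBitmap(w, h, 96, 96, PixelFormats.Bgra32, null);
//   wb.WritePixels(new System.Windows.Int32Rect(0, 0, w, h),
//                  frameDataPointer, BufferSizeFor(w, h, 32), StrideFor(w, 32));
//   imageControl.Source = wb;

Console.WriteLine(BufferSizeFor(1920, 1080, 32)); // 8294400 bytes per Full HD frame
```

Knowing this size up front also tells you how much memory each additional view costs per frame, which motivates the pooling and frame-skipping techniques discussed below.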
## Advanced Implementation Techniques

### Creating Dynamic Video Views

For applications requiring a variable number of video outputs, you can dynamically create video view controls:

```cs
private List<VisioForge.Core.UI.WPF.VideoView> videoViews = new List<VisioForge.Core.UI.WPF.VideoView>();

private void CreateVideoView(Grid container, int row, int column)
{
    var videoView = new VisioForge.Core.UI.WPF.VideoView();

    Grid.SetRow(videoView, row);
    Grid.SetColumn(videoView, column);

    container.Children.Add(videoView);
    videoViews.Add(videoView);
}

// Usage example:
// CreateVideoView(mainGrid, 0, 0);
// CreateVideoView(mainGrid, 0, 1);
```

### Distributing Video Frames to Multiple Views

When working with multiple video views, you need to distribute incoming frames to all active views:

```cs
private void VideoCapture1_OnVideoFrameBuffer(object sender, VideoFrameBufferEventArgs e)
{
    // Render to all video views
    foreach (var view in videoViews)
    {
        view.RenderFrame(e);
    }
}
```

### Memory Management Considerations

When working with multiple video outputs, memory management becomes a critical concern. Video frames can consume significant memory, especially at higher resolutions. Consider implementing a frame pooling mechanism:

```cs
private ConcurrentQueue<VideoFrame> framePool = new ConcurrentQueue<VideoFrame>();
private const int MaxPoolSize = 10;

private VideoFrame GetFrameFromPool()
{
    if (framePool.TryDequeue(out var frame))
    {
        return frame;
    }

    return new VideoFrame();
}

private void ReturnFrameToPool(VideoFrame frame)
{
    frame.Clear();

    if (framePool.Count < MaxPoolSize)
    {
        framePool.Enqueue(frame);
    }
}
```

## Performance Optimization Strategies

### Reducing Render Load

For multiple video views, consider these optimization techniques:

1. **Adaptive resolution**: Scale down the resolution for secondary displays
2. **Frame skipping**: Not every view needs to update at full frame rate
3.
**Asynchronous rendering**: Offload rendering to background threads

```cs
private VisioForge.Core.UI.WPF.VideoView primaryVideoView;
private List<VisioForge.Core.UI.WPF.VideoView> secondaryVideoViews = new List<VisioForge.Core.UI.WPF.VideoView>();
private int frameCounter;

private void VideoCapture1_OnVideoFrameBuffer(object sender, VideoFrameBufferEventArgs e)
{
    // Primary view gets full resolution, full frame rate
    primaryVideoView.RenderFrame(e);

    // Secondary views get every second frame
    if (frameCounter % 2 == 0)
    {
        foreach (var view in secondaryVideoViews)
        {
            Task.Run(() => view.RenderFrameScaled(e, 0.5)); // Scaled down by 50%
        }
    }

    frameCounter++;
}
```

## Practical Example: Four-Camera Security System

Here's a more complete example of implementing a four-camera security system:

```cs
public partial class SecurityMonitorWindow : Window
{
    private List<VisioForge.Core.UI.WPF.VideoView> cameraViews = new List<VisioForge.Core.UI.WPF.VideoView>();
    private List<VideoCapture> cameras = new List<VideoCapture>();

    public SecurityMonitorWindow()
    {
        InitializeComponent();

        // Set up 2x2 grid of camera views
        for (int row = 0; row < 2; row++)
        {
            for (int col = 0; col < 2; col++)
            {
                var view = new VisioForge.Core.UI.WPF.VideoView();
                Grid.SetRow(view, row);
                Grid.SetColumn(view, col);
                mainGrid.Children.Add(view);
                cameraViews.Add(view);

                // Create and configure camera
                var camera = new VideoCapture();
                camera.OnVideoFrameBuffer += (s, e) => view.RenderFrame(e);
                cameras.Add(camera);
            }
        }
    }

    public async Task StartCamerasAsync()
    {
        for (int i = 0; i < cameras.Count; i++)
        {
            cameras[i].VideoSource = VideoSource.CameraSource;
            cameras[i].CameraDevice = new CameraDevice(i); // Assuming cameras are indexed 0-3
            await cameras[i].StartAsync();
        }
    }
}
```

## Troubleshooting Common Issues

### Handling Frame Synchronization

If you experience frame timing issues across multiple displays:

```cs
private readonly object syncLock = new object();

private void VideoCapture1_OnVideoFrameBuffer(object sender, VideoFrameBufferEventArgs e)
{
    lock (syncLock)
    {
        foreach (var view in videoViews)
        {
            view.RenderFrame(e);
        }
    }
}
```

---

For more code samples and advanced implementation techniques, visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples).
---END OF PAGE--- # Local File: .\dotnet\general\code-samples\onvideoframebitmap-usage.md --- title: Mastering OnVideoFrameBitmap in .NET Video Processing description: Learn how to manipulate video frames in real-time with OnVideoFrameBitmap events in .NET applications. This detailed guide provides practical code examples, performance tips, and advanced techniques for C# developers working with video processing in .NET SDK environments. sidebar_label: OnVideoFrameBitmap Event Usage --- # Mastering Real-Time Video Frame Manipulation with OnVideoFrameBitmap [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) The `OnVideoFrameBitmap` event is a powerful feature in .NET video processing libraries that allows developers to access and modify video frames in real-time. This guide explores the practical applications, implementation techniques, and performance considerations when working with bitmap frame manipulation in C# applications. ## Understanding OnVideoFrameBitmap Events The `OnVideoFrameBitmap` event provides a direct interface to access video frames as they're processed by the SDK. This capability is essential for applications that require: - Real-time video analysis - Frame-by-frame manipulation - Dynamic overlay implementation - Custom video effects - Computer vision integration When the event fires, it delivers a bitmap representation of the current video frame, allowing for pixel-level access and manipulation before the frame continues through the processing pipeline. 
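Pixel-level access ultimately means indexing into the frame's raw byte buffer; for a 32bpp BGRA frame, the byte offset of any pixel follows directly from the row stride. A small sketch of that arithmetic (the helper name is ours, used for illustration only):

```csharp
using System;

// Byte offset of pixel (x, y) in a 32bpp buffer with the given row stride.
// BGRA byte order: offset+0 = Blue, +1 = Green, +2 = Red, +3 = Alpha.
static int PixelOffset(int x, int y, int stride) => y * stride + x * 4;

Console.WriteLine(PixelOffset(10, 2, 7680)); // 15400: third row, eleventh pixel
```

The `LockBits`-based filter example later in this guide walks the buffer in exactly these 4-byte steps.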
## Basic Implementation To begin working with the `OnVideoFrameBitmap` event, you'll need to subscribe to it in your code: ```csharp // Subscribe to the OnVideoFrameBitmap event videoProcessor.OnVideoFrameBitmap += VideoProcessor_OnVideoFrameBitmap; // Implement the event handler private void VideoProcessor_OnVideoFrameBitmap(object sender, VideoFrameBitmapEventArgs e) { // Frame manipulation code will go here // e.Frame contains the current frame as a Bitmap } ``` ## Manipulating Video Frames ### Simple Bitmap Overlay Example The following example demonstrates how to overlay an image on each video frame: ```csharp Bitmap bmp = new Bitmap(@"c:\samples\pics\1.jpg"); using (Graphics g = Graphics.FromImage(e.Frame)) { g.DrawImage(bmp, 0, 0, bmp.Width, bmp.Height); e.UpdateData = true; } bmp.Dispose(); ``` In this code: 1. We create a `Bitmap` object from an image file 2. We use the `Graphics` class to draw onto the frame bitmap 3. We set `e.UpdateData = true` to inform the SDK that we've modified the frame 4. We dispose of our resources properly to prevent memory leaks > **Important:** Always set `e.UpdateData = true` when you modify the frame bitmap. This signals the SDK to use your modified frame instead of the original. ### Adding Text Overlays Text overlays are commonly used for timestamps, captions, or informational displays: ```csharp using (Graphics g = Graphics.FromImage(e.Frame)) { // Create a semi-transparent background for text using (SolidBrush brush = new SolidBrush(Color.FromArgb(150, 0, 0, 0))) { g.FillRectangle(brush, 10, 10, 200, 30); } // Add text overlay using (Font font = new Font("Arial", 12)) using (SolidBrush textBrush = new SolidBrush(Color.White)) { g.DrawString(DateTime.Now.ToString(), font, textBrush, new PointF(15, 15)); } e.UpdateData = true; } ``` ## Performance Considerations When working with `OnVideoFrameBitmap`, it's crucial to optimize your code for performance. 
Each frame processing operation must complete quickly to maintain smooth video playback. ### Resource Management Proper resource management is essential: ```csharp // Poor performance approach private void VideoProcessor_OnVideoFrameBitmap(object sender, VideoFrameBitmapEventArgs e) { Bitmap overlay = new Bitmap(@"c:\logo.png"); Graphics g = Graphics.FromImage(e.Frame); g.DrawImage(overlay, 0, 0); e.UpdateData = true; // Memory leak! Graphics and Bitmap not disposed } // Optimized approach private Bitmap _cachedOverlay; private void InitializeResources() { _cachedOverlay = new Bitmap(@"c:\logo.png"); } private void VideoProcessor_OnVideoFrameBitmap(object sender, VideoFrameBitmapEventArgs e) { using (Graphics g = Graphics.FromImage(e.Frame)) { g.DrawImage(_cachedOverlay, 0, 0); e.UpdateData = true; } } private void CleanupResources() { _cachedOverlay?.Dispose(); } ``` ### Optimizing Processing Time To maintain smooth video playback: 1. **Pre-compute where possible**: Prepare resources before processing begins 2. **Cache frequently used objects**: Avoid creating new objects for each frame 3. **Process only when necessary**: Add conditional logic to skip frames or perform less intensive operations when needed 4. 
**Use efficient drawing operations**: Choose appropriate GDI+ methods based on your needs

```csharp
private int _frameCounter;

private void VideoProcessor_OnVideoFrameBitmap(object sender, VideoFrameBitmapEventArgs e)
{
    // Only process every second frame
    if (_frameCounter % 2 == 0)
    {
        using (Graphics g = Graphics.FromImage(e.Frame))
        {
            // Your frame processing code
            e.UpdateData = true;
        }
    }

    _frameCounter++;
}
```

## Advanced Frame Manipulation Techniques

### Applying Filters and Effects

You can implement custom image processing filters:

```csharp
// Requires: using System.Drawing.Imaging; and using System.Runtime.InteropServices;
private void ApplyGrayscaleFilter(Bitmap bitmap)
{
    Rectangle rect = new Rectangle(0, 0, bitmap.Width, bitmap.Height);

    // Lock as 32bpp ARGB so the 4-bytes-per-pixel loop below is always valid,
    // regardless of the bitmap's native pixel format
    BitmapData bmpData = bitmap.LockBits(rect, ImageLockMode.ReadWrite, PixelFormat.Format32bppArgb);

    IntPtr ptr = bmpData.Scan0;
    int bytes = Math.Abs(bmpData.Stride) * bitmap.Height;
    byte[] rgbValues = new byte[bytes];

    Marshal.Copy(ptr, rgbValues, 0, bytes);

    // Process pixel data (bytes are in BGRA order)
    for (int i = 0; i < rgbValues.Length; i += 4)
    {
        byte gray = (byte)(0.299 * rgbValues[i + 2] + 0.587 * rgbValues[i + 1] + 0.114 * rgbValues[i]);
        rgbValues[i] = gray;     // Blue
        rgbValues[i + 1] = gray; // Green
        rgbValues[i + 2] = gray; // Red
    }

    Marshal.Copy(rgbValues, 0, ptr, bytes);
    bitmap.UnlockBits(bmpData);
}
```

## Integration with Computer Vision Libraries

The `OnVideoFrameBitmap` event can be combined with popular computer vision libraries:

```csharp
// Example using a hypothetical computer vision library
private void VideoProcessor_OnVideoFrameBitmap(object sender, VideoFrameBitmapEventArgs e)
{
    // Convert bitmap to format needed by CV library
    byte[] imageData = ConvertBitmapToByteArray(e.Frame);

    // Process with CV library
    var results = _computerVisionProcessor.DetectFaces(imageData, e.Frame.Width, e.Frame.Height);

    // Draw results back onto frame
    using (Graphics g = Graphics.FromImage(e.Frame))
    {
        foreach (var face in results)
        {
            g.DrawRectangle(new Pen(Color.Yellow, 2), face.X, face.Y, face.Width, face.Height);
        }

        e.UpdateData = true;
    }
}
```

## Troubleshooting Common Issues

### Memory Leaks

If you experience memory growth during prolonged video processing:

1. Ensure all `Graphics` objects are disposed
2. Properly dispose of any temporary `Bitmap` objects
3. Avoid capturing large objects in lambda expressions

### Performance Degradation

If frame processing becomes sluggish:

1. Profile your event handler to identify bottlenecks
2. Consider reducing processing frequency
3. Optimize GDI+ operations or consider DirectX for performance-critical applications

## SDK Integration

The `OnVideoFrameBitmap` event is available in the following SDKs:

- Video Capture SDK .Net
- Video Edit SDK .Net
- Media Player SDK .Net

## Required Dependencies

To use the functionality described in this guide, you'll need:

- SDK redistribution package
- System.Drawing (included in .NET Framework)
- Windows GDI+ support

---

Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples and projects demonstrating these techniques in action.

---END OF PAGE---

# Local File: .\dotnet\general\code-samples\read-file-info.md

---
title: Reading Media File Information in C# for Developers
description: Learn how to extract detailed information from video and audio files in C# with step-by-step code examples. Discover how to access codecs, resolution, frame rate, bitrate, and metadata tags for building robust media applications.
sidebar_label: Reading Media File Information
---

# Reading Media File Information in C#

[!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net)

## Introduction

Accessing detailed information embedded within media files is essential for developing sophisticated applications like media players, video editors, content management systems, and file analysis tools.
Understanding properties such as codecs, resolution, frame rate, bitrate, duration, and embedded tags allows developers to build more intelligent and user-friendly software. This guide demonstrates how to read comprehensive information from video and audio files using C# and the `MediaInfoReader` class. The techniques shown are applicable across various .NET projects and provide a foundation for handling media files programmatically. ## Why Extract Media File Information? Media file information serves multiple purposes in application development: - **User Experience**: Display technical details to users in media players - **Compatibility Checks**: Verify if files meet required specifications - **Automated Processing**: Configure encoding parameters based on source properties - **Content Organization**: Catalog media libraries with accurate metadata - **Quality Assessment**: Evaluate media files for potential issues ## Implementation Guide Let's explore the process of extracting media file information in a step-by-step approach. The examples assume a WinForms application with a `TextBox` control named `mmInfo` for displaying the extracted information. ### Step 1: Initialize the Media Information Reader The first step involves creating an instance of the `MediaInfoReader` class: ```csharp // Import the necessary namespace using VisioForge.Core.MediaInfo; // Namespace for MediaInfoReader using VisioForge.Core.Helpers; // Namespace for TagLibHelper (optional) // Create an instance of MediaInfoReader var infoReader = new MediaInfoReader(); ``` This initialization prepares the reader to process media files. 
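Before handing a path to the reader, a cheap extension-based check can skip obviously non-media files; the full `IsFilePlayable` probe shown in Step 2 opens the file and is more expensive. A sketch, with an extension allow-list of our own choosing (illustrative, not exhaustive):

```csharp
using System;
using System.Collections.Generic;
using System.IO;

// Quick pre-filter: does the path's extension look like a media file?
static bool IsLikelyMediaFile(string path)
{
    var mediaExtensions = new HashSet<string>(StringComparer.OrdinalIgnoreCase)
    {
        ".mp4", ".avi", ".mkv", ".mov", ".wmv", ".mp3", ".wav", ".aac"
    };

    return mediaExtensions.Contains(Path.GetExtension(path));
}

Console.WriteLine(IsLikelyMediaFile(@"clip.MP4"));  // True (comparison is case-insensitive)
Console.WriteLine(IsLikelyMediaFile(@"notes.txt")); // False
```

This is only a heuristic: an extension can lie, so treat it as a fast first pass and let `IsFilePlayable` make the final call.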
### Step 2: Verify File Playability (Optional) Before diving into detailed analysis, it's often useful to check if the file is supported: ```csharp // Define variables to hold potential error information FilePlaybackError errorCode; string errorText; // Specify the path to the media file string filename = @"C:\path\to\your\mediafile.mp4"; // Replace with your actual file path // Check if the file is playable if (MediaInfoReader.IsFilePlayable(filename, out errorCode, out errorText)) { // Display success message mmInfo.Text += "Status: This file appears to be playable." + Environment.NewLine; } else { // Display error message including the error code and description mmInfo.Text += $"Status: This file might not be playable. Error: {errorCode} - {errorText}" + Environment.NewLine; } mmInfo.Text += "------------------------------------" + Environment.NewLine; ``` This verification provides early feedback on file integrity and compatibility. ### Step 3: Extract Detailed Stream Information Now we can extract the rich metadata from the file: ```csharp try { // Assign the filename to the reader infoReader.Filename = filename; // Read the file information (true for full analysis) infoReader.ReadFileInfo(true); // Process Video Streams mmInfo.Text += $"Found {infoReader.VideoStreams.Count} video stream(s)." 
+ Environment.NewLine; for (int i = 0; i < infoReader.VideoStreams.Count; i++) { var stream = infoReader.VideoStreams[i]; mmInfo.Text += Environment.NewLine; mmInfo.Text += $"--- Video Stream #{i + 1} ---" + Environment.NewLine; mmInfo.Text += $" Codec: {stream.Codec}" + Environment.NewLine; mmInfo.Text += $" Duration: {stream.Duration}" + Environment.NewLine; mmInfo.Text += $" Dimensions: {stream.Width}x{stream.Height}" + Environment.NewLine; mmInfo.Text += $" FOURCC: {stream.FourCC}" + Environment.NewLine; if (stream.AspectRatio != null && stream.AspectRatio.Item1 > 0 && stream.AspectRatio.Item2 > 0) { mmInfo.Text += $" Aspect Ratio: {stream.AspectRatio.Item1}:{stream.AspectRatio.Item2}" + Environment.NewLine; } mmInfo.Text += $" Frame Rate: {stream.FrameRate:F2} fps" + Environment.NewLine; mmInfo.Text += $" Bitrate: {stream.Bitrate / 1000.0:F0} kbps" + Environment.NewLine; mmInfo.Text += $" Frames Count: {stream.FramesCount}" + Environment.NewLine; } // Process Audio Streams mmInfo.Text += Environment.NewLine; mmInfo.Text += $"Found {infoReader.AudioStreams.Count} audio stream(s)." 
+ Environment.NewLine; for (int i = 0; i < infoReader.AudioStreams.Count; i++) { var stream = infoReader.AudioStreams[i]; mmInfo.Text += Environment.NewLine; mmInfo.Text += $"--- Audio Stream #{i + 1} ---" + Environment.NewLine; mmInfo.Text += $" Codec: {stream.Codec}" + Environment.NewLine; mmInfo.Text += $" Codec Info: {stream.CodecInfo}" + Environment.NewLine; mmInfo.Text += $" Duration: {stream.Duration}" + Environment.NewLine; mmInfo.Text += $" Bitrate: {stream.Bitrate / 1000.0:F0} kbps" + Environment.NewLine; mmInfo.Text += $" Channels: {stream.Channels}" + Environment.NewLine; mmInfo.Text += $" Sample Rate: {stream.SampleRate} Hz" + Environment.NewLine; mmInfo.Text += $" Bits Per Sample (BPS): {stream.BPS}" + Environment.NewLine; mmInfo.Text += $" Language: {stream.Language}" + Environment.NewLine; } // Process Subtitle Streams mmInfo.Text += Environment.NewLine; mmInfo.Text += $"Found {infoReader.Subtitles.Count} subtitle stream(s)." + Environment.NewLine; for (int i = 0; i < infoReader.Subtitles.Count; i++) { var stream = infoReader.Subtitles[i]; mmInfo.Text += Environment.NewLine; mmInfo.Text += $"--- Subtitle Stream #{i + 1} ---" + Environment.NewLine; mmInfo.Text += $" Codec/Format: {stream.Codec}" + Environment.NewLine; mmInfo.Text += $" Name: {stream.Name}" + Environment.NewLine; mmInfo.Text += $" Language: {stream.Language}" + Environment.NewLine; } } catch (Exception ex) { // Handle potential errors during file reading mmInfo.Text += $"{Environment.NewLine}Error reading file info: {ex.Message}{Environment.NewLine}"; } finally { // Important: Dispose the reader to release file handles and resources infoReader.Dispose(); } ``` The code iterates through each collection (`VideoStreams`, `AudioStreams`, and `Subtitles`), extracting and displaying relevant information for every stream found. 
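The repeated string building in the loops above can be factored into small formatting helpers; here is a sketch (the helper names are ours, not part of `MediaInfoReader`):

```csharp
using System;

// Format a bitrate in bits/s the same way as the "Bitrate" lines above.
static string FormatBitrate(long bitsPerSecond) =>
    $"{bitsPerSecond / 1000.0:F0} kbps";

// Format video dimensions the same way as the "Dimensions" line above.
static string FormatDimensions(int width, int height) =>
    $"{width}x{height}";

Console.WriteLine(FormatBitrate(4_500_000));    // 4500 kbps
Console.WriteLine(FormatDimensions(1280, 720)); // 1280x720
```

Centralizing the formatting keeps the three stream loops consistent and makes it trivial to change units (for example, switching to Mbps) in one place.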
### Step 4: Extract Metadata Tags Beyond technical stream information, media files often contain metadata tags: ```csharp // Read Metadata Tags mmInfo.Text += Environment.NewLine + "--- Metadata Tags ---" + Environment.NewLine; try { // Use TagLibHelper to read tags from the file var tags = TagLibHelper.ReadTags(filename); // Check if tags were successfully read if (tags != null) { mmInfo.Text += $"Title: {tags.Title}" + Environment.NewLine; mmInfo.Text += $"Artist(s): {string.Join(", ", tags.Performers ?? new string[0])}" + Environment.NewLine; mmInfo.Text += $"Album: {tags.Album}" + Environment.NewLine; mmInfo.Text += $"Year: {tags.Year}" + Environment.NewLine; mmInfo.Text += $"Genre: {string.Join(", ", tags.Genres ?? new string[0])}" + Environment.NewLine; mmInfo.Text += $"Comment: {tags.Comment}" + Environment.NewLine; } else { mmInfo.Text += "No standard metadata tags found or readable." + Environment.NewLine; } } catch (Exception ex) { // Handle errors during tag reading mmInfo.Text += $"Error reading tags: {ex.Message}" + Environment.NewLine; } ``` ## Best Practices for Media File Analysis When implementing media file analysis in your applications, consider these best practices: ### Error Handling Always wrap file operations in appropriate try-catch blocks. Media files can be corrupted, inaccessible, or in unexpected formats, which might cause exceptions. ```csharp try { // Media file operations } catch (Exception ex) { // Log error and provide user feedback } ``` ### Resource Management Properly dispose of objects that access file resources to prevent file locking issues: ```csharp using (var infoReader = new MediaInfoReader()) { // Use the reader } // Or manually in a finally block try { // Operations } finally { infoReader.Dispose(); } ``` ### Performance Considerations For large media libraries, consider: 1. Implementing caching mechanisms for repeated analysis 2. Using background threads for processing to keep UI responsive 3. 
Limiting the depth of analysis for initial quick scans ## Required Components For successful implementation, ensure your project includes the necessary dependencies as specified in the SDK documentation. ## Conclusion Extracting information from media files is a powerful capability for developers building applications that work with audio and video content. With the techniques outlined in this guide, you can access detailed technical properties and metadata tags to enhance your application's functionality. The `MediaInfoReader` class provides a convenient and efficient way to extract the necessary metadata, allowing you to build more sophisticated media handling features in your C# applications. For more advanced scenarios, explore the full capabilities of the SDK and consult the detailed documentation. You can find additional code samples and examples on GitHub to further expand your media file processing capabilities. ---END OF PAGE--- # Local File: .\dotnet\general\code-samples\select-video-renderer-winforms.md --- title: Video Renderer Selection Guide for .NET Applications description: Learn how to implement and optimize video renderers in .NET applications using DirectShow-based SDK engines. This in-depth guide covers VideoRenderer, VMR9, and EVR with practical code examples for WinForms development. 
sidebar_label: Select Video Renderer (WinForms) --- # Video Renderer Selection Guide for WinForms Applications [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) ## Introduction to Video Rendering in .NET When developing multimedia applications in .NET, selecting the appropriate video renderer is crucial for optimal performance and compatibility. This guide focuses on DirectShow-based SDK engines: VideoCaptureCore, VideoEditCore, and MediaPlayerCore, which share the same API across all SDKs. Video renderers serve as the bridge between your application and the display hardware, determining how video content is processed and presented to the user. The right choice can significantly impact performance, visual quality, and hardware resource utilization. ## Understanding Available Video Renderer Options DirectShow in Windows offers three primary renderer options, each with distinct characteristics and use cases. Let's explore each renderer in detail to help you make an informed decision for your application. ### Legacy Video Renderer (GDI-based) The Video Renderer is the oldest option in the DirectShow ecosystem. It relies on GDI (Graphics Device Interface) for drawing operations. 
**Key characteristics:** - Software-based rendering without hardware acceleration - Compatible with older systems and configurations - Lower performance ceiling compared to modern alternatives - Simple implementation with minimal configuration options **Implementation example:** ```cs VideoCapture1.Video_Renderer.VideoRenderer = VideoRendererMode.VideoRenderer; ``` **When to use:** - Compatibility is the primary concern - Application targets older hardware or operating systems - Minimal video processing requirements - Troubleshooting issues with newer renderers ### Video Mixing Renderer 9 (VMR9) VMR9 represents a significant improvement over the legacy renderer, introducing support for hardware acceleration and advanced features. **Key characteristics:** - Hardware-accelerated rendering through DirectX 9 - Support for multiple video streams mixing - Advanced deinterlacing options - Alpha blending and compositing capabilities - Custom video effects processing **Implementation example:** ```cs VideoCapture1.Video_Renderer.VideoRenderer = VideoRendererMode.VMR9; ``` **When to use:** - Modern applications requiring good performance - Video editing or composition features are needed - Multiple video stream scenarios - Applications that need to balance performance and compatibility ### Enhanced Video Renderer (EVR) EVR is the most advanced option, available in Windows Vista and later operating systems. It leverages the Media Foundation framework rather than pure DirectShow. 
**Key characteristics:** - Latest hardware acceleration technologies - Superior video quality and performance - Enhanced color space processing - Better multi-monitor support - More efficient CPU usage - Improved synchronization mechanisms **Implementation example:** ```cs VideoCapture1.Video_Renderer.VideoRenderer = VideoRendererMode.EVR; ``` **When to use:** - Modern applications targeting Windows Vista or later - Maximum performance and quality are required - Applications handling HD or 4K content - When advanced synchronization is important - Multiple display environments ## Advanced Configuration Options Beyond just selecting a renderer, the SDK provides various configuration options to fine-tune video presentation. ### Working with Deinterlacing Modes When displaying interlaced video content (common in broadcast sources), proper deinterlacing improves visual quality significantly. The SDK supports various deinterlacing algorithms depending on the renderer chosen. First, retrieve the available deinterlacing modes: ```cs VideoCapture1.Video_Renderer_Deinterlace_Modes_Fill(); // Populate a dropdown with available modes foreach (string deinterlaceMode in VideoCapture1.Video_Renderer_Deinterlace_Modes()) { cbDeinterlaceModes.Items.Add(deinterlaceMode); } ``` Then apply a selected deinterlacing mode: ```cs // Assuming the user selected a mode from cbDeinterlaceModes string selectedMode = cbDeinterlaceModes.SelectedItem.ToString(); VideoCapture1.Video_Renderer.DeinterlaceMode = selectedMode; VideoCapture1.Video_Renderer_Update(); ``` VMR9 and EVR support various deinterlacing algorithms including: - Bob (simple line doubling) - Weave (field interleaving) - Motion adaptive - Motion compensated (highest quality) The availability of specific algorithms depends on the video card capabilities and driver implementation. 
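To build intuition for what these algorithms do, the simplest one, bob, can be sketched without any SDK dependency. The code below is purely illustrative (the renderers perform this in hardware) and assumes an 8-bit grayscale frame buffer:

```cs
// Illustrative only: bob deinterlacing keeps one field (every other line)
// and doubles it to fill the frame. Assumes an 8-bit grayscale buffer.
static byte[] BobDeinterlace(byte[] frame, int width, int height)
{
    var result = new byte[frame.Length];
    for (int y = 0; y < height; y += 2)
    {
        // Copy the even (top) field line...
        Array.Copy(frame, y * width, result, y * width, width);

        // ...and duplicate it into the following odd line.
        int next = Math.Min(y + 1, height - 1);
        Array.Copy(frame, y * width, result, next * width, width);
    }

    return result;
}
```

Weave, motion adaptive, and motion compensated modes trade more computation for fewer artifacts; the renderer picks from what the GPU driver exposes.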
### Managing Aspect Ratio and Stretch Modes When displaying video in a window or control that doesn't match the source's native aspect ratio, you need to decide how to handle this discrepancy. The SDK provides multiple stretch modes to address different scenarios. #### Stretch Mode This mode stretches the video to fill the entire display area, potentially distorting the image: ```cs VideoCapture1.Video_Renderer.StretchMode = VideoRendererStretchMode.Stretch; VideoCapture1.Video_Renderer_Update(); ``` **Use cases:** - When aspect ratio is not critical - Filling the entire display area is more important than proportions - Source and display have similar aspect ratios - User interface constraints require full-area usage #### Letterbox Mode This mode preserves the original aspect ratio by adding black borders as needed: ```cs VideoCapture1.Video_Renderer.StretchMode = VideoRendererStretchMode.Letterbox; VideoCapture1.Video_Renderer_Update(); ``` **Use cases:** - Maintaining correct proportions is essential - Professional video applications - Content where distortion would be noticeable or problematic - Cinema or broadcast content viewing #### Crop Mode This mode fills the display area while preserving aspect ratio, potentially cropping some content: ```cs VideoCapture1.Video_Renderer.StretchMode = VideoRendererStretchMode.Crop; VideoCapture1.Video_Renderer_Update(); ``` **Use cases:** - Consumer video applications where filling the screen is preferred - Content where edges are less important than center - Social media-style video display - When trying to eliminate letterboxing in already letterboxed content ### Performance Optimization Techniques #### Adjusting Buffer Count For smoother playback, especially with high-resolution content, adjusting the buffer count can help: ```cs // Increase buffer count for smoother playback VideoCapture1.Video_Renderer.BuffersCount = 3; VideoCapture1.Video_Renderer_Update(); ``` #### Enabling Hardware Acceleration Ensure hardware 
acceleration is enabled for maximum performance: ```cs // For VMR9 VideoCapture1.Video_Renderer.VMR9.UseOverlays = true; VideoCapture1.Video_Renderer.VMR9.UseDynamicTextures = true; // For EVR VideoCapture1.Video_Renderer.EVR.EnableHardwareTransforms = true; VideoCapture1.Video_Renderer_Update(); ``` ## Troubleshooting Common Issues ### Renderer Compatibility Problems If you encounter issues with a specific renderer, try falling back to a more compatible option: ```cs try { // Try using EVR first VideoCapture1.Video_Renderer.VideoRenderer = VideoRendererMode.EVR; VideoCapture1.Video_Renderer_Update(); } catch { try { // Fall back to VMR9 VideoCapture1.Video_Renderer.VideoRenderer = VideoRendererMode.VMR9; VideoCapture1.Video_Renderer_Update(); } catch { // Last resort - legacy renderer VideoCapture1.Video_Renderer.VideoRenderer = VideoRendererMode.VideoRenderer; VideoCapture1.Video_Renderer_Update(); } } ``` ### Display Issues on Multi-Monitor Systems For applications that might run on multi-monitor setups, additional configuration might be necessary: ```cs // Specify which monitor to use for full-screen mode VideoCapture1.Video_Renderer.MonitorIndex = 0; // Primary monitor VideoCapture1.Video_Renderer_Update(); ``` ## Best Practices and Recommendations 1. **Choose the right renderer for your target environment**: - For modern Windows: EVR - For broad compatibility: VMR9 - For legacy systems: Video Renderer 2. **Test on various hardware configurations**: Video rendering can behave differently across GPU vendors and driver versions. 3. **Implement renderer fallback logic**: Always have a backup plan if the preferred renderer fails. 4. **Consider your video content**: Higher resolution or interlaced content will benefit more from advanced renderers. 5. **Balance quality vs. performance**: The highest quality settings might not always deliver the best user experience if they impact performance. 
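The first recommendation can be captured in a small helper. This is a sketch, not an SDK API: it assumes the `VideoRendererMode` enum shown earlier and uses the standard Windows version threshold (NT 6.0, i.e. Vista) for EVR availability:

```cs
// Sketch: map a Windows version to the recommended renderer, following the
// guidance above. VideoRendererMode is the SDK enum used in earlier samples.
static VideoRendererMode RecommendRenderer(Version osVersion)
{
    // Windows Vista (NT 6.0) and later ship the Enhanced Video Renderer
    if (osVersion.Major >= 6)
    {
        return VideoRendererMode.EVR;
    }

    // Pre-Vista systems: VMR9 offers hardware acceleration through DirectX 9
    return VideoRendererMode.VMR9;
}

// Usage:
// VideoCapture1.Video_Renderer.VideoRenderer = RecommendRenderer(Environment.OSVersion.Version);
// VideoCapture1.Video_Renderer_Update();
```

Combine this with the try/catch fallback logic shown above so a driver-level failure still degrades gracefully to the legacy renderer.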
## Required Dependencies To ensure proper functionality of these renderers, make sure to include: - SDK redistributable packages - DirectX End-User Runtime (latest version recommended) - .NET Framework runtime appropriate for your application ## Conclusion Selecting and configuring the right video renderer is an important decision in developing high-quality multimedia applications. By understanding the strengths and limitations of each renderer option, you can significantly improve the user experience of your WinForms applications. The optimal choice depends on your specific requirements, target audience, and the nature of your video content. In most modern applications, EVR should be your first choice, with VMR9 as a reliable fallback option. --- Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples. ---END OF PAGE--- # Local File: .\dotnet\general\code-samples\text-onvideoframebuffer.md --- title: Text Overlay Implementation with OnVideoFrameBuffer description: Learn how to create custom text overlays in video applications using the OnVideoFrameBuffer event in .NET video processing. This detailed guide with C# code examples shows you how to implement dynamic text elements on video frames for professional applications. 
sidebar_label: Draw Text Overlay Using OnVideoFrameBuffer Event --- # Creating Custom Text Overlays with OnVideoFrameBuffer in .NET [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) ## Introduction to Text Overlays in Video Processing Adding text overlays to video content is a common requirement in many professional applications, from video editing software to security camera feeds, broadcasting tools, and educational applications. While the standard video effect APIs provide basic text overlay capabilities, developers often need more control over how text appears on video frames. This guide demonstrates how to manually implement custom text overlays using the OnVideoFrameBuffer event available in VideoCaptureCore, VideoEditCore, and MediaPlayerCore engines. By intercepting video frames during processing, you can apply custom text and graphics with precise control over positioning, formatting, and animation. ## Understanding the OnVideoFrameBuffer Event The OnVideoFrameBuffer event is a powerful hook that gives developers direct access to the video frame buffer during processing. This event fires for each frame of video, providing an opportunity to modify the frame data before it's displayed or encoded. 
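The samples below assume the handler is already attached to the engine. A typical subscription, using the `VideoCapture1` instance naming from the rest of this guide, might look like:

```cs
// Subscribe once, before starting capture or playback, so every frame
// passes through the handler. The handler is defined in the sample below.
VideoCapture1.OnVideoFrameBuffer += SDK_OnVideoFrameBuffer;

// ... start capture or playback ...

// Unsubscribe when the overlay is no longer needed
VideoCapture1.OnVideoFrameBuffer -= SDK_OnVideoFrameBuffer;
```

Keep the handler lightweight: it runs on the streaming thread, so any delay inside it directly affects frame delivery.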
Key benefits of using OnVideoFrameBuffer for text overlays include: - **Frame-level access**: Modify individual frames with pixel-perfect precision - **Dynamic content**: Update text based on real-time data or timestamps - **Custom styling**: Apply custom fonts, colors, and effects beyond what built-in APIs offer - **Performance optimizations**: Implement efficient rendering techniques for high-performance applications ## Implementation Overview The technique presented here uses the following components: 1. An event handler for OnVideoFrameBuffer that processes each video frame 2. A VideoEffectTextLogo object to define text properties 3. The FastImageProcessing API to render text onto the frame buffer This approach is particularly useful when you need to: - Display dynamic data like timestamps, metadata, or sensor readings - Create animated text effects - Position text with pixel-perfect accuracy - Apply custom styling not available through standard APIs ## Sample Code Implementation The following C# example demonstrates how to implement a basic text overlay system using the OnVideoFrameBuffer event: ```cs private void SDK_OnVideoFrameBuffer(object sender, VideoFrameBufferEventArgs e) { if (!logoInitiated) { logoInitiated = true; InitTextLogo(); } FastImageProcessing.AddTextLogo(null, e.Frame.Data, e.Frame.Width, e.Frame.Height, ref textLogo, e.Timestamp, 0); } private bool logoInitiated = false; private VideoEffectTextLogo textLogo = null; private void InitTextLogo() { textLogo = new VideoEffectTextLogo(true); textLogo.Text = "Hello world!"; textLogo.Left = 50; textLogo.Top = 50; } ``` ## Detailed Code Explanation Let's break down the key components of this implementation: ### The Event Handler ```cs private void SDK_OnVideoFrameBuffer(object sender, VideoFrameBufferEventArgs e) ``` This method is triggered for each video frame. 
The VideoFrameBufferEventArgs provides access to: - Frame data (pixel buffer) - Frame dimensions (width and height) - Timestamp information ### Initialization Logic ```cs if (!logoInitiated) { logoInitiated = true; InitTextLogo(); } ``` This code ensures the text logo is only initialized once, preventing unnecessary object creation for each frame. This pattern is important for performance when processing video at high frame rates. ### Text Logo Setup ```cs private void InitTextLogo() { textLogo = new VideoEffectTextLogo(true); textLogo.Text = "Hello world!"; textLogo.Left = 50; textLogo.Top = 50; } ``` The VideoEffectTextLogo class is used to define the properties of the text overlay: - The text content ("Hello world!") - Position coordinates (50 pixels from both left and top) ### Rendering the Text Overlay ```cs FastImageProcessing.AddTextLogo(null, e.Frame.Data, e.Frame.Width, e.Frame.Height, ref textLogo, e.Timestamp, 0); ``` This line does the actual work of rendering the text onto the frame: - It takes the frame data buffer as input - Uses the frame dimensions to properly position the text - References the textLogo object containing text properties - Can utilize the timestamp for dynamic content ## Advanced Customization Options While the basic example demonstrates a simple static text overlay, the VideoEffectTextLogo class supports numerous customization options: ### Text Formatting ```cs textLogo.FontName = "Arial"; textLogo.FontSize = 24; textLogo.FontBold = true; textLogo.FontItalic = false; textLogo.Color = System.Drawing.Color.White; textLogo.Opacity = 0.8f; ``` ### Background and Borders ```cs textLogo.BackgroundEnabled = true; textLogo.BackgroundColor = System.Drawing.Color.Black; textLogo.BackgroundOpacity = 0.5f; textLogo.BorderEnabled = true; textLogo.BorderColor = System.Drawing.Color.Yellow; textLogo.BorderThickness = 2; ``` ### Animation and Dynamic Content For dynamic content that changes per frame: ```cs private void 
SDK_OnVideoFrameBuffer(object sender, VideoFrameBufferEventArgs e)
{
    if (!logoInitiated)
    {
        logoInitiated = true;
        InitTextLogo();
    }

    // Update text based on timestamp
    textLogo.Text = $"Timestamp: {e.Timestamp.ToString("HH:mm:ss.fff")}";

    // Animate position
    textLogo.Left = 50 + (int)(Math.Sin(e.Timestamp.TotalSeconds) * 50);

    FastImageProcessing.AddTextLogo(null, e.Frame.Data, e.Frame.Width, e.Frame.Height, ref textLogo, e.Timestamp, 0);
}
```

## Performance Considerations

When implementing custom text overlays, consider these performance best practices:

1. **Initialize objects once**: Create the VideoEffectTextLogo object only once, not per frame
2. **Minimize text changes**: Update text content only when necessary
3. **Use efficient fonts**: Simple fonts render faster than complex ones
4. **Consider resolution**: Higher resolution videos require more processing power
5. **Test on target hardware**: Ensure your implementation performs well on production systems

## Multiple Text Elements

To display multiple text elements on the same frame:

```cs
private VideoEffectTextLogo titleLogo = null;
private VideoEffectTextLogo timestampLogo = null;
private bool logosInitiated = false;

private void InitTextLogos()
{
    titleLogo = new VideoEffectTextLogo(true);
    titleLogo.Text = "Camera Feed";
    titleLogo.Left = 50;
    titleLogo.Top = 50;

    timestampLogo = new VideoEffectTextLogo(true);
    timestampLogo.Left = 50;
    timestampLogo.Top = 100;
}

private void SDK_OnVideoFrameBuffer(object sender, VideoFrameBufferEventArgs e)
{
    if (!logosInitiated)
    {
        logosInitiated = true;
        InitTextLogos();
    }

    // Update dynamic content
    timestampLogo.Text = e.Timestamp.ToString("yyyy-MM-dd HH:mm:ss.fff");

    // Render both text elements
    FastImageProcessing.AddTextLogo(null, e.Frame.Data, e.Frame.Width, e.Frame.Height, ref titleLogo, e.Timestamp, 0);
    FastImageProcessing.AddTextLogo(null, e.Frame.Data, e.Frame.Width, e.Frame.Height, ref timestampLogo, e.Timestamp, 0);
}
```

## Required Components

To implement this solution, you'll need:

- SDK redist package
installed in your application - Reference to the appropriate SDK (.NET Video Capture, Video Edit, or Media Player) - Basic understanding of video frame processing concepts ## Conclusion The OnVideoFrameBuffer event provides a powerful mechanism for implementing custom text overlays in video applications. By directly accessing the frame buffer, developers can create sophisticated text effects with precise control over appearance and behavior. This approach is particularly valuable when standard text overlay APIs don't provide the flexibility or features required for your application. With the techniques demonstrated in this guide, you can implement professional-quality text overlays for a wide range of video processing scenarios. --- Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples. ---END OF PAGE--- # Local File: .\dotnet\general\code-samples\uninstall-directshow-filter.md --- title: Uninstalling DirectShow Filters in Windows Applications description: Learn how to properly uninstall DirectShow filters from your system using multiple methods. This guide explains manual uninstallation techniques, troubleshooting steps, and best practices for .NET developers working with multimedia applications. sidebar_label: Uninstall DirectShow Filters --- # Uninstalling DirectShow Filters in Windows Applications [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) DirectShow filters are essential components for multimedia applications in Windows environments. They enable software to process audio and video data efficiently. 
However, there may be situations where you need to uninstall these filters, such as when upgrading your application, resolving conflicts, or completely removing a software package. This guide provides detailed instructions on how to properly uninstall DirectShow filters from your system. ## Understanding DirectShow Filters DirectShow is a multimedia framework and API designed by Microsoft for software developers to perform various operations with media files. It's built on the Component Object Model (COM) architecture and uses a modular approach where each processing step is handled by a separate component called a filter. Filters are categorized into three main types: - **Source filters**: Read data from files, capture devices, or network streams - **Transform filters**: Process or modify the data (compression, decompression, effects) - **Rendering filters**: Display video or play audio When SDK components are installed, they register DirectShow filters in the Windows Registry, making them available to any application that uses the DirectShow framework. ## Why Uninstall DirectShow Filters? There are several reasons why you might need to uninstall DirectShow filters: 1. **Version conflicts**: Newer versions of the SDK might require removing older filters 2. **System cleanup**: Removing unused components to maintain system efficiency 3. **Troubleshooting**: Resolving issues with multimedia applications 4. **Complete software removal**: Ensuring no components remain after uninstalling the main application 5. **Re-registration**: Sometimes uninstalling and reinstalling filters can resolve registration issues ## Methods for Uninstalling DirectShow Filters ### Method 1: Using the SDK Installer (Recommended) The most straightforward way to uninstall DirectShow filters is through the SDK (or redist) installer itself. SDK packages include uninstallation routines that properly remove all components, including DirectShow filters. 
### Method 2: Manual Unregistration with regsvr32 If automatic uninstallation isn't possible or you need to unregister specific filters, you can use the `regsvr32` command-line tool: 1. Open Command Prompt as Administrator (right-click on Command Prompt and select "Run as administrator") 2. Use the following command syntax to unregister a filter: ```cmd regsvr32 /u "C:\path\to\filter.dll" ``` 3. Replace `C:\path\to\filter.dll` with the actual path to the DirectShow filter file 4. Press Enter to execute the command For example, to unregister a filter located at `C:\Program Files\Common Files\FilterFolder\example_filter.dll`, you would use: ```cmd regsvr32 /u "C:\Program Files\Common Files\FilterFolder\example_filter.dll" ``` You should see a confirmation dialog indicating successful unregistration. ## Finding DirectShow Filter Locations Before you can manually unregister filters, you need to know their locations. Here are several methods to find installed DirectShow filters: ### Using GraphStudio [GraphStudio](https://github.com/cplussharp/graph-studio-next) is a powerful open-source tool for working with DirectShow filters. To find filter locations: 1. Download and install GraphStudio 2. Launch the application with administrator privileges 3. Go to "Graph > Insert Filters" 4. Browse through the list of installed filters 5. Right-click on a filter and select "Properties" 6. Note the "File:" path shown in the properties dialog This method provides the exact file path needed for manual unregistration. ### Using System Registry You can also find DirectShow filters through the Windows Registry: 1. Press `Win + R` to open the Run dialog 2. Type `regedit` and press Enter to open Registry Editor 3. Navigate to `HKEY_CLASSES_ROOT\CLSID` 4. Use the Search function (Ctrl+F) to find filter names 5. 
Look for the "InprocServer32" key under the filter's CLSID, which contains the file path ## Platform Considerations (x86 vs x64) DirectShow filters are platform-specific, meaning 32-bit (x86) and 64-bit (x64) versions are separate components. If you've installed both versions, you need to unregister each one separately. For x64 systems: - 64-bit filters are typically installed in `C:\Windows\System32` - 32-bit filters are typically installed in `C:\Windows\SysWOW64` Use the appropriate version of `regsvr32` for each platform: - For 64-bit filters: `C:\Windows\System32\regsvr32.exe` - For 32-bit filters: `C:\Windows\SysWOW64\regsvr32.exe` ## Troubleshooting Filter Uninstallation If you encounter issues during filter uninstallation, try these troubleshooting steps: ### Unable to Unregister Filter If you receive an error like "DllUnregisterServer failed with error code 0x80004005": 1. Ensure you're running Command Prompt as Administrator 2. Verify that the path to the filter is correct 3. Check if the filter file exists and isn't in use by any application 4. Close any applications that might be using DirectShow filters 5. In some cases, a system restart may be necessary before unregistration ### Filter Still Present After Unregistration If a filter appears to be still registered after attempting to unregister it: 1. Use GraphStudio to check if the filter is still listed 2. Look for multiple instances of the filter in different locations 3. Check both 32-bit and 64-bit registry locations 4. Try using the Microsoft-provided tool "OleView" to inspect COM registrations ## Verifying Successful Uninstallation After uninstalling DirectShow filters, verify the removal was successful: 1. Use GraphStudio to check if the filters no longer appear in the available filters list 2. Check the registry for any remaining entries related to the filters 3. 
Test any applications that previously used the filters to ensure they handle the absence gracefully --- Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples and implementation examples for working with DirectShow and multimedia applications in .NET. ---END OF PAGE--- # Local File: .\dotnet\general\code-samples\video-view-set-custom-image.md --- title: Setting Custom Images for VideoView in .NET SDKs description: Learn how to implement custom image display in VideoView controls when no video is playing. This detailed guide includes code examples, troubleshooting tips, and best practices for .NET developers working with video display components. sidebar_label: Setting Custom Image for VideoView --- # Setting Custom Images for VideoView Controls in .NET Applications [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) ## Introduction When developing media applications in .NET, it's often necessary to display a custom image within your VideoView control when no video content is playing. This capability is essential for creating professional-looking applications that maintain visual appeal during inactive states. Custom images can serve as placeholders, branding opportunities, or informational displays to enhance the user experience. This guide explores the implementation of custom image functionality for VideoView controls across various .NET SDK applications. ## Understanding VideoView Custom Images The VideoView control is a versatile component that displays video content in your application. However, when the control is not actively playing video, it typically shows a blank or default display. 
By implementing custom images, you can: - Display your application or company logo - Show preview thumbnails of available content - Present instructional information to users - Maintain visual consistency across your application - Indicate the video's status (paused, stopped, loading, etc.) It's important to note that the custom image is only visible when the control is not playing any video content. Once playback begins, the video stream automatically replaces the custom image. ## Implementation Process The process of setting a custom image for a VideoView control involves three primary operations: 1. Creating a picture box with appropriate dimensions 2. Setting the desired image 3. Cleaning up resources when no longer needed Let's explore each of these steps in detail. ## Step 1: Creating the Picture Box The first step is to initialize a picture box within your VideoView control with the appropriate dimensions. This operation should be performed once during the setup phase: ```csharp VideoView1.PictureBoxCreate(VideoView1.Width, VideoView1.Height); ``` This method call creates an internal picture box component that will host your custom image. The parameters specify the width and height of the picture box, which should typically match the dimensions of your VideoView control to ensure proper display without stretching or distortion. ### Best Practices for Picture Box Creation - **Timing Considerations**: Create the picture box during form initialization or after the control has been sized appropriately - **Dynamic Sizing**: If your application supports resizing, consider recreating the picture box when the control size changes - **Error Handling**: Implement try-catch blocks to handle potential exceptions during creation ## Step 2: Setting the Custom Image After creating the picture box, you can set your custom image. 
Use the `PictureBoxSetImage` method:

```csharp
// Load an image from a file
Image customImage = Image.FromFile("path/to/your/image.jpg");
VideoView1.PictureBoxSetImage(customImage);
```

Alternatively, you can use built-in resources or dynamically generated images:

```csharp
// Using a resource image
VideoView1.PictureBoxSetImage(Properties.Resources.MyCustomImage);

// Or creating a dynamic image
using (Bitmap dynamicImage = new Bitmap(VideoView1.Width, VideoView1.Height))
{
    using (Graphics g = Graphics.FromImage(dynamicImage))
    {
        // Draw on the image
        g.Clear(Color.DarkBlue);
        g.DrawString("Ready to Play", new Font("Arial", 24), Brushes.White, new PointF(50, 50));
    }

    VideoView1.PictureBoxSetImage(dynamicImage.Clone() as Image);
}
```

### Image Format Considerations

The image format you choose can impact performance and visual quality:

- **PNG**: Best for images with transparency
- **JPEG**: Suitable for photographic content
- **BMP**: Uncompressed format with higher memory usage
- **GIF**: Supports simple animations but with limited color depth

### Image Size Optimization

For optimal performance, consider these factors when preparing your custom images:

1. **Match Dimensions**: Resize your image to match the VideoView dimensions to avoid scaling operations
2. **Resolution Awareness**: Consider display DPI for crisp images on high-resolution displays
3.
**Memory Consumption**: Large images consume more memory, which may impact application performance ## Step 3: Cleaning Up Resources When the custom image is no longer required, it's important to clean up the resources to prevent memory leaks: ```csharp VideoView1.PictureBoxDestroy(); ``` This method should be called when: - The application is closing - The control is being disposed - You're switching to video playback mode and won't need the custom image anymore ### Resource Management Best Practices Proper resource management is crucial for maintaining application stability: - **Explicit Cleanup**: Always call `PictureBoxDestroy()` when you're done with the custom image - **Disposal Timing**: Include the cleanup call in your form's `Dispose` or `Closing` events - **State Tracking**: Keep track of whether a picture box has been created to avoid destroying a non-existent resource ## Advanced Scenarios ### Dynamic Image Updates In some applications, you may need to update the custom image dynamically: ```csharp private void UpdateCustomImage(string imagePath) { // Ensure picture box exists if (VideoView1.PictureBoxExists()) { // Update image Image newImage = Image.FromFile(imagePath); VideoView1.PictureBoxSetImage(newImage); } else { // Create picture box first VideoView1.PictureBoxCreate(VideoView1.Width, VideoView1.Height); Image newImage = Image.FromFile(imagePath); VideoView1.PictureBoxSetImage(newImage); } } ``` ### Handling Control Resizing If your application allows resizing of the VideoView control, you'll need to handle image scaling: ```csharp private void VideoView1_SizeChanged(object sender, EventArgs e) { // Recreate picture box with new dimensions if (VideoView1.PictureBoxExists()) { VideoView1.PictureBoxDestroy(); } VideoView1.PictureBoxCreate(VideoView1.Width, VideoView1.Height); // Set image again with appropriate scaling SetScaledCustomImage(); } ``` ### Multiple VideoView Controls When working with multiple VideoView controls, ensure proper 
management for each: ```csharp private void InitializeAllVideoViews() { // Initialize each VideoView with appropriate custom images VideoView1.PictureBoxCreate(VideoView1.Width, VideoView1.Height); VideoView1.PictureBoxSetImage(Properties.Resources.Camera1Placeholder); VideoView2.PictureBoxCreate(VideoView2.Width, VideoView2.Height); VideoView2.PictureBoxSetImage(Properties.Resources.Camera2Placeholder); // Additional VideoView controls... } ``` ## Troubleshooting Common Issues ### Image Not Displaying If your custom image isn't appearing: 1. **Check Timing**: Ensure you're setting the image after the picture box is created 2. **Verify Video State**: Confirm the control isn't currently playing video 3. **Image Loading**: Verify the image path is correct and accessible 4. **Control Visibility**: Ensure the VideoView control is visible in the UI ### Memory Leaks To prevent memory leaks: 1. **Dispose Images**: Always dispose Image objects after they're no longer needed 2. **Destroy Picture Box**: Call `PictureBoxDestroy()` when appropriate 3. 
**Resource Tracking**: Implement proper tracking of created resources ## Complete Implementation Example Here's a complete implementation example that demonstrates the proper lifecycle management: ```csharp public partial class VideoPlayerForm : Form { private bool isPictureBoxCreated = false; public VideoPlayerForm() { InitializeComponent(); this.Load += VideoPlayerForm_Load; this.FormClosing += VideoPlayerForm_FormClosing; } private void VideoPlayerForm_Load(object sender, EventArgs e) { InitializeCustomImage(); } private void InitializeCustomImage() { try { VideoView1.PictureBoxCreate(VideoView1.Width, VideoView1.Height); isPictureBoxCreated = true; using (Image customImage = Properties.Resources.VideoPlaceholder) { VideoView1.PictureBoxSetImage(customImage); } } catch (Exception ex) { // Handle exceptions MessageBox.Show($"Error setting custom image: {ex.Message}"); } } private void btnPlay_Click(object sender, EventArgs e) { // Play video logic here // The custom image will automatically be replaced during playback } private void VideoPlayerForm_FormClosing(object sender, FormClosingEventArgs e) { CleanupResources(); } private void CleanupResources() { if (isPictureBoxCreated) { VideoView1.PictureBoxDestroy(); isPictureBoxCreated = false; } } } ``` ## Conclusion Implementing custom images for VideoView controls enhances the user experience and professional appearance of your .NET media applications. By following the steps outlined in this guide, you can effectively display branded or informative content when videos aren't playing. Remember the key points: 1. Create the picture box with the appropriate dimensions 2. Set your custom image with proper resource management 3. Clean up resources when they're no longer needed 4. Handle resizing and other special scenarios as required With these techniques, you can create more polished and user-friendly video applications in .NET. 
--- Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples and implementation examples. ---END OF PAGE--- # Local File: .\dotnet\general\code-samples\vu-meters.md --- title: Implementing Audio VU Meters & Waveform Visualizers description: Complete guide to implementing VU meters and waveform visualizers in .NET applications. Learn how to display real-time audio levels with WinForms and WPF, including code examples for mono and stereo channel visualization. sidebar_label: VU Meter and Waveform Painter --- # Audio Visualization: Implementing VU Meters and Waveform Displays in .NET [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) Audio visualization is a crucial component of modern media applications, providing users with visual feedback about audio levels and waveform patterns. This guide demonstrates how to implement VU (Volume Unit) meters and waveform visualizers in both WinForms and WPF applications. ## Understanding Audio Visualization Components Before diving into implementation, it's important to understand the two main visualization tools we'll be working with: ### VU Meters VU meters display the instantaneous audio level of a signal, typically showing how loud the audio is at any given moment. They provide real-time feedback about audio levels, helping users monitor signal strength and prevent distortion or clipping. ### Waveform Painters Waveform visualizers display the audio signal as a continuous line that represents amplitude changes over time. 
They provide a more detailed representation of the audio content, showing patterns and characteristics that might not be apparent from listening alone.

## Implementation in WinForms Applications

WinForms provides a straightforward way to implement audio visualization components with minimal code. Let's explore the implementation of both VU meters and waveform painters.

### WinForms VU Meter Implementation

Implementing a VU meter in WinForms requires just a few steps:

1. **Add the VU Meter Control**: First, add the VU meter control to your form. For stereo audio, you'll typically add two controls, one for each channel.

```cs
// Add this to your form design
VisioForge.Core.UI.WinForms.VolumeMeterPro.VolumeMeter volumeMeter1;
VisioForge.Core.UI.WinForms.VolumeMeterPro.VolumeMeter volumeMeter2; // For stereo
```

2. **Enable VU Meter in Your Media Control**: Before starting playback or capture, enable the VU meter functionality in your media control.

```cs
// Enable VU meter before starting playback/capture
mediaPlayer.Audio_VUMeter_Pro_Enabled = true;
```

3. **Implement the Event Handler**: Add an event handler to process the audio level data and update the VU meter display.

```cs
private void VideoCapture1_OnAudioVUMeterProVolume(object sender, AudioLevelEventArgs e)
{
    volumeMeter1.Amplitude = e.ChannelLevelsDb[0];

    if (e.ChannelLevelsDb.Length > 1)
    {
        volumeMeter2.Amplitude = e.ChannelLevelsDb[1];
    }
}
```

With these steps, your VU meter will dynamically update based on the audio levels of your media playback or capture.

### WinForms Waveform Painter Implementation

The waveform painter implementation follows a similar pattern:

1. **Add the Waveform Painter Control**: Add the waveform painter control to your form. For stereo audio, add two controls.

```cs
// Add this to your form design
VisioForge.Core.UI.WinForms.VolumeMeterPro.WaveformPainter waveformPainter1;
VisioForge.Core.UI.WinForms.VolumeMeterPro.WaveformPainter waveformPainter2; // For stereo
```

2.
**Enable VU Meter Processing**: Enable the VU meter functionality to provide data for the waveform painter.

```cs
// Enable VU meter before starting playback/capture
mediaPlayer.Audio_VUMeter_Pro_Enabled = true;
```

3. **Implement the Event Handler**: Add an event handler to process the audio data and update the waveform display.

```cs
private void VideoCapture1_OnAudioVUMeterProVolume(object sender, AudioLevelEventArgs e)
{
    waveformPainter1.AddMax(e.ChannelLevelsDb[0]);

    if (e.ChannelLevelsDb.Length > 1)
    {
        waveformPainter2.AddMax(e.ChannelLevelsDb[1]);
    }
}
```

## Implementation in WPF Applications

WPF requires a slightly different approach due to its threading model and UI framework. Let's look at how to implement both visualization types in WPF.

### WPF VU Meter Implementation

1. **Add the VU Meter Control**: Add the VU meter control to your XAML layout. For stereo audio, add two controls.

```xml
<!-- Control declarations are illustrative; the exact XML namespace and
     attributes may differ depending on your SDK version -->
<vf:VolumeMeter x:Name="volumeMeter1" Width="30" Height="200" />
<vf:VolumeMeter x:Name="volumeMeter2" Width="30" Height="200" />
```

2. **Enable VU Meter Processing and Start the Meters**:

```cs
VideoCapture1.Audio_VUMeter_Pro_Enabled = true;

volumeMeter1.Start();
volumeMeter2.Start();
```

3. **Implement the Event Handler with Dispatcher**: In WPF, you need to use the Dispatcher to update UI elements from non-UI threads.

```cs
private delegate void AudioVUMeterProVolumeDelegate(AudioLevelEventArgs e);

private void AudioVUMeterProVolumeDelegateMethod(AudioLevelEventArgs e)
{
    volumeMeter1.Amplitude = e.ChannelLevelsDb[0];
    volumeMeter1.Update();

    if (e.ChannelLevelsDb.Length > 1)
    {
        volumeMeter2.Amplitude = e.ChannelLevelsDb[1];
        volumeMeter2.Update();
    }
}

private void VideoCapture1_OnAudioVUMeterProVolume(object sender, AudioLevelEventArgs e)
{
    Dispatcher.BeginInvoke(new AudioVUMeterProVolumeDelegate(AudioVUMeterProVolumeDelegateMethod), e);
}
```

4. **Clean Up After Playback**: When playback stops, clean up the VU meters to release resources.

```cs
volumeMeter1.Stop();
volumeMeter1.Clear();

volumeMeter2.Stop();
volumeMeter2.Clear();
```

### WPF Waveform Painter Implementation

1.
**Add the Waveform Painter Control**: Add the waveform painter control to your XAML layout.

```xml
<!-- Control declaration is illustrative; the exact XML namespace and
     attributes may differ depending on your SDK version -->
<vf:WaveformPainter x:Name="waveformPainter" Height="100" />
```

2. **Enable VU Meter Processing and Start the Waveform Painter**:

```cs
VideoCapture1.Audio_VUMeter_Pro_Enabled = true;

waveformPainter.Start();
```

3. **Implement the Maximum Calculated Event Handler**: For waveform painters in WPF, we use a different event.

```cs
private delegate void AudioVUMeterProMaximumCalculatedDelegate(VUMeterMaxSampleEventArgs e);

private void AudioVUMeterProMaximumCalculatedDelegateMethod(VUMeterMaxSampleEventArgs e)
{
    waveformPainter.AddValue(e.MaxSample, e.MinSample);
}

private void VideoCapture1_OnAudioVUMeterProMaximumCalculated(object sender, VUMeterMaxSampleEventArgs e)
{
    Dispatcher.BeginInvoke(new AudioVUMeterProMaximumCalculatedDelegate(AudioVUMeterProMaximumCalculatedDelegateMethod), e);
}
```

4. **Clean Up After Playback**: When playback stops, clean up the waveform painter.

```cs
waveformPainter.Stop();
waveformPainter.Clear();
```

## Advanced Customization Options

Both the VU meter and waveform painter controls offer extensive customization options to match your application's design and user experience requirements.
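The VU meter events report channel levels in decibels (`ChannelLevelsDb`). When mapping those readings onto a meter of your own, it helps to convert between dB and linear amplitude and to clamp values to the meter's displayable range. The helper below is a generic sketch using only the base class library; the `MeterScale` name and methods are illustrative, not SDK APIs.

```csharp
using System;

public static class MeterScale
{
    // dBFS -> linear amplitude: 0 dB = 1.0, -20 dB = 0.1, -40 dB = 0.01
    public static double DbToLinear(double db) => Math.Pow(10.0, db / 20.0);

    // Linear amplitude -> dBFS, guarding against log of zero
    public static double LinearToDb(double amplitude) =>
        amplitude <= 0 ? double.NegativeInfinity : 20.0 * Math.Log10(amplitude);

    // Map a dB reading onto a 0..1 meter span; floorDb = -60 gives a 60 dB scale
    public static double ToMeterFraction(double db, double floorDb = -60.0)
    {
        if (db <= floorDb) return 0.0;
        if (db >= 0.0) return 1.0;
        return (db - floorDb) / (0.0 - floorDb);
    }
}
```

A reading of -30 dB on a -60 dB floor lands at 0.5, i.e. halfway up the meter, which is usually more useful visually than plotting the raw linear amplitude.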
### Customizing VU Meters You can customize various aspects of the VU meter appearance: - **Color Scheme**: Modify the colors used for different audio levels (low, medium, high) - **Response Time**: Adjust how quickly the meter responds to level changes - **Scale**: Configure the decibel scale and range - **Orientation**: Set horizontal or vertical orientation Example of customizing a VU meter: ```cs volumeMeter1.PeakHoldTime = 500; // Hold peak for 500ms volumeMeter1.ColorNormal = Color.Green; volumeMeter1.ColorWarning = Color.Yellow; volumeMeter1.ColorAlert = Color.Red; volumeMeter1.WarningThreshold = -12; // dB volumeMeter1.AlertThreshold = -6; // dB ``` ### Customizing Waveform Painters Waveform painters can be customized to provide different visual representations: - **Line Thickness**: Adjust the thickness of the waveform line - **Color Gradient**: Apply color gradients based on amplitude - **Time Scale**: Modify how much time is represented in the visible area - **Rendering Mode**: Choose between different rendering styles (line, filled, etc.) Example of customizing a waveform painter: ```cs waveformPainter.LineColor = Color.SkyBlue; waveformPainter.BackColor = Color.Black; waveformPainter.LineThickness = 2; waveformPainter.ScrollingSpeed = 50; waveformPainter.RenderMode = WaveformRenderMode.FilledLine; ``` ## Performance Considerations When implementing audio visualization, consider these performance tips: 1. **Update Frequency**: Balance visual responsiveness with CPU usage by adjusting how frequently you update the visuals 2. **UI Thread Management**: Always update UI elements on the appropriate thread (especially important in WPF) 3. **Resource Cleanup**: Properly stop and clear visualization controls when not in use 4. **Buffering**: Consider implementing buffering for smoother visualization during high CPU usage ## Conclusion Implementing VU meters and waveform painters adds valuable visual feedback to media applications. 
Whether you're developing in WinForms or WPF, these audio visualization components help users monitor and understand audio levels and patterns more intuitively. By following the implementation steps outlined in this guide, you can enhance your .NET media applications with professional-quality audio visualization features that improve the overall user experience. --- For more code examples and related SDKs, visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples). ---END OF PAGE--- # Local File: .\dotnet\general\code-samples\zoom-onvideoframebuffer.md --- title: Implementing Custom Zoom Effects in .NET Video Apps description: Learn how to create custom zoom effects using the OnVideoFrameBuffer event in .NET video applications. This step-by-step guide provides detailed C# code examples, implementation techniques, and best practices for video frame manipulation in your applications. sidebar_label: Custom Zoom Implementation with OnVideoFrameBuffer --- # Implementing Custom Zoom Effects with OnVideoFrameBuffer in .NET [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) ## Introduction Implementing custom zoom effects in video applications is a common requirement for developers working with video processing. This guide explains how to manually create zoom functionality in your .NET video applications using the OnVideoFrameBuffer event. This technique works across multiple SDK platforms, including Video Capture, Media Player, and Video Edit SDKs. ## Understanding the OnVideoFrameBuffer Event The OnVideoFrameBuffer event is a powerful feature that gives developers direct access to video frame data during playback or processing. 
By handling this event, you can: - Access raw frame data in real-time - Apply custom modifications to individual frames - Implement visual effects like zooming, rotation, or color adjustments - Control video quality and performance ## Implementation Steps The process of implementing a zoom effect involves several key steps: 1. Allocating memory for temporary buffers 2. Handling the OnVideoFrameBuffer event 3. Applying the zoom transformation to each frame 4. Managing memory to prevent leaks Let's break down each of these steps with detailed explanations. ## Memory Management for Frame Processing When working with video frames, proper memory management is critical. You'll need to allocate sufficient memory to handle frame data and temporary processing buffers. ```cs private IntPtr tempBuffer = IntPtr.Zero; IntPtr tmpZoomFrameBuffer = IntPtr.Zero; private int tmpZoomFrameBufferSize = 0; ``` These fields serve the following purposes: - `tempBuffer`: Stores the processed frame data - `tmpZoomFrameBuffer`: Holds the intermediary zoom calculation results - `tmpZoomFrameBufferSize`: Tracks the required size for the zoom buffer ## Detailed Code Implementation Below is a complete implementation of the zoom effect using the OnVideoFrameBuffer event in a Media Player SDK .NET application: ```cs private IntPtr tempBuffer = IntPtr.Zero; IntPtr tmpZoomFrameBuffer = IntPtr.Zero; private int tmpZoomFrameBufferSize = 0; private void MediaPlayer1_OnVideoFrameBuffer(object sender, VideoFrameBufferEventArgs e) { // Initialize the temporary buffer if it hasn't been created yet if (tempBuffer == IntPtr.Zero) { tempBuffer = Marshal.AllocCoTaskMem(e.Frame.DataSize); } // Set the zoom factor (2.0 = 200% zoom) const double zoom = 2.0; // Apply the zoom effect using the FastImageProcessing utility FastImageProcessing.EffectZoom( e.Frame.Data, // Source frame data e.Frame.Width, // Frame width e.Frame.Height, // Frame height tempBuffer, // Output buffer zoom, // Horizontal zoom factor zoom, 
// Vertical zoom factor 0, // Center X coordinate (0 = center) 0, // Center Y coordinate (0 = center) tmpZoomFrameBuffer, // Intermediate buffer ref tmpZoomFrameBufferSize); // Buffer size reference // Allocate the zoom frame buffer if needed and return to process in next frame if (tmpZoomFrameBufferSize > 0 && tmpZoomFrameBuffer == IntPtr.Zero) { tmpZoomFrameBuffer = Marshal.AllocCoTaskMem(tmpZoomFrameBufferSize); return; } // Copy the processed data back to the frame buffer FastImageProcessing.CopyMemory(tempBuffer, e.Frame.Data, e.Frame.DataSize); } ``` ## Customizing the Zoom Effect The code above uses a fixed zoom factor of 2.0 (200%), but you can modify this to create various zoom effects: ### Dynamic Zoom Levels You can implement user-controlled zoom by replacing the constant zoom value with a variable: ```cs // Replace this: const double zoom = 2.0; // With something like this: double zoom = this.userZoomSlider.Value; // Get zoom value from UI control ``` ### Zoom with Focus Point The `EffectZoom` method accepts X and Y coordinates to set the center point of the zoom. Setting these to non-zero values allows you to focus the zoom on specific areas: ```cs // Zoom centered on the top-right quadrant FastImageProcessing.EffectZoom( e.Frame.Data, e.Frame.Width, e.Frame.Height, tempBuffer, zoom, zoom, e.Frame.Width / 4, // X offset from center -e.Frame.Height / 4, // Y offset from center tmpZoomFrameBuffer, ref tmpZoomFrameBufferSize); ``` ## Performance Considerations When implementing custom video effects like zooming, consider these performance tips: 1. **Memory Management**: Always free allocated memory when your application closes to prevent leaks 2. **Buffer Reuse**: Reuse buffers when possible rather than reallocating for each frame 3. **Processing Time**: Keep processing time minimal to maintain smooth video playback 4. 
**Resolution Impact**: Higher resolution videos require more processing power for real-time effects ## Cleaning Up Resources To properly clean up resources when your application closes, implement a cleanup method: ```cs private void CleanupZoomResources() { if (tempBuffer != IntPtr.Zero) { Marshal.FreeCoTaskMem(tempBuffer); tempBuffer = IntPtr.Zero; } if (tmpZoomFrameBuffer != IntPtr.Zero) { Marshal.FreeCoTaskMem(tmpZoomFrameBuffer); tmpZoomFrameBuffer = IntPtr.Zero; } } ``` Call this method when your form or application closes to prevent memory leaks. ## Troubleshooting Common Issues When implementing the zoom effect, you might encounter these issues: 1. **Distorted Image**: Check that your zoom factors for width and height are equal for uniform scaling 2. **Blank Frames**: Ensure proper memory allocation and buffer sizes 3. **Poor Performance**: Consider reducing the frame processing complexity or the video resolution 4. **Memory Errors**: Verify that all memory is properly allocated and freed ## Conclusion Implementing custom zoom effects using the OnVideoFrameBuffer event gives you precise control over video appearance in your .NET applications. By following the techniques outlined in this guide, you can create sophisticated zoom functionality that enhances the user experience in your video applications. Remember to properly manage memory resources and optimize for performance to ensure smooth playback with your custom effects. --- Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples. ---END OF PAGE--- # Local File: .\dotnet\general\code-samples\zoom-video-multiple-renderer.md --- title: Setting Zoom Parameters for Multiple Video Renderers description: Learn how to configure zoom settings for multiple video renderers in .NET applications. This guide provides detailed code samples, implementation tips, and best practices for optimizing video display across multiple screens. 
sidebar_label: Configuring Zoom for Multiple Video Renderers --- # Configuring Zoom Settings for Multiple Video Renderers in .NET [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) When developing multimedia applications that utilize multiple video renderers, controlling the zoom and position parameters independently for each display is essential for creating professional-quality user interfaces. This guide covers the implementation details, parameter configurations, and best practices for setting up multiple video renderers with customized zoom settings in your .NET applications. ## Understanding Multiple Renderer Configurations Multiple renderer support (also known as multiscreen functionality) allows your application to display video content across different display areas simultaneously. Each renderer can be configured with its own: - Zoom ratio (magnification level) - Horizontal shift (X-axis positioning) - Vertical shift (Y-axis positioning) This capability is particularly valuable for applications such as: - Video surveillance systems displaying multiple camera feeds - Media production software with preview and output windows - Medical imaging applications requiring different zoom levels for analysis - Multi-display kiosk systems with synchronized content ## Implementing the MultiScreen_SetZoom Method The SDK provides the `MultiScreen_SetZoom` method which takes four key parameters: 1. **Screen Index** (zero-based): Identifies which renderer to configure 2. **Zoom Ratio**: Controls the magnification percentage 3. **Shift X**: Adjusts the horizontal positioning (pixels or percentage) 4. 
**Shift Y**: Adjusts the vertical positioning (pixels or percentage) ### Method Signature and Parameters ```cs // Method signature void MultiScreen_SetZoom(int screenIndex, int zoomRatio, int shiftX, int shiftY); ``` | Parameter | Description | Valid Range | |-----------|-------------|-------------| | screenIndex | Zero-based index of the target renderer | 0 to (number of renderers - 1) | | zoomRatio | Magnification percentage | 1 to 1000 (%) | | shiftX | Horizontal offset | -1000 to 1000 | | shiftY | Vertical offset | -1000 to 1000 | ## Code Sample: Configuring Multiple Renderers The following example demonstrates how to set different zoom and positioning values for three separate renderers: ```cs // Configure the primary renderer (index 0) // 50% zoom with no horizontal or vertical shift VideoCapture1.MultiScreen_SetZoom(0, 50, 0, 0); // Configure the secondary renderer (index 1) // 20% zoom with slight horizontal and vertical shift VideoCapture1.MultiScreen_SetZoom(1, 20, 10, 20); // Configure the tertiary renderer (index 2) // 30% zoom with no horizontal shift but significant vertical shift VideoCapture1.MultiScreen_SetZoom(2, 30, 0, 30); ``` ## Best Practices for Multiple Renderer Management When implementing multiple renderer configurations, consider these best practices: ### 1. Initialize All Renderers Before Setting Zoom Always ensure that all renderers are properly initialized before applying zoom settings: ```cs // Initialize multiple renderers VideoCapture1.MultiScreen_Enabled = true; // Add 3 renderers VideoCapture1.MultiScreen_AddScreen(videoView1, 1280, 720); VideoCapture1.MultiScreen_AddScreen(videoView2, 1920, 1080); VideoCapture1.MultiScreen_AddScreen(videoView3, 1280, 720); // Now safe to configure zoom settings VideoCapture1.MultiScreen_SetZoom(0, 50, 0, 0); VideoCapture1.MultiScreen_SetZoom(1, 20, 10, 20); VideoCapture1.MultiScreen_SetZoom(2, 30, 0, 30); // Additional configurations... ``` ### 2. 
Handle Resolution Changes Appropriately When the input source resolution changes, you may need to recalculate zoom values: ```cs private void VideoCapture1_OnVideoSourceResolutionChanged(object sender, EventArgs e) { // Recalculate and apply zoom settings based on new resolution int newZoom = CalculateOptimalZoom(VideoCapture1.VideoSource_ResolutionX, VideoCapture1.VideoSource_ResolutionY); // Apply to all renderers for (int i = 0; i < VideoCapture1.MultiScreen_Count; i++) { VideoCapture1.MultiScreen_SetZoom(i, newZoom, 0, 0); } } ``` ### 3. Provide User Controls for Zoom Adjustment For interactive applications, consider implementing UI controls that allow users to adjust zoom settings: ```cs private void zoomTrackBar_ValueChanged(object sender, EventArgs e) { int selectedRenderer = rendererComboBox.SelectedIndex; int zoomValue = zoomTrackBar.Value; int shiftX = horizontalShiftTrackBar.Value; int shiftY = verticalShiftTrackBar.Value; // Apply new zoom settings to selected renderer VideoCapture1.MultiScreen_SetZoom(selectedRenderer, zoomValue, shiftX, shiftY); } ``` ## Advanced Zoom Configurations ### Dynamic Zoom Transitions For smooth zoom transitions, consider implementing gradual zoom changes: ```cs async Task AnimateZoomAsync(int screenIndex, int startZoom, int targetZoom, int duration) { int steps = 30; // Number of animation steps int delay = duration / steps; // Milliseconds between steps for (int i = 0; i <= steps; i++) { // Calculate intermediate zoom value int currentZoom = startZoom + ((targetZoom - startZoom) * i / steps); // Apply current zoom value VideoCapture1.MultiScreen_SetZoom(screenIndex, currentZoom, 0, 0); // Wait for next step await Task.Delay(delay); } } // Usage await AnimateZoomAsync(0, 50, 100, 1000); // Animate from 50% to 100% over 1 second ``` ## Optimizing Performance with Multiple Renderers When working with multiple renderers, be mindful of performance implications: 1. 
**Limit Frequent Updates**: Avoid rapidly changing zoom settings as it can impact performance 2. **Consider Hardware Acceleration**: Enable hardware acceleration when available 3. **Monitor Memory Usage**: Multiple high-resolution renderers can consume significant memory ```cs // Enable hardware acceleration for better performance VideoCapture1.Video_Renderer = VideoRendererType.EVR; VideoCapture1.Video_Renderer_EVR_Mode = EVRMode.Optimal; ``` ## Troubleshooting Common Issues ### Issue: Renderers Show Black Screen After Zoom Changes This can occur when zoom values exceed valid ranges or when renderers aren't properly initialized: ```cs // Reset zoom settings to default for all renderers public void ResetZoomSettings() { for (int i = 0; i < VideoCapture1.MultiScreen_Count; i++) { VideoCapture1.MultiScreen_SetZoom(i, 100, 0, 0); // 100% zoom, no shift } } ``` ### Issue: Distorted Image After Zoom Extreme zoom values can cause distortion. Implement boundaries for zoom values: ```cs public void SetSafeZoom(int screenIndex, int requestedZoom, int shiftX, int shiftY) { // Clamp values to safe ranges int safeZoom = Math.Clamp(requestedZoom, 10, 200); // 10% to 200% int safeShiftX = Math.Clamp(shiftX, -100, 100); int safeShiftY = Math.Clamp(shiftY, -100, 100); VideoCapture1.MultiScreen_SetZoom(screenIndex, safeZoom, safeShiftX, safeShiftY); } ``` ## Conclusion Properly configured multiple video renderers with independent zoom settings can significantly enhance the user experience in multimedia applications. By following the guidelines and best practices outlined in this document, you can implement sophisticated video display configurations tailored to your specific application requirements. For additional code examples and implementation guidance, visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples). 
---END OF PAGE---

# Local File: .\dotnet\general\guides\video-capture-to-mpegts.md

---
title: Video Capture to MPEG-TS Files in C# and .NET
description: Learn how to capture video and audio to MPEG-TS files in C# applications. Step-by-step guide with code samples covering hardware acceleration, format selection, and cross-platform considerations for .NET developers.
sidebar_label: Video Capture to MPEG-TS
---

# Video Capture to MPEG-TS in C# and .NET: Complete Guide

## Introduction

This technical guide demonstrates how to capture video from cameras and microphones to MPEG-TS files in C# using two powerful VisioForge multimedia solutions: Video Capture SDK .NET with the VideoCaptureCoreX engine and Media Blocks SDK .NET with the MediaBlocksPipeline engine. Both SDKs provide robust capabilities for capturing, recording, and editing TS (MPEG Transport Stream) files in .NET applications. We'll explore detailed code samples for implementing video/audio capture to TS in C# with optimized performance and quality.

## Installation and deployment

Please refer to the [installation guide](../../install/index.md) for detailed instructions on how to install the VisioForge .NET SDKs on your system.

## Video Capture SDK .NET (VideoCaptureCoreX) - Capture MPEG-TS in C#

VideoCaptureCoreX provides a streamlined approach to capturing TS video and audio in C#. Its component-based architecture handles the complex media pipeline, allowing developers to focus on configuration rather than lower-level implementation details when working with TS files in .NET.

### Core Components

1. **VideoCaptureCoreX**: Main engine for managing video capture, rendering, and TS output.
2. **VideoView**: UI component for real-time video rendering during capture.
3. **DeviceEnumerator**: Class for discovering video/audio devices.
4. **VideoCaptureDeviceSourceSettings**: Configuration for camera input when capturing MPEG-TS.
5. **AudioRendererSettings**: Configuration for audio playback with AAC support.
6.
**MPEGTSOutput**: Configuration specifically for MPEG-TS file output. ### Implementation Example Here's a complete C# implementation to capture and record MPEG-TS files: ```csharp // Class instance for video capture engine VideoCaptureCoreX videoCapture; private async Task StartCaptureAsync() { // Initialize the VisioForge SDK await VisioForgeX.InitSDKAsync(); // Create VideoCaptureCoreX instance and associate with UI VideoView control videoCapture = new VideoCaptureCoreX(videoView: VideoView1); // Get list of available video capture devices var videoSources = await DeviceEnumerator.Shared.VideoSourcesAsync(); // Initialize video source settings VideoCaptureDeviceSourceSettings videoSourceSettings = null; // Get first available video capture device var videoDevice = videoSources[0]; // Try to get HD resolution and frame rate capabilities from device var videoFormat = videoDevice.GetHDVideoFormatAndFrameRate(out VideoFrameRate frameRate); if (videoFormat != null) { // Configure video source with HD format videoSourceSettings = new VideoCaptureDeviceSourceSettings(videoDevice) { Format = videoFormat.ToFormat() }; // Set capture frame rate videoSourceSettings.Format.FrameRate = frameRate; } // Configure video capture device with settings videoCapture.Video_Source = videoSourceSettings; // Configure audio capture (microphone) // Initialize audio source settings IVideoCaptureBaseAudioSourceSettings audioSourceSettings = null; // Get available audio capture devices using DirectSound API var audioApi = AudioCaptureDeviceAPI.DirectSound; var audioDevices = await DeviceEnumerator.Shared.AudioSourcesAsync(audioApi); // Get first available audio capture device var audioDevice = audioDevices[0]; if (audioDevice != null) { // Get default audio format supported by device var audioFormat = audioDevice.GetDefaultFormat(); if (audioFormat != null) { // Configure audio source with default format audioSourceSettings = audioDevice.CreateSourceSettingsVC(audioFormat); } } // Configure 
audio capture device with settings videoCapture.Audio_Source = audioSourceSettings; // Configure audio playback device // Get first available audio output device var audioOutputDevice = (await DeviceEnumerator.Shared.AudioOutputsAsync())[0]; // Configure audio renderer for playback through selected device videoCapture.Audio_OutputDevice = new AudioRendererSettings(audioOutputDevice); // Enable audio monitoring and recording videoCapture.Audio_Play = true; // Enable real-time audio monitoring videoCapture.Audio_Record = true; // Enable audio recording to output file // Configure MPEG Transport Stream output var mpegtsOutput = new MPEGTSOutput("output.ts"); // Configure video encoder with hardware acceleration if available if (NVENCH264EncoderSettings.IsAvailable()) { // Use NVIDIA hardware encoder mpegtsOutput.Video = new NVENCH264EncoderSettings(); } else if (QSVH264EncoderSettings.IsAvailable()) { // Use Intel Quick Sync hardware encoder mpegtsOutput.Video = new QSVH264EncoderSettings(); } else if (AMFH264EncoderSettings.IsAvailable()) { // Use AMD hardware encoder mpegtsOutput.Video = new AMFH264EncoderSettings(); } else { // Fallback to software encoder mpegtsOutput.Video = new OpenH264EncoderSettings(); } // Configure audio encoder for MPEG-TS output // mpegtsOutput.Audio = ... // Add MPEG-TS output to capture pipeline // autostart: true means output starts automatically with capture videoCapture.Outputs_Add(mpegtsOutput, autostart: true); // Start the capture process await videoCapture.StartAsync(); } private async Task StopCaptureAsync() { // Stop all capture and encoding await videoCapture.StopAsync(); // Clean up resources await videoCapture.DisposeAsync(); } ``` ### VideoCaptureCoreX Advanced Features for MPEG-TS Recording 1. **Hardware Acceleration**: Support for NVIDIA (NVENC), Intel (QSV), and AMD (AMF) hardware encoding. 2. **Format Selection**: The SDK provides access to the native camera formats and frame rates. 3. 
**Audio Configuration**: Provides volume control and format selection. 4. **Multiple Outputs**: Ability to add multiple output formats simultaneously. ## Media Blocks SDK .NET (MediaBlocksPipeline) - Capture TS in C# The MediaBlocksPipeline engine in Media Blocks SDK .Net takes a different architectural approach, focusing on a modular block-based system where each component (block) has specific responsibilities in the media processing pipeline. ### Core Blocks 1. **MediaBlocksPipeline**: The main container and controller for the media blocks pipeline. 2. **SystemVideoSourceBlock**: Captures video from webcams. 3. **SystemAudioSourceBlock**: Captures audio from microphones. 4. **VideoRendererBlock**: Renders the video to a VideoView control. 5. **AudioRendererBlock**: Handles audio playback. 6. **TeeBlock**: Splits media streams for simultaneous processing (e.g., display and encoding). 7. **H264EncoderBlock**: Encodes video using H.264. 8. **AACEncoderBlock**: Encodes audio using AAC. 9. **MPEGTSSinkBlock**: Saves encoded streams to an MPEG-TS file. 
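Taken together, the blocks listed above form two tee'd branches, one for live preview/monitoring and one for encoding into the Transport Stream sink. A rough topology sketch:

```
SystemVideoSourceBlock ──> TeeBlock ──> VideoRendererBlock (preview)
                                  └───> H264EncoderBlock ──> MPEGTSSinkBlock
SystemAudioSourceBlock ──> TeeBlock ──> AudioRendererBlock (monitoring)
                                  └───> AACEncoderBlock ───> MPEGTSSinkBlock
```

Each arrow corresponds to a `pipeline.Connect(...)` call; the muxer accepts one dynamic input per elementary stream.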
### Implementation Example Here's how to implement advanced capture of TS files in C#: ```csharp // Pipeline instance MediaBlocksPipeline pipeline; private async Task StartCaptureAsync() { // Initialize the SDK await VisioForgeX.InitSDKAsync(); // Create new pipeline instance pipeline = new MediaBlocksPipeline(); // Get first available video device and configure HD format var device = (await DeviceEnumerator.Shared.VideoSourcesAsync())[0]; var formatItem = device.GetHDVideoFormatAndFrameRate(out VideoFrameRate frameRate); var videoSourceSettings = new VideoCaptureDeviceSourceSettings(device) { Format = formatItem.ToFormat() }; videoSourceSettings.Format.FrameRate = frameRate; // Create video source block with configured settings var videoSource = new SystemVideoSourceBlock(videoSourceSettings); // Get first available audio device and configure default format var audioDevice = (await DeviceEnumerator.Shared.AudioSourcesAsync())[0]; var audioFormat = audioDevice.GetDefaultFormat(); var audioSourceSettings = audioDevice.CreateSourceSettings(audioFormat); var audioSource = new SystemAudioSourceBlock(audioSourceSettings); // Create video renderer block and connect to UI VideoView control var videoRenderer = new VideoRendererBlock(pipeline, videoView: VideoView1) { IsSync = false }; // Create audio renderer block for playback var audioRenderer = new AudioRendererBlock() { IsSync = false }; // Note: IsSync is false to maximize encoding performance // Create video and audio tees var videoTee = new TeeBlock(2, MediaBlockPadMediaType.Video); var audioTee = new TeeBlock(2, MediaBlockPadMediaType.Audio); // Create MPEG-TS muxer var muxer = new MPEGTSSinkBlock(new MPEGTSSinkSettings("output.ts")); // Create video and audio encoders with hardware acceleration if available var videoEncoder = new H264EncoderBlock(); var audioEncoder = new AACEncoderBlock(); // Connect video processing blocks: // Source -> Tee -> Renderer (preview) and Encoder -> Muxer 
pipeline.Connect(videoSource.Output, videoTee.Input); pipeline.Connect(videoTee.Outputs[0], videoRenderer.Input); pipeline.Connect(videoTee.Outputs[1], videoEncoder.Input); pipeline.Connect(videoEncoder.Output, (muxer as IMediaBlockDynamicInputs).CreateNewInput(MediaBlockPadMediaType.Video)); // Connect audio processing blocks: // Source -> Tee -> Renderer (playback) and Encoder -> Muxer pipeline.Connect(audioSource.Output, audioTee.Input); pipeline.Connect(audioTee.Outputs[0], audioRenderer.Input); pipeline.Connect(audioTee.Outputs[1], audioEncoder.Input); pipeline.Connect(audioEncoder.Output, (muxer as IMediaBlockDynamicInputs).CreateNewInput(MediaBlockPadMediaType.Audio)); // Start the pipeline processing await pipeline.StartAsync(); } private async Task StopCaptureAsync() { // Stop all pipeline processing await pipeline.StopAsync(); // Clean up resources await pipeline.DisposeAsync(); pipeline = null; } ``` ### MediaBlocksPipeline Advanced Features 1. **Fine-Grained Control**: Direct control over each processing step in the pipeline. 2. **Dynamic Pipeline Construction**: Ability to create complex processing pipelines by connecting blocks. 3. **Multiple Processing Paths**: Using TeeBlock to split streams for different processing paths. 4. **Custom Blocks**: Ability to create and integrate custom processing blocks. 5. **Granular Error Handling**: Error events at each block level. ## TS Output Configuration with AAC Audio Both SDKs provide robust support for MPEG-TS output, which is particularly useful for broadcasting and streaming applications due to its error resilience and low latency characteristics. 
Read more about video and audio encoders available for TS capture in .NET: - [H264 encoders](../video-encoders/h264.md) - [HEVC encoders](../video-encoders/hevc.md) - [AAC encoders](../audio-encoders/aac.md) - [MP3 encoders](../audio-encoders/mp3.md) - [MPEG-TS output](../output-formats/mpegts.md) ## Cross-Platform Considerations Both SDKs offer cross-platform capabilities, but with different approaches: 1. **VideoCaptureCoreX**: Provides a unified API across platforms with platform-specific implementations. 2. **MediaBlocksPipeline**: Uses a consistent block-based architecture across platforms, with blocks handling platform differences internally. ## Sample applications - [VideoCaptureCoreX Sample Application](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Video%20Capture%20SDK%20X/WPF/CSharp/Simple%20Video%20Capture) - [MediaBlocksPipeline Sample Application](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Simple%20Capture%20Demo) ## Conclusion: Choosing the Right SDK for C# MPEG-TS Capture VisioForge offers two powerful solutions for recording MPEG-TS files in C# and .NET: - **VideoCaptureCoreX** provides a streamlined API for quick implementation of MPEG-TS capture in C#, ideal for projects where ease of use is essential. - **MediaBlocksPipeline** offers maximum flexibility for complex MPEG-TS recording and editing scenarios in .NET through its modular block architecture. Both SDKs excel at capturing video from cameras and audio from microphones, with comprehensive support for MPEG-TS output, making them valuable tools for developing a wide range of multimedia applications. Choose VideoCaptureCoreX for rapid implementation of standard TS capture scenarios, or MediaBlocksPipeline for advanced editing and custom processing workflows with TS files in your .NET applications. 
---END OF PAGE--- # Local File: .\dotnet\general\network-streaming\adobe-flash.md --- title: Network Video Streaming to Flash Media Server description: Learn how to implement network video streaming to Adobe Flash Media Server in .NET applications. Tutorial covers real-time effects, quality settings, and device switching for professional video streaming solutions. sidebar_label: Adobe Flash Media Server --- # Streaming to Adobe Flash Media Server: Advanced Implementation Guide [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) ## Introduction Adobe Flash Media Server (FMS) remains a powerful solution for streaming video content across various platforms. This guide demonstrates how to implement high-quality video streaming to Adobe Flash Media Server using VisioForge's .NET SDKs. The integration supports real-time video effects, quality adjustment, and seamless device switching during streaming sessions. ## Prerequisites Before implementing the streaming functionality, ensure you have: - VisioForge Video Capture SDK .NET or Video Edit SDK .NET installed - Adobe Flash Media Server (or a compatible service like Wowza with RTMP support) - Adobe Flash Media Live Encoder (FMLE) - .NET Framework 4.7.2 or later - Visual Studio 2022 or newer - Basic understanding of C# programming ## Demo Application Walkthrough The demo application provided with VisioForge SDKs offers a straightforward way to test streaming functionality. Here's a detailed walkthrough: 1. Start the Main Demo application 2. Navigate to the "Network Streaming" tab 3. Enable streaming by selecting the "Enabled" checkbox 4. Select the "External" radio button for external encoder compatibility 5. Start preview or capture to initialize the video stream 6. Open Adobe Flash Media Live Encoder 7. 
Configure FMLE to use "VisioForge Network Source" as the video source 8. Configure video parameters: - Resolution (e.g., 1280x720, 1920x1080) - Frame rate (typically 25-30 fps for smooth streaming) - Keyframe interval (recommend 2 seconds) - Video quality settings 9. Select "VisioForge Network Source Audio" as the audio source 10. Configure your connection to Adobe Flash Media Server 11. Press Start to initiate streaming The video from the SDK is now being streamed to your FMS instance. You can apply real-time effects, adjust settings, or even stop the SDK to switch input devices without terminating the streaming session on the server side. ## Implementation in Custom Applications ### Required Components To implement this functionality in your custom application, you'll need: - SDK redistributables (available in the SDK installation package) - References to the VisioForge SDK assemblies - Proper firewall and network configurations to allow streaming ## Required Redistributables Ensure the following components are included with your application: - VisioForge SDK redistributable packages - Microsoft Visual C++ Runtime (appropriate version for your SDK) - .NET Framework runtime (if not using self-contained deployment) ## Conclusion Streaming to Adobe Flash Media Server using VisioForge's Video Capture or Edit SDKs offers a flexible and powerful solution for implementing high-quality video streaming in .NET applications. The implementation supports real-time effects, quality adjustments, and seamless device switching, making it suitable for a wide range of streaming applications. By following this guide, developers can implement robust streaming solutions that leverage the powerful features of both the VisioForge SDKs and Adobe's streaming platform. --- Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples and example projects. 
---END OF PAGE--- # Local File: .\dotnet\general\network-streaming\aws-s3.md --- title: Stream Video and Audio to Amazon S3 Storage description: Learn how to implement AWS S3 video and audio streaming in .NET applications. Step-by-step guide for developers covering configuration, encoding settings, error handling, and best practices for S3 media output integration. sidebar_label: AWS S3 --- # AWS S3 Output [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] The AWS S3 Output functionality in VisioForge SDKs enables direct video and audio output streaming to Amazon S3 storage. This guide will walk you through setting up and using AWS S3 output in your applications. ## Overview The `AWSS3Output` class is a specialized output handler within the VisioForge SDKs that facilitates video and audio output streaming to Amazon Web Services (AWS) S3 storage. This class implements multiple interfaces to support both video editing (`IVideoEditXBaseOutput`) and video capture (`IVideoCaptureXBaseOutput`) scenarios, along with processing capabilities for both video and audio content. 
## Class Implementation ```csharp public class AWSS3Output : IVideoEditXBaseOutput, IVideoCaptureXBaseOutput, IOutputVideoProcessor, IOutputAudioProcessor ``` ## Key Features The `AWSS3Output` class provides a comprehensive solution for streaming media content to AWS S3 by managing: - Video encoding configuration - Audio encoding configuration - Custom media processing - AWS S3-specific settings ## Properties ### Video Encoder Settings ```csharp public IVideoEncoder Video { get; set; } ``` Controls the video encoding process. The selected video encoder must be compatible with the configured sink settings. This property allows you to specify compression methods, quality settings, and other video-specific parameters. ### Audio Encoder Settings ```csharp public IAudioEncoder Audio { get; set; } ``` Manages audio encoding configuration. Like the video encoder, the audio encoder must be compatible with the sink settings. This property enables control over audio quality, compression, and format settings. ### Sink Settings ```csharp public IMediaBlockSettings Sink { get; set; } ``` Defines the output destination configuration. In this context, it contains AWS S3-specific settings for the media output stream. ### Custom Processing Blocks ```csharp public MediaBlock CustomVideoProcessor { get; set; } ``` ```csharp public MediaBlock CustomAudioProcessor { get; set; } ``` Allow for additional processing of video and audio streams before they are encoded and uploaded to S3. These blocks can be used for implementing custom filters, transformations, or analysis of the media content. 
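As a concrete illustration of the custom processor properties, a text overlay block can watermark the video before it is encoded and uploaded. This is a sketch only: it assumes an `AWSS3Output` instance named `s3Output` (created as in the usage example below), and it reuses the `TextOverlayBlock` and `VolumeBlock` patterns that appear in the SDK's streaming samples; verify the exact type and member names in your SDK version.

```csharp
// Sketch (assumptions noted above): watermark the video and slightly
// boost the audio before the encoded streams are uploaded to S3.
var watermark = new TextOverlayBlock(new TextOverlaySettings("© My Company"));
s3Output.CustomVideoProcessor = watermark;

var gain = new VolumeBlock();
gain.Level = 1.1; // +10% volume
s3Output.CustomAudioProcessor = gain;
```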
### AWS S3 Configuration ```csharp public AWSS3SinkSettings Settings { get; set; } ``` Contains all AWS S3-specific configuration options, including: - Access credentials (Access Key, Secret Access Key) - Bucket and object key information - Region configuration - Upload behavior settings - Error handling preferences ## Constructor ```csharp public AWSS3Output(AWSS3SinkSettings settings, IVideoEncoder videoEnc, IAudioEncoder audioEnc, IMediaBlockSettings sink) ``` Creates a new instance of the `AWSS3Output` class with the specified configuration: - `settings`: AWS S3-specific configuration - `videoEnc`: Video encoder settings - `audioEnc`: Audio encoder settings - `sink`: Media sink configuration ## Methods ### File Management ```csharp public string GetFilename() ``` ```csharp public void SetFilename(string filename) ``` These methods manage the URI of the S3 object: - `GetFilename()`: Returns the current S3 URI - `SetFilename(string filename)`: Sets the S3 URI for the output ### Encoder Support All encoders are supported. Be sure that encoder settings are compatible with the sink settings. ## Usage Example ```csharp // Create AWS S3 sink settings var s3Settings = new AWSS3SinkSettings { AccessKey = "your-access-key", SecretAccessKey = "your-secret-key", Bucket = "your-bucket-name", Key = "output-file-key", Region = "us-west-1" }; // Configure encoders IVideoEncoder videoEncoder = /* your video encoder configuration */; IAudioEncoder audioEncoder = /* your audio encoder configuration */; IMediaBlockSettings sinkSettings = /* your sink settings */; // Create the AWS S3 output var s3Output = new AWSS3Output(s3Settings, videoEncoder, audioEncoder, sinkSettings); // Optional: Configure custom processors s3Output.CustomVideoProcessor = /* your custom video processor */; s3Output.CustomAudioProcessor = /* your custom audio processor */; ``` ## Best Practices 1. Always ensure your AWS credentials are properly secured and not hard-coded in the application. 2. 
Configure appropriate retry attempts and request timeouts based on your network conditions and file sizes. 3. Select compatible video and audio encoders for your target use case. 4. Consider implementing custom processors for specific requirements like watermarking or audio normalization. ## Error Handling The class works in conjunction with the `S3SinkOnError` enumeration defined in `AWSS3SinkSettings`, which provides three error handling strategies: - Abort: Stops the upload process on error - Complete: Attempts to complete the upload despite errors - DoNothing: Ignores errors during upload ## Related Components - AWSS3SinkSettings: Contains detailed configuration for AWS S3 connectivity - IVideoEncoder: Interface for video encoding configuration - IAudioEncoder: Interface for audio encoding configuration - IMediaBlockSettings: Interface for media output configuration ---END OF PAGE--- # Local File: .\dotnet\general\network-streaming\facebook.md --- title: Facebook Live Integration for .NET Development description: Learn how to implement Facebook Live streaming in .NET applications with hardware-accelerated video encoding, real-time broadcasting, and platform-specific optimizations. Master RTMP streaming with code examples and best practices. sidebar_label: Facebook Live --- # Facebook Live Streaming with VisioForge SDKs [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Introduction to Facebook Live Streaming Facebook Live provides a powerful platform for real-time video broadcasting to global audiences. 
Whether you're developing applications for live events, video conferencing, gaming streams, or social media integration, VisioForge SDKs offer robust solutions for implementing Facebook Live streaming in your .NET applications. This comprehensive guide explains how to implement Facebook Live streaming using VisioForge's suite of SDKs, with detailed code examples and configuration options for different platforms and hardware configurations. ## Core Components for Facebook Live Integration [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] The cornerstone of Facebook Live integration in VisioForge is the `FacebookLiveOutput` class, which provides a complete implementation of the RTMP protocol required for Facebook streaming. This class implements multiple interfaces to ensure compatibility across various SDK components: - `IVideoEditXBaseOutput` - For Video Edit SDK integration - `IVideoCaptureXBaseOutput` - For Video Capture SDK integration - `IOutputVideoProcessor` - For video stream processing - `IOutputAudioProcessor` - For audio stream processing This multi-interface implementation ensures seamless operation across the entire VisioForge ecosystem, allowing developers to maintain consistent code while working with different SDK components. ## Setting Up Facebook Live Streaming ### Prerequisites Before implementing Facebook Live streaming in your application, you'll need: 1. A Facebook account with permissions to create Live streams 2. A valid Facebook streaming key (obtained from Facebook Live Producer) 3. VisioForge SDK installed in your .NET project 4. 
Sufficient bandwidth for the chosen quality settings ### Basic Implementation The most basic implementation of Facebook Live streaming requires just a few lines of code: ```csharp // Create Facebook Live output with your streaming key var facebookOutput = new FacebookLiveOutput("your_facebook_streaming_key_here"); // Add to your VideoCaptureCoreX instance captureCore.Outputs_Add(facebookOutput, true); // Or set as output format for VideoEditCoreX editCore.Output_Format = facebookOutput; ``` This minimal setup uses the default encoders, which VisioForge selects based on your platform for optimal performance. For most applications, these defaults provide excellent results with minimal configuration overhead. ## Optimizing Video Encoding for Facebook Live ### Supported Video Encoders Facebook Live requires H.264 or HEVC encoded video. VisioForge supports multiple encoder implementations to leverage different hardware capabilities: #### H.264 Encoders | Encoder | Platform Support | Hardware Acceleration | Performance Characteristics | |---------|------------------|------------------------|----------------------------| | OpenH264 | Cross-platform | Software-based | CPU-intensive, universal compatibility | | NVENC H264 | Windows, Linux | NVIDIA GPU | High performance, low CPU usage | | QSV H264 | Windows, Linux | Intel GPU | Efficient on Intel systems | | AMF H264 | Windows | AMD GPU | Optimized for AMD hardware | #### HEVC Encoders | Encoder | Platform Support | Hardware Acceleration | |---------|------------------|------------------------| | MF HEVC | Windows only | DirectX Video Acceleration | | NVENC HEVC | Windows, Linux | NVIDIA GPU | | QSV HEVC | Windows, Linux | Intel GPU | | AMF H265 | Windows | AMD GPU | ### Selecting the Optimal Video Encoder VisioForge provides utility methods to check hardware encoder availability before attempting to use them: ```csharp // Video encoder selection with fallback options IVideoEncoderSettings GetOptimalVideoEncoder() { // Try 
NVIDIA GPU acceleration first if (NVENCH264EncoderSettings.IsAvailable()) { return new NVENCH264EncoderSettings(); } // Fall back to Intel Quick Sync if available if (QSVH264EncoderSettings.IsAvailable()) { return new QSVH264EncoderSettings(); } // Fall back to AMD acceleration if (AMFH264EncoderSettings.IsAvailable()) { return new AMFH264EncoderSettings(); } // Finally fall back to software encoding return new OpenH264EncoderSettings(); } // Apply the optimal encoder to Facebook output facebookOutput.Video = GetOptimalVideoEncoder(); ``` This cascading approach ensures your application uses the best available encoder on the user's system, maximizing performance while maintaining compatibility. ## Audio Encoding Configuration Audio quality significantly impacts the viewer experience. VisioForge supports multiple AAC encoder implementations to ensure optimal audio for Facebook streams: ### Supported Audio Encoders 1. **VO-AAC** - VisioForge's optimized AAC encoder (default for non-Windows platforms) 2. **AVENC AAC** - FFmpeg-based AAC encoder with wide platform support 3. 
**MF AAC** - Microsoft Media Foundation AAC encoder (Windows-only, hardware-accelerated) ```csharp // Platform-specific audio encoder selection IAudioEncoderSettings GetOptimalAudioEncoder() { IAudioEncoderSettings audioEncoder; #if NET_WINDOWS // Use Media Foundation on Windows audioEncoder = new MFAACEncoderSettings(); // Configure for stereo, 44.1kHz sample rate ((MFAACEncoderSettings)audioEncoder).Channels = 2; ((MFAACEncoderSettings)audioEncoder).SampleRate = 44100; #else // Use VisioForge optimized AAC on other platforms audioEncoder = new VOAACEncoderSettings(); // Configure for stereo, 44.1kHz sample rate ((VOAACEncoderSettings)audioEncoder).Channels = 2; ((VOAACEncoderSettings)audioEncoder).SampleRate = 44100; #endif return audioEncoder; } // Apply the optimal audio encoder facebookOutput.Audio = GetOptimalAudioEncoder(); ``` ## Advanced Facebook Live Features ### Custom Media Processing Pipeline For applications requiring advanced video or audio processing before streaming, VisioForge supports insertion of custom processors: ```csharp // Add text overlay to video stream var textOverlay = new TextOverlayBlock(new TextOverlaySettings("Live from VisioForge SDK")); // Add the video processor to Facebook output facebookOutput.CustomVideoProcessor = textOverlay; // Add audio volume boost var volume = new VolumeBlock(); volume.Level = 1.2; // Boost 20% volume // Add the audio processor to Facebook output facebookOutput.CustomAudioProcessor = volume; ``` ### Platform-Specific Optimizations VisioForge automatically applies platform-specific optimizations: - **Windows**: Leverages Media Foundation for AAC audio and DirectX Video Acceleration - **macOS**: Uses Apple Media frameworks for hardware-accelerated encoding - **Linux**: Employs VAAPI and other platform-specific acceleration when available These optimizations ensure your application achieves maximum performance regardless of the deployment platform. 
## Complete Implementation Example Here's a comprehensive example showing how to set up a complete Facebook Live streaming pipeline with error handling and optimal encoder selection: ```csharp public FacebookLiveOutput ConfigureFacebookLiveStream(string streamKey, int videoBitrate = 4000000) { // Create the Facebook output with the provided stream key var facebookOutput = new FacebookLiveOutput(streamKey); try { // Configure optimal video encoder with fallback strategy if (NVENCH264EncoderSettings.IsAvailable()) { var nvencSettings = new NVENCH264EncoderSettings(); nvencSettings.BitRate = videoBitrate; facebookOutput.Video = nvencSettings; } else if (QSVH264EncoderSettings.IsAvailable()) { var qsvSettings = new QSVH264EncoderSettings(); qsvSettings.BitRate = videoBitrate; facebookOutput.Video = qsvSettings; } else { // Software fallback var openH264 = new OpenH264EncoderSettings(); openH264.BitRate = videoBitrate; facebookOutput.Video = openH264; } // Configure platform-optimal audio encoder #if NET_WINDOWS facebookOutput.Audio = new MFAACEncoderSettings(); #else facebookOutput.Audio = new VOAACEncoderSettings(); #endif // Set additional stream parameters facebookOutput.Sink.Key = streamKey; return facebookOutput; } catch (Exception ex) { Console.WriteLine($"Error configuring Facebook Live output: {ex.Message}"); throw; } } // Usage with VideoCaptureCoreX var captureCore = new VideoCaptureCoreX(); var facebookOutput = ConfigureFacebookLiveStream("your_streaming_key_here"); captureCore.Outputs_Add(facebookOutput, true); await captureCore.StartAsync(); // Usage with VideoEditCoreX var editCore = new VideoEditCoreX(); // Add sources // ... 
// Set output format editCore.Output_Format = ConfigureFacebookLiveStream("your_streaming_key_here"); // Start await editCore.StartAsync(); ``` ## Media Blocks SDK Integration For developers requiring even more granular control, the Media Blocks SDK provides a modular approach to Facebook Live streaming: ```csharp // Create a pipeline var pipeline = new MediaBlocksPipeline(); // Add video source (camera, screen capture, etc.) var videoSource = new SomeVideoSourceBlock(); // Add audio source (microphone, system audio, etc.) var audioSource = new SomeAudioSourceBlock(); // Add video encoder (H.264) var h264Encoder = new H264EncoderBlock(videoEncoderSettings); // Add audio encoder (AAC) var aacEncoder = new AACEncoderBlock(audioEncoderSettings); // Create Facebook Live sink var facebookSink = new FacebookLiveSinkBlock( new FacebookLiveSinkSettings("your_streaming_key_here") ); // Connect blocks pipeline.Connect(videoSource.Output, h264Encoder.Input); pipeline.Connect(audioSource.Output, aacEncoder.Input); pipeline.Connect(h264Encoder.Output, facebookSink.CreateNewInput(MediaBlockPadMediaType.Video)); pipeline.Connect(aacEncoder.Output, facebookSink.CreateNewInput(MediaBlockPadMediaType.Audio)); // Start the pipeline await pipeline.StartAsync(); ``` ## Troubleshooting and Best Practices ### Common Issues and Solutions 1. **Stream Connection Failures** - Verify Facebook stream key validity and expiration status - Check network connectivity and firewall settings - Facebook Live ingestion uses RTMPS, so outbound TCP port 443 must be open 2. **Encoder Initialization Problems** - Always check hardware encoder availability before attempting to use them - Ensure GPU drivers are up-to-date for hardware acceleration - Fall back to software encoders when hardware acceleration is unavailable 3.
**Performance Optimization** - Monitor CPU and GPU usage during streaming - Adjust video resolution and bitrate based on available bandwidth - Consider separate threads for video capture and encoding operations ### Quality and Security Best Practices 1. **Stream Key Security** - Never hardcode stream keys in your application - Store keys securely and consider runtime key retrieval from a secure API - Implement key rotation mechanisms for enhanced security 2. **Quality Settings Recommendations** - For HD streaming (1080p): 4-6 Mbps video bitrate, 128-192 Kbps audio - For SD streaming (720p): 2-4 Mbps video bitrate, 128 Kbps audio - Mobile-optimized: 1-2 Mbps video bitrate, 64-96 Kbps audio 3. **Resource Management** - Implement proper disposal of SDK resources - Monitor memory usage for long-running streams - Implement graceful error recovery mechanisms By implementing these best practices, your application will deliver reliable, high-quality Facebook Live streaming across a wide range of devices and network conditions. ---END OF PAGE--- # Local File: .\dotnet\general\network-streaming\hls-streaming.md --- title: Implementing HLS Network Streaming in .NET description: Learn how to build HTTP Live Streaming (HLS) applications in .NET. Step-by-step guide covering adaptive bitrate streaming, video encoding, server setup, and cross-platform playback integration for modern streaming solutions. 
sidebar_label: HLS Network Streaming --- # Complete Guide to HLS Network Streaming Implementation in .NET [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) ## What is HTTP Live Streaming (HLS)? HTTP Live Streaming (HLS) is an adaptive bitrate streaming communications protocol designed and developed by Apple Inc. First introduced in 2009, it has since become one of the most widely adopted streaming protocols across various platforms and devices. HLS works by breaking the overall stream into a sequence of small HTTP-based file downloads, each containing a short segment of the overall content. ### Key Features of HLS Streaming - **Adaptive Bitrate Streaming**: HLS automatically adjusts video quality based on the viewer's network conditions, ensuring optimal playback quality without buffering. - **Cross-Platform Compatibility**: Works across iOS, macOS, Android, Windows, and most modern web browsers. - **HTTP-Based Delivery**: Leverages standard web server infrastructure, allowing content to pass through firewalls and proxy servers. - **Media Encryption and Authentication**: Supports content protection through encryption and various authentication methods. - **Live and On-Demand Content**: Can be used for both live broadcasting and pre-recorded media. ### HLS Technical Structure HLS content delivery relies on three key components: 1. **Manifest File (.m3u8)**: A playlist file that contains metadata about the various streams available 2. 
**Segment Files (.ts)**: The actual media content, divided into small chunks (typically 2-10 seconds each) 3. **HTTP Server**: Responsible for delivering both manifest and segment files Since HLS is entirely HTTP-based, you'll need either a dedicated HTTP server or can use the lightweight internal server provided by our SDKs. ## Implementing HLS Streaming with Media Blocks SDK The Media Blocks SDK offers a comprehensive approach to HLS streaming through its pipeline architecture, giving developers granular control over each aspect of the streaming process. ### Creating a Basic HLS Stream The following example demonstrates how to set up an HLS stream using Media Blocks SDK: ```csharp // Set URL const string URL = "http://localhost:8088/"; // Create H264 encoder var h264Settings = new OpenH264EncoderSettings(); var h264Encoder = new H264EncoderBlock(h264Settings); // Create AAC encoder var aacEncoder = new AACEncoderBlock(); // Create HLS sink var settings = new HLSSinkSettings { Location = Path.Combine(AppContext.BaseDirectory, "segment_%05d.ts"), MaxFiles = 10, PlaylistLength = 5, PlaylistLocation = Path.Combine(AppContext.BaseDirectory, "playlist.m3u8"), PlaylistRoot = URL, SendKeyframeRequests = true, TargetDuration = 5, Custom_HTTP_Server_Enabled = true, // Use internal HTTP server Custom_HTTP_Server_Port = 8088 // Port for internal HTTP server }; var hlsSink = new HLSSinkBlock(settings); // Connect video and audio sources to encoders (we assume that videoSource and audioSource are already created) pipeline.Connect(videoSource.Output, h264Encoder.Input); pipeline.Connect(audioSource.Output, aacEncoder.Input); // Connect encoders to HLS sink pipeline.Connect(h264Encoder.Output, hlsSink.CreateNewInput(MediaBlockPadMediaType.Video)); pipeline.Connect(aacEncoder.Output, hlsSink.CreateNewInput(MediaBlockPadMediaType.Audio)); // Start await pipeline.StartAsync(); ``` ### Advanced Configuration Options The Media Blocks SDK offers several advanced configuration options 
for HLS streaming: - **Multiple Bitrate Variants**: Create different quality levels for adaptive streaming - **Custom Segment Duration**: Optimize for different types of content and viewing environments - **Server-Side Options**: Configure cache control headers and other server behaviors - **Security Features**: Implement token-based authentication or encryption You can use this SDK to stream both live video capture and existing media files to HLS. The flexible pipeline architecture allows for extensive customization of the media processing workflow. ## HLS Streaming with Video Capture SDK .NET Video Capture SDK .NET provides a streamlined approach to HLS streaming specifically designed for live video sources like webcams, capture cards, and other input devices. ### VideoCaptureCoreX Implementation The VideoCaptureCoreX engine offers a modern, object-oriented approach to video capture and streaming: ```csharp // Create HLS sink settings var settings = new HLSSinkSettings { Location = Path.Combine(AppContext.BaseDirectory, "segment_%05d.ts"), MaxFiles = 10, PlaylistLength = 5, PlaylistLocation = Path.Combine(AppContext.BaseDirectory, "playlist.m3u8"), PlaylistRoot = edStreamingKey.Text, SendKeyframeRequests = true, TargetDuration = 5, Custom_HTTP_Server_Enabled = true, Custom_HTTP_Server_Port = new Uri(edStreamingKey.Text).Port }; // Create HLS output var hlsOutput = new HLSOutput(settings); // Create video and audio encoders with default settings hlsOutput.Video = new OpenH264EncoderSettings(); hlsOutput.Audio = new VOAACEncoderSettings(); // Add HLS output to video capture object videoCapture.Outputs_Add(hlsOutput, true); ``` ### VideoCaptureCore Implementation For those working with the traditional VideoCaptureCore engine, the implementation is slightly different but equally straightforward: ```csharp VideoCapture1.Network_Streaming_Enabled = true; VideoCapture1.Network_Streaming_Audio_Enabled = true; VideoCapture1.Network_Streaming_Format = 
NetworkStreamingFormat.HLS; var hls = new HLSOutput { HLS = { SegmentDuration = 10, // Segment duration in seconds NumSegments = 5, // Number of segments in playlist OutputFolder = "c:\\hls\\", // Output folder PlaylistType = HLSPlaylistType.Live, // Playlist type Custom_HTTP_Server_Enabled = true, // Use internal HTTP server Custom_HTTP_Server_Port = 8088 // Port for internal HTTP server } }; VideoCapture1.Network_Streaming_Output = hls; ``` ### Performance Considerations When streaming with Video Capture SDK, consider these performance optimization techniques: - Keep segment durations between 2-10 seconds (shorter segments reduce startup latency; longer segments tolerate network jitter better) - Adjust the number of segments based on expected viewing patterns - Use hardware acceleration when available for encoding - Configure appropriate bitrates based on your target audience's connection speeds ## Converting Media Files to HLS with Video Edit SDK .NET The Video Edit SDK .NET enables developers to convert existing media files into HLS format for streaming, ideal for video-on-demand applications.
### VideoEditCore Implementation

```csharp
VideoEdit1.Network_Streaming_Enabled = true;
VideoEdit1.Network_Streaming_Audio_Enabled = true;
VideoEdit1.Network_Streaming_Format = NetworkStreamingFormat.HLS;

var hls = new HLSOutput
{
    HLS =
    {
        SegmentDuration = 10,                // Segment duration in seconds
        NumSegments = 5,                     // Number of segments in playlist
        OutputFolder = "c:\\hls\\",          // Output folder
        PlaylistType = HLSPlaylistType.Live, // Playlist type
        Custom_HTTP_Server_Enabled = true,   // Use internal HTTP server
        Custom_HTTP_Server_Port = 8088       // Port for internal HTTP server
    }
};

VideoEdit1.Network_Streaming_Output = hls;
```

### File Format Considerations

When converting files to HLS, consider these factors:

- Not all input formats are equally efficient for conversion
- MP4, MOV, and MKV files typically provide the best results
- Highly compressed formats may require more processing power
- Consider pre-transcoding very large files to an intermediate format

## Playback and Integration

### HTML5 Player Integration

All applications implementing HLS streaming should include an HTML file with a video player. Modern HTML5 players like HLS.js, Video.js, or JW Player provide excellent support for HLS streams. Here's a basic example using HLS.js:

```html
<!DOCTYPE html>
<html>
<head>
    <title>HLS Player</title>
    <script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
</head>
<body>
    <video id="video" controls></video>
    <script>
        var video = document.getElementById('video');
        // URL of the playlist served by the SDK's internal HTTP server
        var src = 'http://localhost:8088/playlist.m3u8';
        if (Hls.isSupported()) {
            var hls = new Hls();
            hls.loadSource(src);
            hls.attachMedia(video);
        } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
            // Safari and iOS play HLS natively
            video.src = src;
        }
    </script>
</body>
</html>
```

For a complete example player, refer to our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples/blob/master/Media%20Blocks%20SDK/Console/HLS%20Streamer/index.htm).

### Mobile App Integration

Our SDKs also support integration with mobile applications through:

- Native iOS playback using AVPlayer
- Android playback via ExoPlayer
- Cross-platform options like Xamarin or MAUI

## Troubleshooting Common Issues

### CORS Configuration

When serving HLS content to web browsers, you may encounter Cross-Origin Resource Sharing (CORS) issues.
Ensure your server is configured to send the proper CORS headers: ``` Access-Control-Allow-Origin: * Access-Control-Allow-Methods: GET, HEAD, OPTIONS Access-Control-Allow-Headers: Range Access-Control-Expose-Headers: Accept-Ranges, Content-Encoding, Content-Length, Content-Range ``` ### Latency Optimization HLS inherently introduces some latency. To minimize this: - Use shorter segment durations (2-4 seconds) for lower latency - Consider enabling Low-Latency HLS (LL-HLS) if supported - Optimize your network infrastructure for minimal delays ## Conclusion HLS streaming provides a robust, cross-platform solution for delivering both live and on-demand video content to a wide range of devices. With VisioForge's .NET SDKs, implementing HLS in your applications becomes straightforward, allowing you to focus on creating compelling content rather than wrestling with technical details. For more code samples and advanced implementations, visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples). --- ## Additional Resources - [HLS Specification](https://developer.apple.com/streaming/) ---END OF PAGE--- # Local File: .\dotnet\general\network-streaming\http-mjpeg.md --- title: HTTP MJPEG Video Streaming Implementation Guide description: Learn how to implement HTTP MJPEG video streaming in .NET applications. Step-by-step guide for setting up real-time video feeds, handling client connections, and managing stream delivery with code examples and best practices. 
sidebar_label: HTTP MJPEG --- # HTTP MJPEG streaming [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) The SDK's feature of streaming video encoded as Motion JPEG (MJPEG) over HTTP is advantageous for its simplicity and broad compatibility. MJPEG encodes each video frame individually as a JPEG image, which simplifies decoding and is ideal for applications like web streaming and surveillance. The use of HTTP ensures easy integration and high compatibility across different platforms and devices and is effective even in networks with strict configurations. This method is particularly suitable for real-time video feeds and applications requiring straightforward frame-by-frame analysis. With adjustable frame rates and resolutions, the SDK offers flexibility for various network conditions and quality requirements, making it a versatile choice for developers implementing video streaming in their applications. ## Cross-platform MJPEG output [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] The streaming functionality is implemented through two main classes: 1. `HTTPMJPEGLiveOutput`: The high-level configuration class that sets up the streaming output 2. `HTTPMJPEGLiveSinkBlock`: The underlying implementation block that handles the actual streaming process ### HTTPMJPEGLiveOutput Class This class serves as the configuration entry point for setting up an MJPEG HTTP stream. It implements the `IVideoCaptureXBaseOutput` interface, making it compatible with the video capture pipeline system. 
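As context for the class-by-class details that follow, here is a minimal end-to-end sketch of the two pieces working together in a Media Blocks pipeline. The source-creation calls (`DeviceEnumerator`, `SystemVideoSourceBlock`, `VideoCaptureDeviceSourceSettings`) are assumptions modeled on the SDK samples; verify the exact names and signatures against the API reference.

```csharp
// Hypothetical minimal pipeline: webcam -> MJPEG over HTTP on port 8080.
// Device enumeration details are abbreviated here.
var pipeline = new MediaBlocksPipeline();

var devices = await DeviceEnumerator.Shared.VideoSourcesAsync();
var sourceSettings = new VideoCaptureDeviceSourceSettings(devices[0]);
var videoSource = new SystemVideoSourceBlock(sourceSettings);

var mjpegSink = new HTTPMJPEGLiveSinkBlock(8080);
pipeline.Connect(videoSource.Output, mjpegSink.Input);

await pipeline.StartAsync();
// The stream should then be reachable at http://127.0.0.1:8080/
```

Once running, any browser or HTTP client pointed at the port receives the MJPEG stream; the sections below describe each class involved.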
#### Key Properties

- `Port`: Gets the network port number on which the MJPEG stream will be served

#### Usage

```csharp
// Create a new MJPEG streaming output on port 8080
var mjpegOutput = new HTTPMJPEGLiveOutput(8080);

// Add the MJPEG output to the VideoCaptureCoreX engine
core.Outputs_Add(mjpegOutput, true);
```

#### Implementation Details

- The class is designed to be immutable, with the port being set only through the constructor
- It does not support video or audio encoders, as MJPEG uses direct JPEG encoding
- The filename-related methods return null or are no-ops, as this is a streaming-only implementation

### HTTPMJPEGLiveSinkBlock Class

This class handles the actual implementation of the MJPEG streaming functionality. It's responsible for:

- Setting up the pipeline for video processing
- Managing the HTTP server for streaming
- Handling input video data and converting it to MJPEG format
- Managing client connections and stream delivery

#### Key Features

- Implements multiple interfaces for integration with the media pipeline:
  - `IMediaBlockInternals`: For pipeline integration
  - `IMediaBlockDynamicInputs`: For handling dynamic input connections
  - `IMediaBlockSink`: For sink functionality
  - `IDisposable`: For proper resource cleanup

#### Input/Output Configuration

- Accepts a single video input through the `Input` pad
- No output pads (as it's a sink block)
- Input pad configured for video media type only

### Implementation Notes

#### Initialization

```csharp
// The block must be initialized with a port number
var mjpegSink = new HTTPMJPEGLiveSinkBlock(8080);
pipeline.Connect(videoSource.Output, mjpegSink.Input);

// The stream is then available at http://127.0.0.1:8080
```

#### Resource Management

- The class implements proper resource cleanup through the `IDisposable` pattern
- The `CleanUp` method ensures all resources are properly released
- Event handlers are properly connected and disconnected during the pipeline lifecycle

#### Pipeline Integration

The `Build`
method handles the critical setup process:

1. Creates the underlying HTTP MJPEG sink element
2. Initializes the sink with the specified port
3. Sets up the necessary GStreamer pad connections
4. Connects pipeline event handlers

### Error Handling

- The implementation includes comprehensive error checking during the build process
- Failed initialization is properly reported through the context error system
- Resource cleanup is handled even in error cases

### Technical Considerations

#### Performance

- The implementation uses GStreamer's native elements for optimal performance
- Direct pad connections minimize copying and overhead
- The sink block is designed to handle multiple client connections efficiently

#### Memory Management

- Proper disposal patterns ensure no memory leaks
- Resources are cleaned up when the pipeline stops or the block is disposed
- The implementation handles GStreamer element lifecycle correctly

#### Threading

- The implementation is thread-safe for pipeline operations
- Event handlers are properly synchronized with pipeline state changes
- Client connections are handled asynchronously

#### Client Usage

To consume the MJPEG stream:

1. Initialize the streaming output with desired port
2. Connect it to your video pipeline
3. Access the stream through a web browser or HTTP client at:

```
http://[server-address]:[port]
```

#### Example Client HTML

```html
<!-- Minimal page that embeds the MJPEG stream in an IMG tag.
     Replace the host and port with your server's address. -->
<html>
<body>
    <img src="http://127.0.0.1:8080/" alt="Live MJPEG stream" />
</body>
</html>
```

### Limitations and Considerations

1. Bandwidth Usage
   - MJPEG streams can use significant bandwidth as each frame is a complete JPEG
   - Consider frame rate and resolution settings for optimal performance
2. Browser Support
   - While MJPEG is widely supported, some modern browsers may have limitations
   - Mobile devices may handle MJPEG streams differently
3. Latency
   - While MJPEG provides relatively low latency, it's not suitable for ultra-low-latency requirements
   - Network conditions can affect frame delivery timing

### Best Practices 1.
Port Selection - Choose ports that don't conflict with other services - Consider firewall implications when selecting ports 2. Resource Management - Always dispose of the sink block properly - Monitor client connections and resource usage 3. Error Handling - Implement proper error handling for network and pipeline issues - Monitor the pipeline status for potential issues ### Security Considerations 1. Network Security - The MJPEG stream is unencrypted by default - Consider implementing additional security measures for sensitive content 2. Access Control - No built-in authentication mechanism - Consider implementing application-level access control if needed 3. Port Security - Ensure proper firewall rules are in place - Consider network isolation for internal streams ## Windows-only MJPEG output [!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="VideoEditCore"] Set the `Network_Streaming_Enabled` property to true to enable network streaming. ```cs VideoCapture1.Network_Streaming_Enabled = true; ``` Set the HTTP MJPEG output. ```cs VideoCapture1.Network_Streaming_Format = NetworkStreamingFormat.HTTP_MJPEG; ``` Create the settings object and set the port. ```cs VideoCapture1.Network_Streaming_Output = new MJPEGOutput(8080); ``` --- Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples. ---END OF PAGE--- # Local File: .\dotnet\general\network-streaming\iis-smooth-streaming.md --- title: Guide to IIS Smooth Streaming Implementation description: Complete tutorial for implementing Microsoft IIS Smooth Streaming in .NET applications with VisioForge SDKs. Learn step-by-step configuration, adaptive bitrate streaming setup, mobile compatibility, and troubleshooting for high-quality video delivery across all devices. 
sidebar_label: IIS Smooth Streaming --- # Comprehensive Guide to IIS Smooth Streaming Implementation IIS Smooth Streaming is Microsoft's implementation of adaptive streaming technology that dynamically adjusts video quality based on network conditions and CPU capabilities. This guide provides detailed instructions on configuring and implementing IIS Smooth Streaming using VisioForge SDKs. ## Compatible VisioForge SDKs [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="VideoEditCore"] ## Overview of IIS Smooth Streaming IIS Smooth Streaming provides several key advantages for developers and end-users: - **Adaptive bitrate streaming**: Automatically adjusts video quality based on available bandwidth - **Reduced buffering**: Minimizes playback interruptions during network fluctuations - **Broad device compatibility**: Works across desktops, mobile devices, smart TVs, and more - **Scalable delivery**: Handles large numbers of concurrent viewers efficiently This technology is particularly valuable for applications requiring high-quality video delivery across varied network conditions, such as live events, educational platforms, and media-rich applications. ## Prerequisites Before implementing IIS Smooth Streaming with VisioForge SDKs, ensure you have: 1. Windows Server with IIS installed 2. Administrative access to the server 3. Relevant VisioForge SDK (Video Capture SDK .Net or Video Edit SDK .Net) 4. Basic understanding of .NET development ## Step-by-Step IIS Configuration ### Installing Required Components 1. Install [Web Platform Installer](https://www.microsoft.com/web/downloads/platform.aspx) on your server. 2. 
Through the Web Platform Installer, search for and install IIS Media Services. ![IIS Media Services installation](https://www.visioforge.com/wp-content/uploads/2021/02/iis1.jpg) This component package includes all necessary modules for Smooth Streaming functionality, including the Live Smooth Streaming Publishing service. ### Configuring IIS Manager 1. Open IIS Manager on your server through the Start menu or by running `inetmgr` in the Run dialog. ![Opening IIS Manager](https://www.visioforge.com/wp-content/uploads/2021/02/iis2.jpg) 2. In the left navigation pane, locate and expand your server name, then select the site where you want to enable Smooth Streaming. ### Creating a Publishing Point 1. Within the selected site, find and open the "Live Smooth Streaming Publishing Points" feature. 2. Click "Add" to create a new publishing point. ![Adding a publishing point](https://www.visioforge.com/wp-content/uploads/2021/02/iis3.jpg) 3. Configure the basic settings for your publishing point: - **Name**: Provide a descriptive name for your publishing point (e.g., "MainStream") - **Path**: Specify the file path where the Smooth Streaming content will be stored ![Configuring publishing point name](https://www.visioforge.com/wp-content/uploads/2021/02/iis4.jpg) 4. Configure additional parameters by enabling the "Allow clients to connect to this publishing point" checkbox. This ensures that clients can connect and receive the streamed content. ![Additional publishing point settings](https://www.visioforge.com/wp-content/uploads/2021/02/iis5.jpg) ### Enabling Mobile Device Support To ensure your Smooth Streaming content is accessible on mobile devices: 1. In the publishing point configuration, navigate to the "Mobile Devices" tab. 2. Enable the checkbox for "Allow playback on mobile devices." 
![Mobile device configuration](https://www.visioforge.com/wp-content/uploads/2021/02/iis6.jpg)

This setting generates the necessary formats and manifests for mobile playback, significantly expanding your content's reach.

### Setting Up the Player

To provide viewers with a way to watch your Smooth Streaming content:

1. Download the Smooth Streaming Player Silverlight control provided by Microsoft.
2. Extract the downloaded files and locate the `.xap` file.
3. Copy this `.xap` file to your website's directory.
4. Copy the included HTML file to the same directory and rename it to `index.html`.
5. Open `index.html` in a text editor and replace the "initparams" section with the following configuration:

```html
<!-- Illustrative values only - the exact parameter list depends on the player
     version. mediaurl must point to your publishing point's manifest. -->
<param name="initparams"
       value="mediaurl=http://localhost/mainstream.isml/Manifest,autoplay=true" />
```

This configuration initializes the Silverlight player with optimal settings for Smooth Streaming playback. The `mediaurl` parameter should point to your publishing point's manifest.

### Starting the Publishing Point

1. Return to IIS Manager and select your configured publishing point.
2. Click the "Start" action in the right-hand panel.

The publishing point will now be active and ready to receive content from your application.

## Implementing Smooth Streaming in VisioForge SDK Applications

### Basic Configuration

To implement IIS Smooth Streaming in your VisioForge SDK application:

1. Open your application built with Video Capture SDK .Net or Video Edit SDK .Net.
2. Navigate to the network streaming settings section.
3. Enable network streaming functionality.
4. Select "Smooth Streaming" as the streaming method.
5. Enter the publishing point URL (e.g., `http://localhost/mainstream.isml`).
6. Configure additional streaming parameters as needed (bitrate, resolution, etc.).
7. Start the stream.

![Configuring Smooth Streaming in the SDK demo](https://www.visioforge.com/wp-content/uploads/2021/02/iis7.jpg)

### Verifying the Connection

Once your application is configured:

1. Check the connection status in your application.
You should see confirmation that the SDK has successfully connected to IIS.

![Successful IIS connection](https://www.visioforge.com/wp-content/uploads/2021/02/iis8.jpg)

2. Open a web browser and navigate to `http://localhost` (or your server address).
3. The Silverlight player should load and begin playing your stream.

![Stream playback in browser](https://www.visioforge.com/wp-content/uploads/2021/02/iis10.jpg)

### HTML5 Streaming for iOS Devices

For broader device compatibility, particularly iOS devices that don't support Silverlight, create an HTML5 player:

1. Create a new HTML file in your website's directory.
2. Include the following code in the file:

```html
<!DOCTYPE html>
<html>
<head>
    <title>Smooth Streaming HTML5 Player</title>
</head>
<body>
    <h1>HTML5 Smooth Streaming Player</h1>
    <!-- iOS and other HLS-capable clients play the HLS manifest that
         IIS Media Services generates when mobile device support is
         enabled. Replace "mainstream" with your publishing point name. -->
    <video controls width="640" height="360"
           src="http://localhost/mainstream.isml/manifest(format=m3u8-aapl)">
        Your browser does not support the HTML5 video element.
    </video>
</body>
</html>
``` This HTML5 player uses HLS (HTTP Live Streaming) format, which is automatically generated by IIS Media Services when you enable mobile device support. ## Required Redistributables To ensure your application functions correctly with IIS Smooth Streaming, include the following redistributables: - SDK redistributables for your specific VisioForge SDK - MP4 redistributables: - For x86 architectures: [VisioForge.DotNet.Core.Redist.MP4.x86](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.MP4.x86/) - For x64 architectures: [VisioForge.DotNet.Core.Redist.MP4.x64](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.MP4.x64/) You can add these packages through NuGet Package Manager in Visual Studio or via the command line: ``` Install-Package VisioForge.DotNet.Core.Redist.MP4.x64 ``` ## Advanced Configuration Options For production environments, consider these additional configurations: - **Multiple bitrate encoding**: Configure your VisioForge SDK to encode at multiple bitrates for optimal adaptive streaming - **Custom manifest settings**: Modify the Smooth Streaming manifest for specialized playback requirements - **Authentication**: Implement token-based authentication for secure streaming - **Content encryption**: Enable DRM protection for sensitive content - **Load balancing**: Configure multiple publishing points behind a load balancer for high-traffic scenarios ## Troubleshooting Common Issues - **Connection failures**: Verify firewall settings allow traffic on the streaming port (typically 80 or 443) - **Playback stuttering**: Check server resources and consider increasing buffer settings - **Mobile compatibility issues**: Ensure mobile format generation is enabled and test across multiple devices - **Quality issues**: Adjust encoding parameters and bitrate ladder configuration ## Conclusion IIS Smooth Streaming, when implemented with VisioForge SDKs, provides a robust solution for adaptive video delivery across diverse network conditions 
and devices. By following this comprehensive guide, you can configure, implement, and optimize Smooth Streaming for your .NET applications. For additional code samples and implementation examples, visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples). --- *This documentation is provided by VisioForge. For additional support or information about our SDKs, please visit [www.visioforge.com](https://www.visioforge.com).* ---END OF PAGE--- # Local File: .\dotnet\general\network-streaming\index.md --- title: Network Streaming Guide for .NET Development description: Learn how to implement RTMP, RTSP, HLS, and NDI streaming in .NET applications. Includes code examples for live broadcasting, hardware acceleration, and integration with major streaming platforms. sidebar_label: Network Streaming order: 16 --- # Comprehensive Network Streaming Guide [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Introduction to Network Streaming Network streaming enables real-time transmission of audio and video content across the internet or local networks. VisioForge's comprehensive SDKs provide powerful tools for implementing various streaming protocols in your .NET applications, allowing you to create professional-grade broadcasting solutions with minimal development effort. This guide covers all streaming options available in VisioForge SDKs, including implementation details, best practices, and code examples to help you select the most appropriate streaming technology for your specific requirements. 
## Streaming Protocol Overview VisioForge SDKs support a wide range of streaming protocols, each with unique advantages for different use cases: ### Real-Time Protocols - **[RTMP (Real-Time Messaging Protocol)](rtmp.md)**: Industry-standard protocol for low-latency live streaming, widely used for live broadcasting to CDNs and streaming platforms - **[RTSP (Real-Time Streaming Protocol)](rtsp.md)**: Ideal for IP camera integration and surveillance applications, offering precise control over media sessions - **[SRT (Secure Reliable Transport)](srt.md)**: Advanced protocol designed for high-quality, low-latency video delivery over unpredictable networks - **[NDI (Network Device Interface)](ndi.md)**: Professional-grade protocol for high-quality, low-latency video transmission over local networks ### HTTP-Based Streaming - **[HLS (HTTP Live Streaming)](hls-streaming.md)**: Apple-developed protocol that breaks streams into downloadable segments, offering excellent compatibility with browsers and mobile devices - **[HTTP MJPEG Streaming](http-mjpeg.md)**: Simple implementation for streaming motion JPEG over HTTP connections - **[IIS Smooth Streaming](iis-smooth-streaming.md)**: Microsoft's adaptive streaming technology for delivering media through IIS servers ### Platform-Specific Solutions - **[Windows Media Streaming (WMV)](wmv.md)**: Microsoft's native streaming format, ideal for Windows-centric deployments - **[Adobe Flash Media Server](adobe-flash.md)**: Legacy streaming solution for Flash-based applications ### Cloud & Social Media Integration - **[AWS S3](aws-s3.md)**: Direct streaming to Amazon Web Services S3 storage - **[YouTube Live](youtube.md)**: Simplified integration with YouTube's live streaming platform - **[Facebook Live](facebook.md)**: Direct broadcasting to Facebook's streaming service ## Key Components of Network Streaming ### Video Encoders VisioForge SDKs provide multiple encoding options to balance quality, performance and compatibility: #### 
Software Encoders - **OpenH264**: Cross-platform software-based H.264 encoder - **AVENC H264**: FFmpeg-based software encoder #### Hardware-Accelerated Encoders - **NVENC H264/HEVC**: NVIDIA GPU-accelerated encoding - **QSV H264/HEVC**: Intel Quick Sync Video acceleration - **AMF H264/HEVC**: AMD GPU-accelerated encoding - **Apple Media H264**: macOS-specific hardware acceleration ## Best Practices for Network Streaming ### Performance Optimization 1. **Hardware acceleration**: Leverage GPU-based encoding where available for reduced CPU usage 2. **Resolution and framerate**: Match output to content type (60fps for gaming, 30fps for general content) 3. **Bitrate allocation**: Allocate 80-90% of bandwidth to video and 10-20% to audio ### Network Reliability 1. **Connection testing**: Verify upload speed before streaming 2. **Error handling**: Implement reconnection logic for disrupted streams 3. **Monitoring**: Track streaming metrics in real-time to identify issues ### Quality Assurance 1. **Pre-streaming checks**: Validate encoder settings and output parameters 2. **Quality monitoring**: Regularly check stream quality during broadcast 3. **Platform compliance**: Follow platform-specific requirements (YouTube, Facebook, etc.) ## Troubleshooting Common Issues 1. **Encoding overload**: If experiencing frame drops, reduce resolution or bitrate 2. **Connection failures**: Verify network stability and server addresses 3. **Audio/video sync**: Ensure proper timestamp synchronization between streams 4. **Platform rejection**: Confirm compliance with platform-specific requirements 5. **Hardware acceleration failures**: Verify driver installation and compatibility ## Conclusion Network streaming with VisioForge SDKs provides a comprehensive solution for implementing professional-grade media broadcasting in your .NET applications. 
By understanding the available protocols and following best practices, you can create high-quality streaming experiences for your users across multiple platforms. For protocol-specific implementation details, refer to the dedicated guides linked throughout this document. ---END OF PAGE--- # Local File: .\dotnet\general\network-streaming\ndi.md --- title: NDI Network Video Streaming Integration Guide description: Learn how to implement high-performance NDI streaming in .NET applications. Step-by-step guide for developers to set up low-latency video/audio transmission over IP networks with code examples and best practices. sidebar_label: NDI --- # Network Device Interface (NDI) Streaming Integration [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## What is NDI and Why Use It? The VisioForge SDK's integration of Network Device Interface (NDI) technology provides a transformative solution for professional video production and broadcasting workflows. NDI has emerged as a leading industry standard for live production, enabling high-quality, ultra-low-latency video streaming over conventional Ethernet networks. NDI significantly simplifies the process of sharing and managing multiple video streams across diverse devices and platforms. When implemented within the VisioForge SDK, it facilitates seamless transmission of high-definition video and audio content from servers to clients with exceptional performance characteristics. 
This makes the technology particularly valuable for applications including: - Live broadcasting and streaming - Professional video conferencing - Multi-camera production setups - Remote production workflows - Educational and corporate presentation environments The inherent flexibility and efficiency of NDI streaming technology substantially reduces dependency on specialized hardware configurations, delivering a cost-effective alternative to traditional SDI-based systems for professional-grade live video production. ## Installation Requirements ### Prerequisites for NDI Implementation To successfully implement NDI streaming functionality within your application, you must install one of the following official NDI software packages: 1. **[NDI SDK](https://ndi.video/download-ndi-sdk/)** - Recommended for developers who need comprehensive access to NDI functionality 2. **[NDI Tools](https://ndi.video/tools/)** - Suitable for basic implementation and testing scenarios These packages provide the necessary runtime components that enable NDI communication across your network infrastructure. ## Cross-Platform NDI Output Implementation [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] ### Understanding the NDIOutput Class Architecture The `NDIOutput` class serves as the core implementation framework for NDI functionality within the VisioForge SDK ecosystem. This class encapsulates configuration properties and processing logic required for high-performance video-over-IP transmission using the NDI protocol. The architecture enables broadcast-quality video and audio transmission across standard network infrastructure without specialized hardware requirements. 
#### Class Definition and Interface Implementation ```csharp public class NDIOutput : IVideoEditXBaseOutput, IVideoCaptureXBaseOutput, IOutputVideoProcessor, IOutputAudioProcessor ``` The class implements several interfaces that provide comprehensive functionality for different output scenarios: - `IVideoEditXBaseOutput` - Provides integration with video editing workflows - `IVideoCaptureXBaseOutput` - Enables direct capture-to-NDI streaming capabilities - `IOutputVideoProcessor` - Allows for advanced video processing during output - `IOutputAudioProcessor` - Facilitates audio processing and manipulation in the NDI pipeline ### Configuration Properties #### Video Processing Pipeline ```csharp public MediaBlock CustomVideoProcessor { get; set; } ``` This property allows developers to extend the NDI streaming pipeline with custom video processing functionality. By assigning a custom `MediaBlock` implementation, you can integrate specialized video filters, transformations, or analysis algorithms before content is transmitted via NDI. #### Audio Processing Pipeline ```csharp public MediaBlock CustomAudioProcessor { get; set; } ``` Similar to the video processor property, this allows for insertion of custom audio processing logic. Common applications include dynamic audio level adjustment, noise reduction, or specialized audio effects that enhance the streaming experience. #### NDI Sink Configuration ```csharp public NDISinkSettings Sink { get; set; } ``` This property contains the comprehensive configuration parameters for the NDI output sink, including essential settings such as stream identification, compression options, and network transmission parameters. ### Constructor Overloads #### Basic Constructor with Stream Name ```csharp public NDIOutput(string name) ``` Creates a new NDI output instance with the specified stream name, which will identify this NDI source on the network. 
**Parameters:** - `name`: String identifier for the NDI stream visible to receivers on the network #### Advanced Constructor with Pre-configured Settings ```csharp public NDIOutput(NDISinkSettings settings) ``` Creates a new NDI output instance with comprehensive pre-configured sink settings for advanced implementation scenarios. **Parameters:** - `settings`: A fully configured `NDISinkSettings` object containing all required NDI configuration parameters ### Core Methods #### Stream Identification ```csharp public string GetFilename() ``` Returns the configured name of the NDI stream. This method maintains compatibility with file-based output interfaces in the SDK architecture. **Returns:** The current NDI stream identifier ```csharp public void SetFilename(string filename) ``` Updates the NDI stream identifier. This method is primarily used for compatibility with other output types that use filename-based identification. **Parameters:** - `filename`: The updated name for the NDI stream #### Encoder Management ```csharp public Tuple[] GetVideoEncoders() ``` Returns an empty array as NDI handles video encoding internally through its proprietary technology. **Returns:** Empty array of encoder tuples ```csharp public Tuple[] GetAudioEncoders() ``` Returns an empty array as NDI handles audio encoding internally through its proprietary technology. 
**Returns:** Empty array of encoder tuples ## Implementation Examples ### Media Blocks SDK Implementation The following example demonstrates how to configure an NDI output using the Media Blocks SDK architecture: ```cs // Create an NDI output block with a descriptive stream name var ndiSink = new NDISinkBlock("VisioForge Production Stream"); // Connect video source to the NDI output // CreateNewInput method establishes a video input channel for the NDI sink pipeline.Connect(videoSource.Output, ndiSink.CreateNewInput(MediaBlockPadMediaType.Video)); // Connect audio source to the NDI output // CreateNewInput method establishes an audio input channel for the NDI sink pipeline.Connect(audioSource.Output, ndiSink.CreateNewInput(MediaBlockPadMediaType.Audio)); ``` ### Video Capture SDK Implementation This example shows how to integrate NDI streaming within the Video Capture SDK framework: ```cs // Initialize NDI output with a network-friendly stream name var ndiOutput = new NDIOutput("VisioForge_Studio_Output"); // Add the configured NDI output to the video capture pipeline core.Outputs_Add(ndiOutput); // core represents the VideoCaptureCoreX instance ``` ## Windows-Specific NDI Implementation [!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="VideoEditCore"] For Windows-specific implementations, the SDK provides additional configuration options through the VideoCaptureCore or VideoEditCore components. ### Step-by-Step Implementation Guide #### 1. Enable Network Streaming First, activate the network streaming functionality: ```cs VideoCapture1.Network_Streaming_Enabled = true; ``` #### 2. Configure Audio Streaming Enable audio transmission alongside video content: ```cs VideoCapture1.Network_Streaming_Audio_Enabled = true; ``` #### 3. Select NDI Protocol Specify NDI as the preferred streaming format: ```csharp VideoCapture1.Network_Streaming_Format = NetworkStreamingFormat.NDI; ``` #### 4. 
Create and Configure NDI Output Initialize the NDI output with a descriptive name: ```cs var streamName = "VisioForge NDI Streamer"; var ndiOutput = new NDIOutput(streamName); ``` #### 5. Assign the Output Connect the configured NDI output to the video capture pipeline: ```cs VideoCapture1.Network_Streaming_Output = ndiOutput; ``` #### 6. Generate the NDI URL (Optional) For debugging or sharing purposes, you can generate the standard NDI protocol URL: ```cs string ndiUrl = $"ndi://{System.Net.Dns.GetHostName()}/{streamName}"; Debug.WriteLine(ndiUrl); ``` ## Advanced Integration Considerations When implementing NDI streaming in production environments, consider the following factors: - **Network bandwidth requirements** - NDI streams can consume significant bandwidth depending on resolution and framerate - **Quality vs. latency tradeoffs** - Configure appropriate compression settings based on your specific use case - **Multicast vs. unicast distribution** - Determine the optimal network transmission method based on your infrastructure - **Hardware acceleration options** - Leverage GPU acceleration where available for improved performance - **Discovery mechanism** - Consider how NDI sources will be discovered across network segments ## Related Components - **NDISinkSettings** - Provides detailed configuration options for the NDI output sink - **NDISinkBlock** - Implements the core NDI output functionality referenced in NDISinkSettings - **MediaBlockPadMediaType** - Enum used to specify the type of media (video or audio) for input connections --- Visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples) for additional code samples and implementation examples. ---END OF PAGE--- # Local File: .\dotnet\general\network-streaming\rtmp.md --- title: RTMP Live Streaming for .NET Applications description: Learn how to implement RTMP streaming in .NET apps with practical code examples. 
Covers hardware acceleration, cross-platform support, error handling, and integration with popular streaming platforms like YouTube and Facebook Live. sidebar_label: RTMP --- # RTMP Streaming with VisioForge SDKs [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Introduction to RTMP Streaming RTMP (Real-Time Messaging Protocol) is a robust communication protocol designed for high-performance transmission of audio, video, and data between a server and a client. VisioForge SDKs provide comprehensive support for RTMP streaming, enabling developers to create powerful streaming applications with minimal effort. This guide covers implementation details for RTMP streaming across different VisioForge products, including cross-platform solutions and Windows-specific integrations. ## Cross-Platform RTMP Implementation [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] The `RTMPOutput` class serves as the central configuration point for RTMP streaming in cross-platform scenarios. It implements multiple interfaces including `IVideoEditXBaseOutput` and `IVideoCaptureXBaseOutput`, making it versatile for both video editing and capture workflows. 
### Setting Up RTMP Output

To begin implementing RTMP streaming, you need to create and configure an `RTMPOutput` instance:

```csharp
// Initialize with the streaming URL
var rtmpOutput = new RTMPOutput("rtmp://your-streaming-server/stream-key");

// Alternatively, create the output first and set the URL afterwards:
// var rtmpOutput = new RTMPOutput();
// rtmpOutput.Sink.Location = "rtmp://your-streaming-server/stream-key";
```

### Integration with VisioForge SDKs

#### Video Capture SDK Integration

```csharp
// Add RTMP output to the Video Capture SDK engine
core.Outputs_Add(rtmpOutput, true); // core is an instance of VideoCaptureCoreX
```

#### Video Edit SDK Integration

```csharp
// Set RTMP as the output format for Video Edit SDK
core.Output_Format = rtmpOutput; // core is an instance of VideoEditCoreX
```

#### Media Blocks SDK Integration

```csharp
// Create an RTMP sink block
var rtmpSink = new RTMPSinkBlock(new RTMPSinkSettings()
{
    Location = "rtmp://streaming-server/stream"
});

// Connect video and audio encoders to the RTMP sink
pipeline.Connect(h264Encoder.Output, rtmpSink.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(aacEncoder.Output, rtmpSink.CreateNewInput(MediaBlockPadMediaType.Audio));
```

## Video Encoder Configuration

### Supported Video Encoders

VisioForge provides extensive support for various video encoders, making it possible to optimize streaming based on available hardware:

- **OpenH264**: Default software encoder for most platforms
- **NVENC H264**: Hardware-accelerated encoding for NVIDIA GPUs
- **QSV H264**: Intel Quick Sync Video acceleration
- **AMF H264**: AMD GPU-based acceleration
- **HEVC/H265**: Various implementations including MF HEVC, NVENC HEVC, QSV HEVC, and AMF H265

### Implementing Hardware-Accelerated Encoding

For optimal performance, it's recommended to utilize hardware acceleration when available:

```csharp
// Check for NVIDIA encoder availability and use it if present
if (NVENCH264EncoderSettings.IsAvailable()) { rtmpOutput.Video = new
NVENCH264EncoderSettings(); } // Fall back to OpenH264 if hardware acceleration isn't available else { rtmpOutput.Video = new OpenH264EncoderSettings(); } ``` ## Audio Encoder Configuration ### Supported Audio Encoders The SDK supports multiple AAC encoder implementations: - **VO-AAC**: Default for non-Windows platforms - **AVENC AAC**: Cross-platform implementation - **MF AAC**: Default for Windows platforms ```csharp // Configure MF AAC encoder on Windows platforms rtmpOutput.Audio = new MFAACEncoderSettings(); // For macOS or other platforms rtmpOutput.Audio = new VOAACEncoderSettings(); ``` ## Platform-Specific Considerations ### Windows Implementation On Windows platforms, the default configuration uses: - OpenH264 for video encoding - MF AAC for audio encoding Additionally, Windows supports Microsoft Media Foundation HEVC encoding for high-efficiency streaming. ### macOS Implementation For macOS applications, the system uses: - AppleMediaH264EncoderSettings for video encoding - VO-AAC for audio encoding ### Automatic Platform Detection The SDK handles platform differences automatically through conditional compilation: ```csharp #if __MACOS__ Video = new AppleMediaH264EncoderSettings(); #else Video = new OpenH264EncoderSettings(); #endif ``` ## Best Practices for RTMP Streaming ### 1. Encoder Selection Strategy Always verify encoder availability before attempting to use hardware acceleration: ```csharp // Check for Intel Quick Sync availability if (QSVH264EncoderSettings.IsAvailable()) { rtmpOutput.Video = new QSVH264EncoderSettings(); } // Check for NVIDIA acceleration else if (NVENCH264EncoderSettings.IsAvailable()) { rtmpOutput.Video = new NVENCH264EncoderSettings(); } // Fall back to software encoding else { rtmpOutput.Video = new OpenH264EncoderSettings(); } ``` ### 2. 
Error Handling Implement robust error handling to manage streaming failures gracefully: ```csharp try { var rtmpOutput = new RTMPOutput(streamUrl); // Configure and start streaming } catch (Exception ex) { logger.LogError($"RTMP streaming initialization failed: {ex.Message}"); // Implement appropriate error recovery } ``` ### 3. Resource Management Ensure proper disposal of resources when streaming is complete: ```csharp // In your cleanup routine if (rtmpOutput != null) { rtmpOutput.Dispose(); rtmpOutput = null; } ``` ## Advanced RTMP Configuration ### Dynamic Encoder Selection For applications that need to adapt to different environments, you can enumerate available encoders: ```csharp var rtmpOutput = new RTMPOutput(); var availableVideoEncoders = rtmpOutput.GetVideoEncoders(); var availableAudioEncoders = rtmpOutput.GetAudioEncoders(); // Present options to users or select based on system capabilities ``` ### Custom Sink Configuration Fine-tune streaming parameters using the RTMPSinkSettings class: ```csharp rtmpOutput.Sink = new RTMPSinkSettings { Location = "rtmp://streaming-server/stream" }; ``` ## Windows-Specific RTMP Implementation [!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="VideoEditCore"] For Windows-only applications, VisioForge provides an alternative implementation using FFmpeg: ```csharp // Enable network streaming VideoCapture1.Network_Streaming_Enabled = true; // Set streaming format to RTMP using FFmpeg VideoCapture1.Network_Streaming_Format = NetworkStreamingFormat.RTMP_FFMPEG_EXE; // Create and configure FFmpeg output var ffmpegOutput = new FFMPEGEXEOutput(); ffmpegOutput.FillDefaults(DefaultsProfile.MP4_H264_AAC, true); ffmpegOutput.OutputMuxer = OutputMuxer.FLV; // Assign output to the capture component VideoCapture1.Network_Streaming_Output = ffmpegOutput; // Enable audio streaming (required for many services) VideoCapture1.Network_Streaming_Audio_Enabled = true; ``` ## Streaming to 
Popular Platforms ### YouTube Live ```csharp // Format: rtmp://a.rtmp.youtube.com/live2/ + [YouTube stream key] VideoCapture1.Network_Streaming_URL = "rtmp://a.rtmp.youtube.com/live2/xxxx-xxxx-xxxx-xxxx"; ``` ### Facebook Live ```csharp // Format: rtmps://live-api-s.facebook.com:443/rtmp/ + [Facebook stream key] VideoCapture1.Network_Streaming_URL = "rtmps://live-api-s.facebook.com:443/rtmp/xxxx-xxxx-xxxx-xxxx"; ``` ### Custom RTMP Servers ```csharp // Connect to any RTMP server VideoCapture1.Network_Streaming_URL = "rtmp://your-streaming-server:1935/live/stream"; ``` ## Performance Optimization To achieve optimal streaming performance: 1. **Use hardware acceleration** when available to reduce CPU load 2. **Monitor resource usage** during streaming to identify bottlenecks 3. **Adjust resolution and bitrate** based on available bandwidth 4. **Implement adaptive bitrate** for varying network conditions 5. **Consider GOP size** and keyframe intervals for streaming quality ## Troubleshooting Common Issues - **Connection Failures**: Verify server URL format and network connectivity - **Encoder Errors**: Confirm hardware encoder availability and drivers - **Performance Issues**: Monitor CPU/GPU usage and adjust encoding parameters - **Audio/Video Sync**: Check timestamp synchronization settings ## Conclusion VisioForge's RTMP implementation provides developers with a powerful, flexible framework for creating robust streaming applications. By leveraging the appropriate SDK components and following the best practices outlined in this guide, you can create high-performance streaming solutions that work across platforms and integrate with popular streaming services. 
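Before moving on, the pieces above can be combined into a single flow. The sketch below merges encoder fallback, audio configuration, error handling, and a platform URL for the cross-platform engine. It is a hedged sketch, not verbatim SDK code: the `core` variable and the start call (`StartAsync`) are assumptions based on the earlier snippets on this page.

```csharp
// Hedged end-to-end sketch for the cross-platform (VideoCaptureCoreX) path.
// Class names come from this page; the start call is an assumption.
var rtmpOutput = new RTMPOutput("rtmp://a.rtmp.youtube.com/live2/your-stream-key");

// Prefer hardware encoding; fall back to software (see Best Practices above)
if (NVENCH264EncoderSettings.IsAvailable())
{
    rtmpOutput.Video = new NVENCH264EncoderSettings();
}
else
{
    rtmpOutput.Video = new OpenH264EncoderSettings();
}

// Windows default audio encoder; use VOAACEncoderSettings on other platforms
rtmpOutput.Audio = new MFAACEncoderSettings();

try
{
    core.Outputs_Add(rtmpOutput, true); // core: VideoCaptureCoreX instance
    await core.StartAsync();            // assumed start method
}
catch (Exception ex)
{
    Console.WriteLine($"RTMP streaming failed to start: {ex.Message}");
}
```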
## Related Resources - [Streaming to Adobe Flash Media Server](adobe-flash.md) - [YouTube Streaming Integration](youtube.md) - [Facebook Live Implementation](facebook.md) ---END OF PAGE--- # Local File: .\dotnet\general\network-streaming\rtsp.md --- title: RTSP Video Streaming Implementation in .NET description: Learn how to implement RTSP streaming in .NET applications with hardware acceleration, cross-platform support, and best practices. Master video encoding, server configuration, and real-time streaming for security cameras and live broadcasting. sidebar_label: RTSP Streaming --- # Mastering RTSP Streaming with VisioForge SDKs [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Introduction to RTSP The Real-Time Streaming Protocol (RTSP) is a network control protocol designed for use in entertainment and communications systems to control streaming media servers. It acts like a "network remote control," allowing users to play, pause, and stop media streams. VisioForge SDKs harness the power of RTSP to provide robust video and audio streaming capabilities. Our SDKs integrate RTSP with industry-standard codecs like **H.264 (AVC)** for video and **Advanced Audio Coding (AAC)** for audio. H.264 offers excellent video quality at relatively low bitrates, making it ideal for streaming over various network conditions. AAC provides efficient and high-fidelity audio compression. This powerful combination ensures reliable, high-definition audiovisual streaming suitable for demanding applications such as: * **Security and Surveillance:** Delivering clear, real-time video feeds from IP cameras. 
* **Live Broadcasting:** Streaming events, webinars, or performances to a wide audience. * **Video Conferencing:** Enabling smooth, high-quality communication. * **Remote Monitoring:** Observing industrial processes or environments remotely. This guide delves into the specifics of implementing RTSP streaming using VisioForge SDKs, covering both modern cross-platform approaches and legacy Windows-specific methods. ## Cross-Platform RTSP Output (Recommended) [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] The modern VisioForge SDKs (`CoreX` versions and Media Blocks) provide a flexible and powerful cross-platform RTSP server implementation built upon the robust GStreamer framework. This approach offers greater control, wider codec support, and compatibility across Windows, Linux, macOS, and other platforms. ### Core Component: `RTSPServerOutput` The `RTSPServerOutput` class is the central configuration point for establishing an RTSP stream within the Video Capture or Video Edit SDKs (`CoreX` versions). It acts as a bridge between your capture/edit pipeline and the underlying RTSP server logic. **Key Responsibilities:** * **Interface Implementation:** Implements `IVideoEditXBaseOutput` and `IVideoCaptureXBaseOutput`, allowing seamless integration as an output format in both editing and capture scenarios. * **Settings Management:** Holds the `RTSPServerSettings` object, which contains all the detailed configuration parameters for the server instance. * **Codec Specification:** Defines the video and audio encoders that will be used to compress the media before streaming. 
**Supported Encoders:** VisioForge provides access to a wide array of encoders, allowing optimization based on hardware capabilities and target platforms: * **Video Encoders:** * **Hardware-Accelerated (Recommended for performance):** * `NVENC` (NVIDIA): Leverages dedicated encoding hardware on NVIDIA GPUs. * `QSV` (Intel Quick Sync Video): Utilizes integrated GPU capabilities on Intel processors. * `AMF` (AMD Advanced Media Framework): Uses encoding hardware on AMD GPUs/APUs. * **Software-Based (Platform-independent, higher CPU usage):** * `OpenH264`: A widely compatible H.264 software encoder. * `VP8` / `VP9`: Royalty-free video codecs developed by Google, offering good compression (often used with WebRTC, but available here). * **Platform-Specific:** * `MF HEVC` (Media Foundation HEVC): Windows-specific H.265/HEVC encoder for higher efficiency compression. * **Audio Encoders:** * **AAC Variants:** * `VO-AAC`: A versatile, cross-platform AAC encoder. * `AVENC AAC`: Utilizes FFmpeg's AAC encoder. * `MF AAC`: Windows Media Foundation AAC encoder. * **Other Formats:** * `MP3`: Widely compatible but less efficient than AAC. * `OPUS`: Excellent low-latency codec, ideal for interactive applications. ### Configuring the Stream: `RTSPServerSettings` This class encapsulates all the parameters needed to define the behavior and properties of your RTSP server. **Detailed Properties:** * **Network Configuration:** * `Port` (int): The TCP port the server listens on for incoming RTSP connections. The default is `8554`, a common alternative to the standard (often restricted) port 554. Ensure this port is open in firewalls. * `Address` (string): The IP address the server binds to. * `"127.0.0.1"` (Default): Listens only for connections from the local machine. * `"0.0.0.0"`: Listens on all available network interfaces (use for public access). * Specific IP (e.g., `"192.168.1.100"`): Binds only to that specific network interface. 
* `Point` (string): The path component of the RTSP URL (e.g., `/live`, `/stream1`). Clients will connect to `rtsp://<Address>:<Port><Point>`. Default is `"/live"`.
* **Stream Configuration:**
  * `VideoEncoder` (IVideoEncoderSettings): An instance of a video encoder settings class (e.g., `OpenH264EncoderSettings`, `NVEncoderSettings`). This defines the codec, bitrate, quality, etc.
  * `AudioEncoder` (IAudioEncoderSettings): An instance of an audio encoder settings class (e.g., `VOAACEncoderSettings`). Defines audio codec parameters.
  * `Latency` (TimeSpan): Controls the buffering delay introduced by the server to smooth out network jitter. Default is 250 milliseconds. Higher values increase stability but also increase delay.
* **Authentication:**
  * `Username` (string): If set, clients must provide this username for basic authentication.
  * `Password` (string): If set, clients must provide this password along with the username.
* **Server Identity:**
  * `Name` (string): A friendly name for the server, sometimes displayed by client applications.
  * `Description` (string): A more detailed description of the stream content or server purpose.
* **Convenience Property:**
  * `URL` (Uri): Automatically constructs the full RTSP connection URL based on the `Address`, `Port`, and `Point` properties.

### The Engine: `RTSPServerBlock` (Media Blocks SDK)

When using the Media Blocks SDK, the `RTSPServerBlock` represents the actual GStreamer-based element that performs the streaming.

**Functionality:**

* **Media Sink:** Acts as a terminal point (sink) in a media pipeline, receiving encoded video and audio data.
* **Input Pads:** Provides distinct `VideoInput` and `AudioInput` pads for connecting upstream video and audio sources/encoders.
* **GStreamer Integration:** Manages the underlying GStreamer `rtspserver` and related elements necessary for handling client connections and streaming RTP packets.
* **Availability Check:** The static `IsAvailable()` method allows checking if the necessary GStreamer plugins for RTSP streaming are present on the system.
* **Resource Management:** Implements `IDisposable` for proper cleanup of network sockets and GStreamer resources when the block is no longer needed. ### Practical Usage Examples #### Example 1: Basic Server Setup (VideoCaptureCoreX / VideoEditCoreX) ```csharp // 1. Choose and configure encoders // Use hardware acceleration if available, otherwise fallback to software var videoEncoder = H264EncoderBlock.GetDefaultSettings(); var audioEncoder = new VOAACEncoderSettings(); // Reliable cross-platform AAC // 2. Configure server network settings var settings = new RTSPServerSettings(videoEncoder, audioEncoder) { Port = 8554, Address = "0.0.0.0", // Accessible from other machines on the network Point = "/livefeed" }; // 3. Create the output object var rtspOutput = new RTSPServerOutput(settings); // 4. Integrate with the SDK engine // For VideoCaptureCoreX: // videoCapture is an initialized instance of VideoCaptureCoreX videoCapture.Outputs_Add(rtspOutput); // For VideoEditCoreX: // videoEdit is an initialized instance of VideoEditCoreX // videoEdit.Output_Format = rtspOutput; // Set before starting editing/playback ``` #### Example 2: Media Blocks Pipeline ```csharp // Assume 'pipeline' is an initialized MediaBlocksPipeline // Assume 'videoSource' and 'audioSource' provide unencoded media streams // 1. Create video and audio encoder settings var videoEncoder = H264EncoderBlock.GetDefaultSettings(); var audioEncoder = new VOAACEncoderSettings(); // 2. Create RTSP server settings with a specific URL var serverUri = new Uri("rtsp://192.168.1.50:8554/cam1"); var rtspSettings = new RTSPServerSettings(serverUri, videoEncoder, audioEncoder) { Description = "Camera Feed 1 - Warehouse" }; // 3. Create the RTSP Server Block if (!RTSPServerBlock.IsAvailable()) { Console.WriteLine("RTSP Server components not available. 
Check GStreamer installation."); return; }

var rtspSink = new RTSPServerBlock(rtspSettings);

// Connect the sources directly to the RTSP server block's video and audio inputs,
// because the server block will use its own encoders
pipeline.Connect(videoSource.Output, rtspSink.VideoInput);
pipeline.Connect(audioSource.Output, rtspSink.AudioInput);

// Start the pipeline
await pipeline.StartAsync();
```

#### Example 3: Advanced Configuration with Authentication

```csharp
// Reusing the encoders from Example 1...
var secureSettings = new RTSPServerSettings(videoEncoder, audioEncoder)
{
    Port = 8555, // Use a different port
    Address = "192.168.1.100", // Bind to a specific internal IP
    Point = "/secure",
    Username = "viewer",
    Password = "VerySecretPassword!",
    Latency = TimeSpan.FromMilliseconds(400), // Slightly higher latency
    Name = "SecureStream",
    Description = "Authorized access only"
};

var secureRtspOutput = new RTSPServerOutput(secureSettings);

// Add to VideoCaptureCoreX or set for VideoEditCoreX as before
// videoCapture.Outputs_Add(secureRtspOutput);
```

### Streaming Best Practices

1. **Encoder Selection Strategy:**
   * **Prioritize Hardware:** Always prefer hardware encoders (NVENC, QSV, AMF) when available on the target system. They drastically reduce CPU load, allowing for higher resolutions, frame rates, or more simultaneous streams.
   * **Software Fallback:** Use `OpenH264` as a reliable software fallback for broad compatibility when hardware acceleration isn't present or suitable.
   * **Codec Choice:** H.264 remains the most widely compatible codec for RTSP clients. HEVC offers better compression, but client support might be less universal.
2. **Latency Tuning:**
   * **Interactivity vs. Stability:** Lower latency (e.g., 100-200 ms) is crucial for applications like video conferencing but makes the stream more susceptible to network hiccups.
* **Broadcast/Surveillance:** Higher latency (e.g., 500ms-1000ms+) provides larger buffers, improving stream resilience over unstable networks (like Wi-Fi or the internet) at the cost of increased delay. Start with the default (250ms) and adjust based on observed stream quality and requirements. 3. **Network Configuration:** * **Security First:** Implement `Username` and `Password` authentication for any stream not intended for public anonymous access. * **Binding Address:** Use `"0.0.0.0"` cautiously. For enhanced security, bind explicitly to the network interface (`Address`) intended for client connections if possible. * **Firewall Rules:** Meticulously configure system and network firewalls to allow incoming TCP connections on the chosen RTSP `Port`. Also, remember that RTP/RTCP (used for the actual media data) often use dynamic UDP ports; firewalls might need helper modules (like `nf_conntrack_rtsp` on Linux) or broad UDP port ranges opened (less secure). 4. **Resource Management:** * **Dispose Pattern:** RTSP server instances hold network resources (sockets) and potentially complex GStreamer pipelines. *Always* ensure they are disposed of correctly using `using` statements or explicit `.Dispose()` calls in `finally` blocks to prevent resource leaks. * **Graceful Shutdown:** When stopping the capture or edit process, ensure the output is properly removed or the pipeline is stopped cleanly to allow the RTSP server to shut down gracefully. ### Performance Considerations Optimizing RTSP streaming involves balancing quality, latency, and resource usage: 1. **Encoder Impact:** This is often the biggest factor. * **Hardware:** Significantly lower CPU usage, higher potential throughput. Requires compatible hardware and drivers. * **Software:** High CPU load, especially at higher resolutions/framerates. Limits the number of concurrent streams on a single machine but works universally. 2. **Latency vs. 
Bandwidth:** Lower latency settings can sometimes lead to increased peak bandwidth usage, as the system has less time to smooth out data transmission.
3. **Resource Monitoring:**
   * **CPU:** Keep a close eye on CPU usage, particularly with software encoders. Overload leads to dropped frames and stuttering.
   * **Memory:** Monitor RAM usage, especially if handling multiple streams or complex Media Blocks pipelines.
   * **Network:** Ensure the server's network interface has sufficient bandwidth for the configured bitrate, resolution, and number of connected clients. Calculate the required bandwidth as (Video Bitrate + Audio Bitrate) × Number of Clients; for example, a 4 Mbps video stream with 128 kbps audio serving 10 clients needs roughly 41 Mbps of uplink.

## Windows-Only RTSP Output (Legacy)

[!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="VideoEditCore"]

Older versions of the SDK (`VideoCaptureCore`, `VideoEditCore`) included a simpler, Windows-specific RTSP output mechanism. While functional, it offers less flexibility and codec support compared to the cross-platform `RTSPServerOutput`. **It is generally recommended to use the `CoreX` / Media Blocks approach for new projects.**

### How it Works

This method leverages built-in Windows components or specific bundled filters. Configuration is done directly via properties on the `VideoCaptureCore` or `VideoEditCore` object.

### Sample Configuration Code

```csharp
// Assuming VideoCapture1 is an instance of VisioForge.Core.VideoCapture.VideoCaptureCore

// 1. Enable network streaming globally for the component
VideoCapture1.Network_Streaming_Enabled = true;

// 2. Specifically enable audio streaming (optional, default might be true)
VideoCapture1.Network_Streaming_Audio_Enabled = true;

// 3. Select the desired RTSP format.
// RTSP_H264_AAC_SW indicates software encoding for both H.264 and AAC.
// Other options might exist depending on installed filters/components.
VideoCapture1.Network_Streaming_Format = VisioForge.Types.VFNetworkStreamingFormat.RTSP_H264_AAC_SW; // 4. Configure Encoder Settings (using MP4Output as a container) // Even though we aren't creating an MP4 file, the MP4Output class // is used here to hold H.264 and AAC encoder settings. var mp4OutputSettings = new VisioForge.Types.Output.MP4Output(); // Configure H.264 settings within mp4OutputSettings // (Specific properties depend on the SDK version, e.g., bitrate, profile) // mp4OutputSettings.Video_H264... = ...; // Configure AAC settings within mp4OutputSettings // (e.g., bitrate, sample rate) // mp4OutputSettings.Audio_AAC... = ...; // 5. Assign the settings container to the network streaming output VideoCapture1.Network_Streaming_Output = mp4OutputSettings; // 6. Define the RTSP URL clients will use // The server will automatically listen on the specified port (5554 here). VideoCapture1.Network_Streaming_URL = "rtsp://localhost:5554/vfstream"; // Use machine's actual IP instead of localhost for external access. // After configuration, start the capture/playback as usual // VideoCapture1.Start(); ``` **Note:** This legacy method often relies on DirectShow filters or Media Foundation transforms available on the specific Windows system, making it less predictable and portable than the GStreamer-based cross-platform solution. --- For more detailed examples and advanced use cases, explore the code samples provided in our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples). ---END OF PAGE--- # Local File: .\dotnet\general\network-streaming\srt.md --- title: Implementing SRT Protocol Streaming in .NET description: Learn how to integrate SRT (Secure Reliable Transport) protocol for low-latency video streaming in .NET applications. Includes code examples, hardware acceleration options, and best practices for reliable video delivery. 
sidebar_label: SRT --- # SRT Streaming Implementation Guide for VisioForge .NET SDKs [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## What is SRT and Why Should You Use It? SRT (Secure Reliable Transport) is a high-performance streaming protocol designed for delivering high-quality, low-latency video across unpredictable networks. Unlike traditional streaming protocols, SRT excels in challenging network conditions by incorporating unique error recovery mechanisms and encryption features. The VisioForge .NET SDKs provide comprehensive support for SRT streaming through an intuitive configuration API, enabling developers to implement secure, reliable video delivery in their applications with minimal effort. ## Getting Started with SRT in VisioForge ### Supported SDK Platforms [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] ### Basic SRT Configuration Implementing SRT streaming in your application starts with specifying your streaming destination URL. The SRT URL follows a standard format that includes protocol, host, and port information. 
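Before wiring the output into an engine, it helps to see the URL shape itself. The tiny helper below is illustrative only (it is not part of the SDK); it simply formats the `srt://host:port` scheme used by the snippets in this section. An empty host portion (as in `srt://:8888` in the Media Blocks example that follows) typically puts the sink into listener mode on all interfaces.

```csharp
// Illustrative helper (not an SDK API): formats an SRT destination URL
// from a host and port, matching the srt://host:port scheme.
static string BuildSrtUrl(string host, int port) => $"srt://{host}:{port}";

// BuildSrtUrl("streaming-server", 1234) returns "srt://streaming-server:1234"
```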
#### Video Capture SDK Implementation ```csharp // Initialize SRT output with destination URL var srtOutput = new SRTOutput("srt://streaming-server:1234"); // Add the configured SRT output to your capture engine videoCapture.Outputs_Add(srtOutput, true); // videoCapture is an instance of VideoCaptureCoreX ``` #### Media Blocks SDK Implementation ```csharp // Create an SRT sink block with appropriate settings var srtSink = new SRTMPEGTSSinkBlock(new SRTSinkSettings() { Uri = "srt://:8888" }); // Configure encoders for SRT compatibility h264Encoder.Settings.ParseStream = false; // Disable parsing for H264 encoder // Connect your video encoder to the SRT sink pipeline.Connect(h264Encoder.Output, srtSink.CreateNewInput(MediaBlockPadMediaType.Video)); // Connect your audio encoder to the SRT sink pipeline.Connect(aacEncoder.Output, srtSink.CreateNewInput(MediaBlockPadMediaType.Audio)); ``` ## Video Encoding Options for SRT Streaming The VisioForge SDKs offer flexible encoding options to balance quality, performance, and hardware utilization. You can choose from software-based encoders or hardware-accelerated options based on your specific requirements. ### Software-Based Video Encoders - **OpenH264**: The default cross-platform encoder that provides excellent compatibility across different environments ### Hardware-Accelerated Video Encoders - **NVIDIA NVENC (H.264/HEVC)**: Leverages NVIDIA GPU acceleration for high-performance encoding - **Intel Quick Sync Video (H.264/HEVC)**: Utilizes Intel's dedicated media processing hardware - **AMD AMF (H.264/H.265)**: Enables hardware acceleration on AMD graphics processors - **Microsoft Media Foundation HEVC**: Windows-specific hardware-accelerated encoder #### Example: Configuring NVIDIA Hardware Acceleration ```csharp // Set SRT output to use NVIDIA hardware acceleration srtOutput.Video = new NVENCH264EncoderSettings(); ``` ## Audio Encoding for SRT Streams Audio quality is critical for many streaming applications. 
The VisioForge SDKs provide multiple audio encoding options: - **VO-AAC**: Cross-platform AAC encoder with consistent performance - **AVENC AAC**: FFmpeg-based AAC encoder with extensive configuration options - **MF AAC**: Microsoft Media Foundation AAC encoder (Windows-only) The SDK automatically selects the most appropriate default audio encoder based on the platform: - Windows systems default to MF AAC - Other platforms default to VO AAC ## Platform-Specific Optimizations ### Windows-Specific Features When running on Windows systems, the SDK can leverage Microsoft Media Foundation frameworks: - MF AAC encoder provides efficient audio encoding - MF HEVC encoder delivers high-quality, efficient video compression ### macOS Optimizations On macOS platforms, the SDK automatically selects: - Apple Media H264 encoder for optimized video encoding - VO AAC encoder for reliable audio encoding ## Advanced SRT Configuration Options ### Custom Media Processing Pipeline For applications with specialized requirements, the SDK supports custom processing for both video and audio streams: ```csharp // Add custom video processing before encoding srtOutput.CustomVideoProcessor = new SomeMediaBlock(); // Add custom audio processing before encoding srtOutput.CustomAudioProcessor = new SomeMediaBlock(); ``` These processors enable you to implement filters, transformations, or analytics before encoding and transmission. ### SRT Sink Configuration Fine-tune your SRT connection using the SRTSinkSettings class: ```csharp // Update the SRT destination URI srtOutput.Sink.Uri = "srt://new-server:5678"; ``` ## Best Practices for SRT Streaming ### Optimizing Encoder Selection 1. **Hardware Acceleration Priority**: Always choose hardware-accelerated encoders when available. The performance benefits are significant, particularly for high-resolution streaming. 2. 
**Smart Fallback Mechanisms**: Implement encoder availability checks to automatically fall back to software encoding if hardware acceleration is unavailable: ```csharp if (NVENCH264EncoderSettings.IsAvailable()) { srtOutput.Video = new NVENCH264EncoderSettings(); } else { srtOutput.Video = new OpenH264EncoderSettings(); } ``` ### Performance Optimization 1. **Bitrate Configuration**: Carefully adjust encoder bitrates based on your content type and target network conditions. Higher bitrates increase quality but require more bandwidth. 2. **Resource Monitoring**: Monitor CPU and GPU usage during streaming to identify potential bottlenecks. If CPU usage is consistently high, consider switching to hardware acceleration. 3. **Latency Management**: Configure appropriate buffer sizes based on your latency requirements. Smaller buffers reduce latency but may increase susceptibility to network fluctuations. ## Troubleshooting SRT Implementations ### Common Issues and Solutions #### Encoder Initialization Failures - **Problem**: Selected encoder fails to initialize or throws exceptions - **Solution**: Verify the encoder is supported on your platform and that required drivers are installed and up-to-date #### Streaming Connection Problems - **Problem**: Unable to establish SRT connection - **Solution**: Confirm the SRT URL format is correct and that specified ports are open in all firewalls and network equipment #### Performance Bottlenecks - **Problem**: High CPU usage or dropped frames during streaming - **Solution**: Consider switching to hardware-accelerated encoders or reducing resolution/bitrate ## Integration Examples ### Complete SRT Streaming Setup ```csharp // Create and configure SRT output var srtOutput = new SRTOutput("srt://streaming-server:1234"); // Configure video encoding - try hardware acceleration with fallback if (NVENCH264EncoderSettings.IsAvailable()) { var nvencSettings = new NVENCH264EncoderSettings(); nvencSettings.Bitrate = 4000000; // 4 Mbps 
srtOutput.Video = nvencSettings; } else { var softwareSettings = new OpenH264EncoderSettings(); softwareSettings.Bitrate = 2000000; // 2 Mbps for software encoding srtOutput.Video = softwareSettings; } // Add to capture engine videoCapture.Outputs_Add(srtOutput, true); // Start streaming videoCapture.Start(); ``` ## Conclusion SRT streaming in VisioForge .NET SDKs provides a powerful solution for high-quality, low-latency video delivery across challenging network conditions. By leveraging the flexible encoder options and configuration capabilities, developers can implement robust streaming solutions for a wide range of applications. Whether you're building a live streaming platform, video conferencing solution, or content delivery system, the SRT protocol's combination of security, reliability, and performance makes it an excellent choice for modern video applications. For more information about specific encoders or advanced configuration options, refer to the comprehensive VisioForge SDK documentation. ---END OF PAGE--- # Local File: .\dotnet\general\network-streaming\udp.md --- title: UDP Video and Audio Streaming in .NET description: Learn how to implement high-performance UDP streaming for video and audio in .NET applications. Detailed guide covers encoding, configuration, multicast support, and best practices for real-time media transmission. 
sidebar_label: UDP --- # UDP Streaming with VisioForge SDKs [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Introduction to UDP Streaming The User Datagram Protocol (UDP) is a lightweight, connectionless transport protocol that provides a simple interface between network applications and the underlying IP network. Unlike TCP, UDP offers minimal overhead and doesn't guarantee packet delivery, making it ideal for real-time applications where speed is crucial and occasional packet loss is acceptable. VisioForge SDKs offer robust support for UDP streaming, enabling developers to implement high-performance, low-latency streaming solutions for various applications, including live broadcasts, video surveillance, and real-time communication systems. ## Key Features and Capabilities The VisioForge SDK suite provides comprehensive UDP streaming functionality with the following key features: ### Video and Audio Codec Support - **Video Codecs**: Full support for H.264 (AVC) and H.265 (HEVC), offering excellent compression efficiency while maintaining high video quality. - **Audio Codec**: Advanced Audio Coding (AAC) support, providing superior audio quality at lower bitrates compared to older audio codecs. ### MPEG Transport Stream (MPEG-TS) The SDK utilizes MPEG-TS as the container format for UDP streaming. 
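A quick way to make the TS framing concrete: every MPEG-TS packet is exactly 188 bytes and begins with the sync byte `0x47`, which is why the common UDP payload size of 1316 bytes used in this guide holds exactly seven packets. The following standalone sketch — plain .NET, not part of the VisioForge API — checks whether a received UDP payload consists of whole TS packets:

```csharp
using System;

// Hedged sketch (not an SDK call): verify that a UDP payload carries
// whole MPEG-TS packets. Each TS packet is exactly 188 bytes and starts
// with the sync byte 0x47, so 1316 bytes = 7 TS packets.
static bool LooksLikeMpegTs(byte[] payload)
{
    const int TsPacketSize = 188;
    const byte SyncByte = 0x47;

    if (payload == null || payload.Length == 0 || payload.Length % TsPacketSize != 0)
        return false;

    // Every packet boundary must begin with the sync byte.
    for (int offset = 0; offset < payload.Length; offset += TsPacketSize)
    {
        if (payload[offset] != SyncByte)
            return false;
    }

    return true;
}
```

A receiver can run a check like this on incoming datagrams to detect misaligned or truncated packets before handing data to a demuxer.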
MPEG-TS offers several advantages: - Designed specifically for transmission over potentially unreliable networks - Built-in error correction capabilities - Support for multiplexing multiple audio and video streams - Low latency characteristics ideal for live streaming ### FFMPEG Integration VisioForge SDKs leverage the power of FFMPEG for UDP streaming, ensuring: - High performance encoding and streaming - Wide compatibility with various networks and receiving clients - Reliable packet handling and stream management ### Unicast and Multicast Support - **Unicast**: Point-to-point transmission from a single sender to a single receiver - **Multicast**: Efficient distribution of the same content to multiple recipients simultaneously without duplicating bandwidth at the source ## Technical Implementation Details UDP streaming in VisioForge SDKs involves several key technical components: 1. **Video Encoding**: Source video is compressed using H.264 or HEVC encoders with configurable parameters for bitrate, resolution, and frame rate. 2. **Audio Encoding**: Audio streams are processed through AAC encoders with adjustable quality settings. 3. **Multiplexing**: Video and audio streams are combined into a single MPEG-TS container. 4. **Packetization**: The MPEG-TS stream is divided into UDP packets of appropriate size for network transmission. 5. **Transmission**: Packets are sent over the network to specified unicast or multicast addresses. The implementation prioritizes low latency while maintaining sufficient quality for professional applications. Advanced buffering mechanisms help manage network jitter and ensure smooth playback at the receiving end. ## Windows-only UDP Output Implementation [!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="VideoEditCore"] ### Step 1: Enable Network Streaming The first step is to enable network streaming functionality in your application. 
This is done by setting the `Network_Streaming_Enabled` property to true: ```cs VideoCapture1.Network_Streaming_Enabled = true; ``` ### Step 2: Configure Audio Streaming (Optional) If your application requires audio streaming alongside video, enable it with: ```cs VideoCapture1.Network_Streaming_Audio_Enabled = true; ``` ### Step 3: Set the Streaming Format Specify UDP as the streaming format by setting the `Network_Streaming_Format` property to `UDP_FFMPEG_EXE`: ```cs VideoCapture1.Network_Streaming_Format = NetworkStreamingFormat.UDP_FFMPEG_EXE; ``` ### Step 4: Configure the UDP Stream URL Set the destination URL for your UDP stream. For a basic unicast stream to localhost: ```cs VideoCapture1.Network_Streaming_URL = "udp://127.0.0.1:10000?pkt_size=1316"; ``` The `pkt_size` parameter defines the UDP packet size. The value 1316 is optimized for most network environments, allowing for efficient transmission while minimizing fragmentation. ### Step 5: Multicast Configuration (Optional) For multicast streaming to multiple receivers, use a multicast address (typically in the range 224.0.0.0 to 239.255.255.255): ```cs VideoCapture1.Network_Streaming_URL = "udp://239.101.101.1:1234?ttl=1&pkt_size=1316"; ``` The additional parameters include: - **ttl**: Time-to-live value that determines how many network hops the packets can traverse - **pkt_size**: Packet size as explained above ### Step 6: Configure Output Settings Finally, configure the streaming output parameters using the `FFMPEGEXEOutput` class: ```cs var ffmpegOutput = new FFMPEGEXEOutput(); ffmpegOutput.FillDefaults(DefaultsProfile.MP4_H264_AAC, true); ffmpegOutput.OutputMuxer = OutputMuxer.MPEGTS; VideoCapture1.Network_Streaming_Output = ffmpegOutput; ``` This code: 1. Creates a new FFMPEG output configuration 2. Applies default settings for H.264 video and AAC audio 3. Specifies MPEG-TS as the container format 4. 
Assigns this configuration to the streaming output ## Advanced Configuration Options ### Bitrate Management For optimal streaming performance, consider adjusting the video and audio bitrates based on your network capacity: ```cs ffmpegOutput.VideoSettings.Bitrate = 2500000; // 2.5 Mbps for video ffmpegOutput.AudioSettings.Bitrate = 128000; // 128 kbps for audio ``` ### Resolution and Frame Rate Lower resolutions and frame rates reduce bandwidth requirements: ```cs VideoCapture1.Video_Resize_Enabled = true; VideoCapture1.Video_Resize_Width = 1280; // 720p resolution VideoCapture1.Video_Resize_Height = 720; VideoCapture1.Video_FrameRate = 30; // 30 fps ``` ### Buffer Size Configuration Adjusting buffer sizes can help manage latency vs. stability trade-offs: ```cs VideoCapture1.Network_Streaming_BufferSize = 8192; // in KB ``` ## Best Practices for UDP Streaming ### Network Considerations 1. **Bandwidth Assessment**: Ensure sufficient bandwidth for your target quality. As a guideline: - SD quality (480p): 1-2 Mbps - HD quality (720p): 2.5-4 Mbps - Full HD (1080p): 4-8 Mbps 2. **Network Stability**: UDP doesn't guarantee packet delivery. In unstable networks, consider: - Reducing resolution or bitrate - Implementing application-level error recovery - Using forward error correction when available 3. **Firewall Configuration**: Ensure that UDP ports are open on both sender and receiver firewalls. ### Performance Optimization 1. **Hardware Acceleration**: When available, enable hardware acceleration for encoding: ```cs ffmpegOutput.VideoSettings.HWAcceleration = HWAcceleration.Auto; ``` 2. **Keyframe Intervals**: For lower latency, reduce keyframe (I-frame) intervals: ```cs ffmpegOutput.VideoSettings.KeyframeInterval = 60; // One keyframe every 2 seconds at 30 fps ``` 3. 
**Preset Selection**: Choose encoding presets based on your CPU capacity and latency requirements: ```cs ffmpegOutput.VideoSettings.EncoderPreset = H264EncoderPreset.Ultrafast; // Lowest latency, higher bitrate // or ffmpegOutput.VideoSettings.EncoderPreset = H264EncoderPreset.Medium; // Balance between quality and CPU load ``` ## Troubleshooting Common Issues 1. **Stream Not Receiving**: Verify network connectivity, port availability, and firewall settings. 2. **High Latency**: Check network congestion, reduce bitrate, or adjust buffer sizes. 3. **Poor Quality**: Increase bitrate, adjust encoding settings, or check for network packet loss. 4. **Audio/Video Sync Issues**: Ensure proper timestamp synchronization in your application. ## Conclusion UDP streaming with VisioForge SDKs provides a powerful solution for real-time video and audio transmission with minimal latency. By leveraging H.264/HEVC video codecs, AAC audio, and MPEG-TS packaging, developers can create robust streaming applications suitable for a wide range of use cases. The flexibility of the SDK allows for fine-tuning of all streaming parameters, enabling optimization for specific network conditions and quality requirements. Whether implementing a simple point-to-point stream or a complex multicast distribution system, VisioForge's UDP streaming capabilities provide the necessary tools for success. --- Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples and working demonstrations of UDP streaming implementations. ---END OF PAGE--- # Local File: .\dotnet\general\network-streaming\wmv.md --- title: WMV Network Streaming with .NET Development description: Learn how to implement Windows Media Video (WMV) streaming in .NET applications. Step-by-step guide for developers covering setup, configuration, client connections, and performance optimization for network video streaming. 
sidebar_label: Windows Media Video --- # Windows Media Video (WMV) Network Streaming Implementation Guide [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge variant="dark" size="xl" text="VideoCaptureCore"] ## Introduction to WMV Streaming Technology Windows Media Video (WMV) represents a versatile and powerful streaming technology developed by Microsoft. As an integral component of the Windows Media framework, WMV has established itself as a reliable solution for efficiently delivering video content across networks. This format utilizes sophisticated compression algorithms that substantially reduce file sizes while maintaining acceptable visual quality, making it particularly well-suited for streaming applications where bandwidth optimization is critical. The WMV format supports an extensive range of video resolutions and bitrates, allowing developers to tailor their streaming implementations to accommodate varying network conditions and end-user requirements. This adaptability makes WMV an excellent choice for applications that need to serve diverse client environments with different connectivity constraints. 
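As an illustration of tailoring the stream to varying network conditions, the sketch below maps an estimated network throughput to a WMV video bitrate tier. The tier values and the 25% headroom are assumptions chosen for this example, not SDK defaults:

```csharp
using System;

// Illustrative only: choose a WMV target video bitrate from an estimated
// throughput (in kbps). The tiers and headroom factor are assumptions for
// this example, not values taken from the VisioForge SDK.
static int SelectWmvBitrate(int availableKbps)
{
    // Leave ~25% headroom so audio and protocol overhead still fit.
    int budgetKbps = (int)(availableKbps * 0.75);

    if (budgetKbps >= 4000) return 4_000_000; // 1080p-class stream
    if (budgetKbps >= 2000) return 2_000_000; // 720p-class stream
    if (budgetKbps >= 1000) return 1_000_000; // SD stream
    return 500_000;                           // constrained networks
}

Console.WriteLine(SelectWmvBitrate(3000)); // → 2000000
```

The selected value could then be assigned to the `Bitrate` property of the WMV output configuration shown later in this guide.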
## Technical Overview of WMV Format ### Key Features and Capabilities WMV implements the Advanced Systems Format (ASF) container, which provides several technical advantages for streaming applications: - **Efficient compression**: Employs codec technology that balances quality with file size - **Scalable bitrate adjustment**: Adapts to available bandwidth conditions - **Error resilience**: Built-in mechanisms for packet loss recovery - **Content protection**: Supports Digital Rights Management (DRM) when required - **Metadata support**: Allows embedding of descriptive information about the stream ### Technical Specifications | Feature | Specification | |---------|---------------| | Codec | VC-1 (primarily) | | Container | ASF (Advanced Systems Format) | | Supported resolutions | Up to 4K UHD (depending on profile) | | Bitrate range | 10 Kbps to 20+ Mbps | | Audio support | WMA (Windows Media Audio) | | Streaming protocols | HTTP, RTSP, MMS | ## Windows-Only WMV Streaming Implementation [!badge variant="dark" size="xl" text="VideoCaptureCore"] The VisioForge SDK provides a robust framework for implementing WMV streaming in Windows environments. This implementation allows applications to broadcast video over networks while simultaneously capturing to a file if desired. ### Implementation Prerequisites Before implementing WMV streaming in your application, ensure the following requirements are met: 1. Your development environment includes the VisioForge Video Capture SDK 2. Required redistributables are installed (details provided in the Deployment section) 3. Your application targets Windows operating systems 4. Network ports are properly configured and accessible ### Step-by-Step Implementation Guide #### 1. 
Initialize the Video Capture Component Begin by setting up the core video capture component in your application: ```cs // Initialize the VideoCapture component var VideoCapture1 = new VisioForge.Core.VideoCapture(); // Configure basic capture settings (adjust as needed) // ... ``` #### 2. Enable Network Streaming To activate network streaming functionality, you need to enable it explicitly and set the format to WMV: ```cs // Enable network streaming VideoCapture1.Network_Streaming_Enabled = true; // Set the streaming format to WMV VideoCapture1.Network_Streaming_Format = NetworkStreamingFormat.WMV; ``` #### 3. Configure WMV Output Settings Create and configure a WMV output object with appropriate settings: ```cs // Create WMV output configuration var wmvOutput = new WMVOutput(); // Optional: Configure WMV-specific settings wmvOutput.Bitrate = 2000000; // 2 Mbps wmvOutput.KeyFrameInterval = 3; // seconds between keyframes wmvOutput.Quality = 85; // Quality setting (0-100) // Apply WMV output configuration VideoCapture1.Network_Streaming_Output = wmvOutput; // Set network port for client connections VideoCapture1.Network_Streaming_Network_Port = 12345; // Optional: Set maximum number of concurrent clients (default is 10) VideoCapture1.Network_Streaming_Max_Clients = 25; ``` #### 4. 
Start the Streaming Process Once everything is configured, you can start the streaming process: ```cs // Start the streaming process try { VideoCapture1.Start(); // The streaming URL is now available for clients string streamingUrl = VideoCapture1.Network_Streaming_URL; // Display or log the streaming URL for client connections Console.WriteLine($"Streaming available at: {streamingUrl}"); } catch (Exception ex) { // Handle any exceptions during streaming initialization Console.WriteLine($"Streaming error: {ex.Message}"); } ``` ### Advanced Configuration Options #### Custom WMV Profiles For more precise control over your WMV stream, you can implement custom encoding profiles: ```cs // Create custom WMV profile var customProfile = new WMVProfile(); customProfile.VideoCodec = WMVVideoCodec.WMV9; customProfile.AudioCodec = WMVAudioCodec.WMAudioV9; customProfile.VideoBitrate = 1500000; // 1.5 Mbps customProfile.AudioBitrate = 128000; // 128 Kbps customProfile.BufferWindow = 5000; // Buffer window in milliseconds // Apply custom profile wmvOutput.Profile = customProfile; VideoCapture1.Network_Streaming_Output = wmvOutput; ``` ## Client-Side Connection Implementation Clients can connect to the WMV stream using Windows Media Player or any application that supports the Windows Media streaming protocol. 
The connection URL follows this format: ``` http://[server_ip]:[port]/ ``` For example: ``` http://192.168.1.100:12345/ ``` ### Sample Client Connection Code For programmatic connections to the WMV stream in client applications: ```cs // Client-side WMV stream connection using Windows Media Player control using System.Windows.Forms; public partial class StreamViewerForm : Form { public StreamViewerForm(string streamUrl) { InitializeComponent(); // Assuming you have a Windows Media Player control named 'wmPlayer' on your form wmPlayer.URL = streamUrl; wmPlayer.Ctlcontrols.play(); } } ``` ## Performance Optimization When implementing WMV network streaming, consider these optimization strategies: 1. **Adjust bitrate based on network conditions**: Lower bitrates for constrained networks 2. **Balance keyframe intervals**: Frequent keyframes improve seek performance but increase bandwidth 3. **Monitor CPU usage**: WMV encoding can be CPU-intensive; adjust quality settings accordingly 4. **Implement network quality detection**: Adapt streaming parameters dynamically 5. 
**Consider buffer settings**: Larger buffers improve stability but increase latency ## Troubleshooting Common Issues | Issue | Possible Solution | |-------|-------------------| | Connection failures | Verify network port is open in firewall settings | | Poor video quality | Increase bitrate or adjust compression settings | | High CPU usage | Reduce resolution or frame rate | | Client buffering | Adjust buffer window settings or reduce bitrate | | Authentication errors | Verify credentials on both server and client | ## Deployment Requirements ### Required Redistributables To successfully deploy applications using WMV streaming functionality, include the following redistributable packages: - Video capture redist [x86](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.VideoCapture.x86/) [x64](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.VideoCapture.x64/) ### Installation Commands Using NuGet Package Manager: ``` Install-Package VisioForge.DotNet.Core.Redist.VideoCapture.x64 ``` Or for 32-bit systems: ``` Install-Package VisioForge.DotNet.Core.Redist.VideoCapture.x86 ``` ## Conclusion WMV network streaming provides a reliable way to broadcast video content across networks in Windows environments. The VisioForge SDK simplifies implementation with its comprehensive API while giving developers fine-grained control over streaming parameters. By following the guidelines in this document, you can create robust streaming applications that deliver high-quality video content to multiple clients simultaneously. For more advanced implementations and additional code samples, visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples). 
---END OF PAGE--- # Local File: .\dotnet\general\network-streaming\youtube.md --- title: YouTube Live Streaming Integration for .NET Apps description: Learn how to implement YouTube RTMP streaming in .NET applications with step-by-step guidance on video encoders, audio configuration, and cross-platform optimization. Includes code examples and best practices for developers. sidebar_label: YouTube Streaming --- # YouTube Live Streaming with VisioForge SDKs ## Introduction to YouTube Streaming Integration The YouTube RTMP output functionality in VisioForge SDKs enables developers to create robust .NET applications that stream high-quality video content directly to YouTube. This implementation leverages various video and audio encoders to optimize streaming performance across different hardware configurations and platforms. This comprehensive guide provides detailed instructions on setting up, configuring, and troubleshooting YouTube streaming in your applications. ## Supported SDK Platforms [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] All major VisioForge SDK platforms provide cross-platform capabilities for YouTube streaming, ensuring consistent functionality across Windows, macOS, and other operating systems. 
## Understanding the YouTubeOutput Class The `YouTubeOutput` class serves as the primary interface for YouTube streaming configuration, offering extensive customization options including: - **Video encoder selection and configuration**: Choose from multiple hardware-accelerated and software-based encoders - **Audio encoder selection and configuration**: Configure AAC audio encoders with custom parameters - **Custom video and audio processing**: Apply filters and transformations before streaming - **YouTube-specific sink settings**: Fine-tune streaming parameters specific to YouTube's requirements ## Getting Started: Basic Setup Process ### Stream Key Configuration The foundation of any YouTube streaming implementation begins with your YouTube stream key. This authentication token connects your application to your YouTube channel: ```csharp // Initialize YouTube output with your stream key var youtubeOutput = new YouTubeOutput("your-youtube-stream-key"); ``` ## Video Encoder Configuration Options ### Comprehensive Video Encoder Support The SDK provides support for multiple video encoders, each optimized for different hardware environments and performance requirements: | Encoder Type | Platform/Hardware | Performance Characteristics | |--------------|-------------------|----------------------------| | OpenH264 | Cross-platform (software) | CPU-intensive, widely compatible | | NVENC H264 | NVIDIA GPUs | Hardware-accelerated, reduced CPU usage | | QSV H264 | Intel CPUs with Quick Sync | Hardware-accelerated, efficient | | AMF H264 | AMD GPUs | Hardware-accelerated for AMD hardware | | HEVC/H265 | Various (where supported) | Higher compression efficiency | ### Dynamic Encoder Selection The system intelligently selects default encoders based on the platform (OpenH264 on most platforms, Apple Media H264 on macOS). 
Developers can override these defaults to leverage specific hardware capabilities: ```csharp // Example: Using NVIDIA NVENC encoder if available if (NVENCH264EncoderSettings.IsAvailable()) { youtubeOutput.Video = new NVENCH264EncoderSettings(); } ``` ### Configuring Video Encoding Parameters Each encoder supports customization of various parameters to optimize streaming quality and performance: ```csharp var videoSettings = new OpenH264EncoderSettings { Bitrate = 4500000, // 4.5 Mbps KeyframeInterval = 60, // Keyframe every 2 seconds at 30fps // Add other encoder-specific settings as needed }; youtubeOutput.Video = videoSettings; ``` ## Audio Encoder Configuration ### Supported AAC Audio Encoders The SDK supports multiple AAC audio encoders to ensure optimal audio quality across different platforms: - **VO-AAC**: Default for non-Windows platforms, providing consistent audio encoding - **AVENC AAC**: Alternative cross-platform option with different performance characteristics - **MF AAC**: Windows-specific encoder leveraging Media Foundation ### Audio Encoder Configuration Example ```csharp // Example: Configure audio encoder settings var audioSettings = new VOAACEncoderSettings { Bitrate = 128000, // 128 kbps SampleRate = 48000 // 48 kHz (YouTube recommended) }; youtubeOutput.Audio = audioSettings; ``` ## Platform-Specific Optimization Strategies ### Windows-Specific Features - Leverages Media Foundation (MF) encoders for optimal Windows performance - Provides extended HEVC/H265 encoding capabilities - Defaults to MF AAC for audio encoding, optimized for the Windows platform ### macOS Implementation Considerations - Automatically utilizes Apple Media H264 encoder for native performance - Implements VO-AAC for audio encoding with macOS optimization ### Cross-Platform Compatibility Layer - Falls back to OpenH264 for video on platforms without specific optimizations - Utilizes VO-AAC for consistent audio encoding across diverse environments ## Best Practices for 
Optimal Streaming ### Hardware-Aware Encoder Selection - Always verify encoder availability before implementing hardware-accelerated options - Implement fallback mechanisms to OpenH264 when specialized hardware is unavailable - Consider platform-specific encoder capabilities when designing cross-platform applications ### YouTube-Optimized Stream Settings - Adhere to YouTube's recommended bitrates for your target resolution - Implement the standard 2-second keyframe interval (60 frames at 30fps) - Configure 48 kHz audio sample rate to meet YouTube's audio specifications ### Robust Error Management - Develop comprehensive error handling for connection issues - Implement continuous monitoring of encoder performance - Create diagnostic tools to evaluate stream health during operation ## Complete Implementation Examples ### VideoCaptureCoreX/VideoEditCoreX Integration This example demonstrates a complete YouTube streaming implementation with error handling for VideoCaptureCoreX/VideoEditCoreX: ```csharp try { var youtubeOutput = new YouTubeOutput("your-stream-key"); // Configure video encoding if (NVENCH264EncoderSettings.IsAvailable()) { youtubeOutput.Video = new NVENCH264EncoderSettings { Bitrate = 4500000, KeyframeInterval = 60 }; } // Configure audio encoding youtubeOutput.Audio = new MFAACEncoderSettings { Bitrate = 128000, SampleRate = 48000 }; // Additional sink settings if needed youtubeOutput.Sink.CustomProperty = "value"; // Add the output to the video capture instance core.Outputs_Add(youtubeOutput, true); // core is an instance of VideoCaptureCoreX // Or set the output for the video edit instance videoEdit.Output_Format = youtubeOutput; // videoEdit is an instance of VideoEditCoreX } catch (Exception ex) { // Handle initialization errors Console.WriteLine($"Failed to initialize YouTube output: {ex.Message}"); } ``` ### Media Blocks SDK Implementation For developers using the Media Blocks SDK, this example shows how to connect encoder components with the 
YouTube sink: ```csharp // Create the YouTube sink block (using RTMP) var youtubeSinkBlock = new YouTubeSinkBlock(new YouTubeSinkSettings("streaming key")); // Connect the video encoder to the sink block pipeline.Connect(h264Encoder.Output, youtubeSinkBlock.CreateNewInput(MediaBlockPadMediaType.Video)); // Connect the audio encoder to the sink block pipeline.Connect(aacEncoder.Output, youtubeSinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio)); ``` ## Troubleshooting Common Issues ### Encoder Initialization Problems - Verify hardware encoder availability through system diagnostics - Ensure the system meets all requirements for your chosen encoder - Confirm proper installation of hardware-specific drivers for GPU acceleration ### Stream Connection Failures - Validate stream key format and expiration status - Test network connectivity to YouTube's streaming servers - Verify YouTube service status through official channels ### Performance Optimization - Monitor system resource utilization during streaming sessions - Adjust encoding bitrates and settings based on available resources - Consider switching to hardware acceleration when CPU usage is excessive ## Additional Resources and Documentation - [Official YouTube Live Streaming Documentation](https://support.google.com/youtube/topic/9257891) - [YouTube Technical Stream Requirements](https://support.google.com/youtube/answer/2853702) By leveraging these detailed configuration options and best practices, developers can create robust YouTube streaming applications using VisioForge SDKs that deliver high-quality content while optimizing system resource utilization across multiple platforms. ---END OF PAGE--- # Local File: .\dotnet\general\output-formats\avi.md --- title: AVI File Output Guide for .NET SDK Development description: Learn how to implement AVI file output in .NET applications with step-by-step examples.
Covers video and audio encoding options, hardware acceleration, cross-platform support, and best practices for developers working with multimedia container formats. sidebar_label: AVI --- # AVI File Output in VisioForge .NET SDKs [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) AVI (Audio Video Interleave) is a Microsoft-developed multimedia container format that stores both audio and video data in a single file with synchronized playback. It supports both compressed and uncompressed data, offering flexibility while sometimes resulting in larger file sizes. ## Technical Overview of AVI Format AVI files use a RIFF (Resource Interchange File Format) structure to organize data. This format divides content into chunks, with each chunk containing either audio or video frames. Key technical aspects include: - Container format supporting multiple audio and video codecs - Interleaved audio and video data for synchronized playback - Maximum file size of 4GB in standard AVI (extended to 16EB in OpenDML AVI) - Support for multiple audio tracks and subtitles - Widely supported across platforms and media players Despite newer container formats like MP4 and MKV offering more features, AVI remains valuable for certain workflows due to its simplicity and compatibility with legacy systems. 
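The RIFF layout is easy to verify directly: bytes 0–3 of an AVI file spell `RIFF`, bytes 4–7 hold the little-endian size of the RIFF chunk, and bytes 8–11 spell `AVI ` (note the trailing space). The following standalone sketch — independent of the VisioForge API — checks that signature:

```csharp
using System.IO;
using System.Text;

// Check the RIFF signature of an AVI file by reading its 12-byte header.
// Bytes 0-3: "RIFF" tag; bytes 4-7: little-endian chunk size;
// bytes 8-11: "AVI " form type identifying the container.
static bool IsAviFile(string path)
{
    using var fs = File.OpenRead(path);
    var header = new byte[12];
    if (fs.Read(header, 0, 12) < 12)
        return false;

    return Encoding.ASCII.GetString(header, 0, 4) == "RIFF"
        && Encoding.ASCII.GetString(header, 8, 4) == "AVI ";
}

// Example usage: bool ok = IsAviFile("capture.avi");
```

A check like this is handy in tooling that validates capture output before further processing.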
## Cross-Platform AVI Implementation [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] The [AVIOutput](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.Output.AVIOutput.html) class provides a robust way to configure and generate AVI files with various encoding options. ### Setting Up AVI Output Create an `AVIOutput` instance by specifying a target filename: ```csharp var aviOutput = new AVIOutput("output_video.avi"); ``` This constructor automatically initializes default encoders: - Video: OpenH264 encoder - Audio: MP3 encoder ### Video Encoder Options Configure video encoding through the `Video` property with several available encoders: #### Standard Encoder ```csharp // Open-source H.264 encoder for general use aviOutput.Video = new OpenH264EncoderSettings(); ``` #### Hardware-Accelerated Encoders ```csharp // NVIDIA GPU acceleration aviOutput.Video = new NVENCH264EncoderSettings(); // H.264 aviOutput.Video = new NVENCHEVCEncoderSettings(); // HEVC // Intel Quick Sync acceleration aviOutput.Video = new QSVH264EncoderSettings(); // H.264 aviOutput.Video = new QSVHEVCEncoderSettings(); // HEVC // AMD GPU acceleration aviOutput.Video = new AMFH264EncoderSettings(); // H.264 aviOutput.Video = new AMFHEVCEncoderSettings(); // HEVC ``` #### Special Purpose Encoder ```csharp // Motion JPEG for high-quality frame-by-frame encoding aviOutput.Video = new MJPEGEncoderSettings(); ``` ### Audio Encoder Options The `Audio` property lets you configure audio encoding settings: ```csharp // Standard MP3 encoding aviOutput.Audio = new MP3EncoderSettings(); // AAC encoding options aviOutput.Audio = new VOAACEncoderSettings(); aviOutput.Audio = new AVENCAACEncoderSettings(); aviOutput.Audio = new MFAACEncoderSettings(); // Windows only ``` ### Integration with SDK Components #### Video Capture SDK ```csharp var core = new VideoCaptureCoreX(); 
core.Outputs_Add(aviOutput, true); ``` #### Video Edit SDK ```csharp var core = new VideoEditCoreX(); core.Output_Format = aviOutput; ``` #### Media Blocks SDK ```csharp var aac = new VOAACEncoderSettings(); var h264 = new OpenH264EncoderSettings(); var aviSinkSettings = new AVISinkSettings("output.avi"); var aviOutput = new AVIOutputBlock(aviSinkSettings, h264, aac); ``` ### File Management You can get or change the output filename after initialization: ```csharp // Get current filename string currentFile = aviOutput.GetFilename(); // Set new filename aviOutput.SetFilename("new_output.avi"); ``` ### Complete Example Here's a full example showing how to configure AVI output with hardware acceleration: ```csharp // Create AVI output with specified filename var aviOutput = new AVIOutput("high_quality_output.avi"); // Configure hardware-accelerated NVIDIA H.264 encoding aviOutput.Video = new NVENCH264EncoderSettings(); // Configure AAC audio encoding aviOutput.Audio = new VOAACEncoderSettings(); ``` ## Windows-Specific AVI Implementation [!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="VideoEditCore"] The Windows-only components provide additional options for AVI output configuration. 
### Basic Setup Create the AVIOutput object: ```csharp var aviOutput = new AVIOutput(); ``` ### Configuration Methods #### Method 1: Using Settings Dialog ```csharp var aviSettingsDialog = new AVISettingsDialog( VideoCapture1.Video_Codecs.ToArray(), VideoCapture1.Audio_Codecs.ToArray()); aviSettingsDialog.ShowDialog(this); aviSettingsDialog.SaveSettings(ref aviOutput); ``` #### Method 2: Programmatic Configuration First, get available codecs: ```csharp // Populate codec lists foreach (string codec in VideoCapture1.Video_Codecs) { cbVideoCodecs.Items.Add(codec); } foreach (string codec in VideoCapture1.Audio_Codecs) { cbAudioCodecs.Items.Add(codec); } ``` Then set video and audio settings: ```csharp // Configure video aviOutput.Video_Codec = cbVideoCodecs.Text; // Configure audio aviOutput.ACM.Name = cbAudioCodecs.Text; aviOutput.ACM.Channels = 2; aviOutput.ACM.BPS = 16; aviOutput.ACM.SampleRate = 44100; aviOutput.ACM.UseCompression = true; ``` ### Implementation Apply settings and start capture: ```csharp // Set output format VideoCapture1.Output_Format = aviOutput; // Set capture mode VideoCapture1.Mode = VideoCaptureMode.VideoCapture; // Set output file path VideoCapture1.Output_Filename = "output.avi"; // Start capture await VideoCapture1.StartAsync(); ``` ## Best Practices for AVI Output ### Encoder Selection Guidelines 1. **General-Purpose Applications** - OpenH264 provides good compatibility and quality - Suitable for most standard development scenarios 2. **Performance-Critical Applications** - Use hardware-accelerated encoders (NVENC, QSV, AMF) when available - Offers significant performance advantages with minimal quality loss 3. 
**Quality-Focused Applications** - HEVC encoders provide better compression at similar quality - MJPEG for scenarios requiring frame-by-frame accuracy ### Audio Encoding Recommendations - MP3: Good compatibility with reasonable quality - AAC: Better quality-to-size ratio, preferred for newer applications - Choose based on your target platform and quality requirements ### Platform Considerations - Some encoders are platform-specific: - MF HEVC and MF AAC encoders are Windows-only - Hardware-accelerated encoders require appropriate GPU support - Check encoder availability with `GetVideoEncoders()` and `GetAudioEncoders()` when developing cross-platform applications ### Error Handling Tips - Always verify encoder availability before use - Implement fallback encoders for platform-specific scenarios - Check file write permissions before setting output paths ## Troubleshooting Common Issues ### Codec Not Found If you encounter "Codec not found" errors: ```csharp // Check if codec is available before using if (!VideoCapture1.Video_Codecs.Contains("H264")) { // Fall back to another codec or show error MessageBox.Show("H264 codec not available. Please install required codecs."); return; } ``` ### File Write Permission Issues Handle permission-related errors: ```csharp try { // Test write permissions using (var fs = File.Create(outputPath, 1, FileOptions.DeleteOnClose)) { } // If successful, proceed with AVI output aviOutput.SetFilename(outputPath); } catch (UnauthorizedAccessException) { // Handle permission error MessageBox.Show("Cannot write to the specified location. 
Please select another folder."); } ``` ### Memory Issues with Large Files For handling large file recording: ```csharp // Split recording into multiple files when size limit is reached void SetupLargeFileRecording() { var aviOutput = new AVIOutput("recording_part1.avi"); // Set file size limit (3.5 GB = 3,758,096,384 bytes, to stay under the 4 GB AVI limit) aviOutput.MaxFileSize = 3_758_096_384; // Enable auto-split functionality aviOutput.AutoSplit = true; aviOutput.SplitNamingPattern = "recording_part{0}.avi"; // Apply to Video Capture var core = new VideoCaptureCoreX(); core.Outputs_Add(aviOutput, true); } ``` ## Required Dependencies ### Video Capture SDK .Net - [x86 Redist](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.VideoCapture.x86/) - [x64 Redist](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.VideoCapture.x64/) ### Video Edit SDK .Net - [x86 Redist](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.VideoEdit.x86/) - [x64 Redist](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.VideoEdit.x64/) ## Additional Resources - [VisioForge API Documentation](https://api.visioforge.org/dotnet/) - [Sample Projects Repository](https://github.com/visioforge/.Net-SDK-s-samples) - [Support and Community Forums](https://support.visioforge.com/) ---END OF PAGE--- # Local File: .\dotnet\general\output-formats\custom.md --- title: DirectShow Custom Video Format Integration in .NET description: Learn how to implement custom video output formats using DirectShow filters in .NET applications. Step-by-step guide for developers to create specialized video processing pipelines with codec configuration and format handling.
sidebar_label: Custom Output Formats --- # Creating Custom Video Output Formats with DirectShow Filters [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="VideoEditCore"] ## Overview Working with video in .NET applications often requires custom output formats to meet specific project requirements. The VisioForge SDKs provide powerful capabilities to implement custom format outputs using DirectShow filters, giving developers precise control over audio and video processing pipelines. This guide demonstrates practical techniques for implementing custom output formats that work seamlessly with both the Video Capture SDK .NET and Video Edit SDK .NET, allowing you to tailor your video applications to exact specifications. ## Why Use Custom Output Formats? Custom output formats offer several advantages for .NET developers: - Support for specialized video codecs not available in standard formats - Fine-grained control over video and audio compression settings - Integration with third-party DirectShow filters - Ability to create proprietary or industry-specific output formats - Optimization for specific use cases (streaming, archiving, editing) ## Getting Started with CustomOutput The `CustomOutput` class is the cornerstone for configuring custom output settings in VisioForge SDKs. This class enables you to define and configure the filters used in your video processing pipeline. Start by initializing a new instance: ```cs var customOutput = new CustomOutput(); ``` While our examples use the `VideoCaptureCore` class, developers using Video Edit SDK .NET can apply the same techniques with `VideoEditCore`. 
## Implementation Strategies There are two primary approaches to implementing custom format output with DirectShow filters: ### Strategy 1: Three-Component Pipeline This modular approach divides the processing pipeline into three distinct components: 1. Audio codec 2. Video codec 3. Multiplexer (file format container) This separation provides maximum flexibility and control over each stage of the process. You can use either standard DirectShow filters or specialized codecs for audio and video components. #### Retrieving Available Codecs Begin by populating your UI with available codecs and filters: ```cs // Populate video codec options foreach (string codec in VideoCapture1.Video_Codecs) { videoCodecDropdown.Items.Add(codec); } // Populate audio codec options foreach (string codec in VideoCapture1.Audio_Codecs) { audioCodecDropdown.Items.Add(codec); } // Get all available DirectShow filters foreach (string filter in VideoCapture1.DirectShow_Filters) { directShowAudioFilters.Items.Add(filter); directShowVideoFilters.Items.Add(filter); multiplexerFilters.Items.Add(filter); fileWriterFilters.Items.Add(filter); } ``` #### Configuring the Pipeline Components Next, set up your video and audio processing components based on user selections: ```cs // Set up video codec if (useStandardVideoCodec.Checked) { customOutput.Video_Codec = videoCodecDropdown.Text; customOutput.Video_Codec_UseFiltersCategory = false; } else { customOutput.Video_Codec = directShowVideoFilters.Text; customOutput.Video_Codec_UseFiltersCategory = true; } // Set up audio codec if (useStandardAudioCodec.Checked) { customOutput.Audio_Codec = audioCodecDropdown.Text; customOutput.Audio_Codec_UseFiltersCategory = false; } else { customOutput.Audio_Codec = directShowAudioFilters.Text; customOutput.Audio_Codec_UseFiltersCategory = true; } // Configure the multiplexer customOutput.MuxFilter_Name = multiplexerFilters.Text; customOutput.MuxFilter_IsEncoder = false; ``` #### Custom File Writer Configuration For 
specialized outputs that require a dedicated file writer: ```cs // Enable special file writer if needed customOutput.SpecialFileWriter_Needed = useCustomFileWriter.Checked; customOutput.SpecialFileWriter_FilterName = fileWriterFilters.Text; ``` This approach gives you granular control over each stage of the encoding process, making it ideal for complex output requirements. ### Strategy 2: All-in-One Filter This streamlined approach uses a single DirectShow filter that combines the functionality of the multiplexer, video codec, and audio codec. The SDK intelligently handles detection of the filter's capabilities, determining whether it: - Can directly write files without assistance - Requires the standard DirectShow File Writer filter - Needs a specialized file writer filter Implementation is straightforward: ```cs // Populate filter options from available DirectShow filters foreach (string filter in VideoCapture1.DirectShow_Filters) { filterDropdown.Items.Add(filter); } // Configure the all-in-one filter customOutput.MuxFilter_Name = selectedFilter.Text; customOutput.MuxFilter_IsEncoder = true; // Set up specialized file writer if required customOutput.SpecialFileWriter_Needed = requiresCustomWriter.Checked; customOutput.SpecialFileWriter_FilterName = fileWriterFilter.Text; ``` This approach is simpler to implement but offers less granular control over individual components of the encoding process. 
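The choice between the two strategies can be captured in a single hypothetical helper. This is a sketch only: the method name and parameters are illustrative, while the `CustomOutput` properties are the ones shown in the examples above.

```csharp
// Sketch: configure a CustomOutput for either the three-component pipeline
// (separate codecs plus multiplexer) or an all-in-one encoder filter.
static CustomOutput BuildCustomOutput(
    bool allInOneFilter,
    string muxFilterName,
    string videoCodec = null,
    string audioCodec = null)
{
    var output = new CustomOutput();

    if (allInOneFilter)
    {
        // Strategy 2: one DirectShow filter encodes and muxes everything.
        output.MuxFilter_Name = muxFilterName;
        output.MuxFilter_IsEncoder = true;
    }
    else
    {
        // Strategy 1: separate video codec, audio codec, and a plain multiplexer.
        output.Video_Codec = videoCodec;
        output.Audio_Codec = audioCodec;
        output.MuxFilter_Name = muxFilterName;
        output.MuxFilter_IsEncoder = false;
    }

    return output;
}
```

Callers would still set `SpecialFileWriter_Needed` and `SpecialFileWriter_FilterName` afterwards when the chosen filter cannot write files on its own.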
## Simplifying Configuration with Dialog UI For a more user-friendly implementation, VisioForge provides a built-in settings dialog that handles the configuration of custom formats: ```cs // Create and configure the settings dialog CustomFormatSettingsDialog settingsDialog = new CustomFormatSettingsDialog( VideoCapture1.Video_Codecs.ToArray(), VideoCapture1.Audio_Codecs.ToArray(), VideoCapture1.DirectShow_Filters.ToArray()); // Apply settings to your CustomOutput instance settingsDialog.SaveSettings(ref customOutput); ``` This dialog provides a complete UI for configuring all aspects of custom output formats, saving development time while still offering full flexibility. ## Implementing the Output Process After configuring your custom format settings, you need to apply them to your capture or edit process: ```cs // Apply the custom format configuration VideoCapture1.Output_Format = customOutput; // Set the capture mode VideoCapture1.Mode = VideoCaptureMode.VideoCapture; // Specify output file path VideoCapture1.Output_Filename = "output_video.mp4"; // Start the capture or encoding process await VideoCapture1.StartAsync(); ``` ## Performance Considerations When implementing custom output formats, keep these performance tips in mind: - DirectShow filters vary in efficiency and resource usage - Test your filter combinations with typical input media - Some third-party filters may introduce additional latency - Consider memory usage when processing high-resolution video - Filter compatibility may vary across different Windows versions ## Required Packages To use custom DirectShow filters, ensure you have the appropriate redistributable packages installed: ### Video Capture SDK .Net - [x86 Package](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.VideoCapture.x86/) - [x64 Package](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.VideoCapture.x64/) ### Video Edit SDK .Net - [x86 
Package](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.VideoEdit.x86/) - [x64 Package](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.VideoEdit.x64/) ## Troubleshooting Common issues when working with custom DirectShow filters include: - Filter compatibility conflicts - Missing codecs or dependencies - Registration issues with COM components - Memory leaks in third-party filters - Performance bottlenecks with complex filter graphs If you encounter problems, verify that all required filters are properly registered on your system and that you have the latest versions of both the filters and the VisioForge SDK. ## Conclusion Custom output formats using DirectShow filters provide powerful capabilities for .NET developers working with video applications. Whether you choose the flexibility of a three-component pipeline or the simplicity of an all-in-one filter approach, VisioForge's SDKs give you the tools you need to create exactly the output format your application requires. --- For more code samples and implementation examples, visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples). ---END OF PAGE--- # Local File: .\dotnet\general\output-formats\ffmpeg-exe.md --- title: FFMPEG Integration for VisioForge Video SDKs description: Implement powerful FFMPEG.exe output in VisioForge .Net SDKs for video capture, editing, and processing. Learn how to configure video codecs, hardware acceleration, custom encoding parameters, and optimize performance for professional video applications. 
sidebar_label: FFMPEG (exe) --- # FFMPEG.exe Integration with VisioForge .Net SDKs [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="VideoEditCore"] ## Introduction to FFMPEG Output in .NET This guide provides detailed instructions for implementing FFMPEG.exe output in Windows applications using VisioForge's .NET SDKs. The integration works with both [Video Capture SDK .NET](https://www.visioforge.com/video-capture-sdk-net) and [Video Edit SDK .NET](https://www.visioforge.com/video-edit-sdk-net), utilizing the `VideoCaptureCore` and `VideoEditCore` engines. FFMPEG functions as a powerful multimedia framework that enables developers to output to a wide variety of video and audio formats. Its flexibility stems from extensive codec support and granular control over encoding parameters for both video and audio streams. ## Why Use FFMPEG with VisioForge SDKs? 
Integrating FFMPEG into your VisioForge-powered applications provides several technical advantages: - **Format versatility**: Support for virtually all modern container formats - **Codec flexibility**: Access to both open-source and proprietary codecs - **Performance optimization**: Options for CPU and GPU acceleration - **Customization depth**: Fine-grained control over encoding parameters - **Cross-platform compatibility**: Consistent output on different systems ## Key Features and Capabilities ### Supported Output Formats FFMPEG supports numerous container formats, including but not limited to: - MP4 (MPEG-4 Part 14) - WebM (VP8/VP9 with Vorbis/Opus) - MKV (Matroska) - AVI (Audio Video Interleave) - MOV (QuickTime) - WMV (Windows Media Video) - FLV (Flash Video) - TS (MPEG Transport Stream) ### Hardware Acceleration Options Modern video encoding benefits from hardware acceleration technologies that significantly improve encoding speed and efficiency: - **Intel QuickSync**: Leverages Intel integrated graphics for H.264 and HEVC encoding - **NVIDIA NVENC**: Utilizes NVIDIA GPUs for accelerated encoding (requires compatible NVIDIA graphics card) - **AMD AMF/VCE**: Employs AMD graphics processors for encoding acceleration ### Video Codec Support The integration offers access to multiple video codecs with customizable parameters: - **H.264/AVC**: Industry standard with excellent quality-to-size ratio - **H.265/HEVC**: Higher efficiency codec for 4K+ content - **VP9**: Google's open video codec used in WebM - **AV1**: Next-generation open codec (where supported) - **MPEG-2**: Legacy codec for DVD and broadcast compatibility - **ProRes**: Professional codec for editing workflows ## Implementation Process ### 1. Setting Up Your Development Environment Before implementing FFMPEG output, ensure your development environment is properly configured: 1. Create a new or open an existing .NET project 2. Install the appropriate VisioForge SDK NuGet packages 3. 
Add FFMPEG dependency packages (detailed in the Dependencies section) 4. Import the necessary namespaces in your code: ```csharp using VisioForge.Core.Types; using VisioForge.Core.Types.VideoCapture; using VisioForge.Core.Types.VideoEdit; ``` ### 2. Initializing FFMPEG Output Start by creating an instance of `FFMPEGEXEOutput` to handle your output configuration: ```csharp var ffmpegOutput = new FFMPEGEXEOutput(); ``` This object will serve as the container for all your encoding settings and preferences. ### 3. Configuring Output Container Format Set your desired output container format using the `OutputMuxer` property: ```csharp ffmpegOutput.OutputMuxer = OutputMuxer.MP4; ``` Other common container options include: - `OutputMuxer.MKV` - For Matroska container - `OutputMuxer.WebM` - For WebM format - `OutputMuxer.AVI` - For AVI format - `OutputMuxer.MOV` - For QuickTime container ### 4. Video Encoder Configuration FFMPEG provides multiple video encoder options. Select and configure the appropriate encoder based on your requirements and available hardware: #### Standard CPU-Based H.264 Encoding ```csharp var videoEncoder = new H264MFSettings { Bitrate = 5000000, RateControlMode = RateControlMode.CBR }; ffmpegOutput.Video = videoEncoder; ``` #### Hardware-Accelerated NVIDIA Encoding ```csharp var nvidiaEncoder = new H264NVENCSettings { Bitrate = 8000000, // 8 Mbps }; ffmpegOutput.Video = nvidiaEncoder; ``` #### Hardware-Accelerated Intel QuickSync Encoding ```csharp var intelEncoder = new H264QSVSettings { Bitrate = 6000000 }; ffmpegOutput.Video = intelEncoder; ``` #### HEVC/H.265 Encoding for Higher Efficiency ```csharp var hevcEncoder = new HEVCQSVSettings { Bitrate = 3000000, }; ffmpegOutput.Video = hevcEncoder; ``` ### 5. 
Audio Encoder Configuration Configure your audio encoding settings based on quality requirements and target platform compatibility: ```csharp var audioEncoder = new BasicAudioSettings { Bitrate = 192000, // 192 kbps Channels = 2, // Stereo SampleRate = 48000, // 48 kHz - professional standard Encoder = AudioEncoder.AAC, Mode = AudioMode.CBR }; ffmpegOutput.Audio = audioEncoder; ``` ### 6. Final Configuration and Execution Apply all settings and start the encoding process: ```csharp // Apply format settings core.Output_Format = ffmpegOutput; // Set operation mode core.Mode = VideoCaptureMode.VideoCapture; // For Video Capture SDK // core.Mode = VideoEditMode.Convert; // For Video Edit SDK // Set output path core.Output_Filename = "output.mp4"; // Begin processing await core.StartAsync(); ``` ## Required Dependencies Install the following NuGet packages based on your target architecture to ensure proper functionality: ### Video Capture SDK Dependencies ```cmd Install-Package VisioForge.DotNet.Core.Redist.VideoCapture.x64 Install-Package VisioForge.DotNet.Core.Redist.FFMPEGEXE.x64 ``` For x86 targets: ```cmd Install-Package VisioForge.DotNet.Core.Redist.VideoCapture.x86 Install-Package VisioForge.DotNet.Core.Redist.FFMPEGEXE.x86 ``` ### Video Edit SDK Dependencies ```cmd Install-Package VisioForge.DotNet.Core.Redist.VideoEdit.x64 Install-Package VisioForge.DotNet.Core.Redist.FFMPEGEXE.x64 ``` For x86 targets: ```cmd Install-Package VisioForge.DotNet.Core.Redist.VideoEdit.x86 Install-Package VisioForge.DotNet.Core.Redist.FFMPEGEXE.x86 ``` ## Troubleshooting and Optimization ### Common Issues and Solutions - **Codec not found errors**: Ensure you've installed the correct FFMPEG package with proper codec support - **Hardware acceleration failures**: Verify GPU compatibility and driver versions - **Performance issues**: Adjust thread count and encoding preset based on available CPU resources - **Output quality problems**: Fine-tune bitrate, profile, and encoding 
parameters ### Performance Optimization Tips - Use hardware acceleration when available - Choose appropriate presets based on your quality/speed requirements - Set reasonable bitrates based on content type and resolution - Consider two-pass encoding for non-realtime scenarios requiring highest quality ## Additional Resources For more code samples and implementation examples, visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples). To learn more about FFMPEG parameters and capabilities, refer to the [official FFMPEG documentation](https://ffmpeg.org/documentation.html). ---END OF PAGE--- # Local File: .\dotnet\general\output-formats\gif.md --- title: GIF Animation Encoding for .NET Development description: Learn how to implement and optimize GIF animation encoding in .NET applications. Explore frame rate control, resolution settings, and performance tuning with detailed code examples for both cross-platform and Windows environments. sidebar_label: GIF --- # GIF Encoder [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) The GIF encoder is a component of the VisioForge SDK that enables video encoding to the GIF format. This document provides detailed information about the GIF encoder settings and implementation guidelines. ## Cross-platform GIF output [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] The GIF encoder settings are managed through the `GIFEncoderSettings` class, which provides configuration options for controlling the encoding process. ### Properties 1. 
**Repeat** - Type: `uint` - Description: Controls the number of times the GIF animation will repeat - Values: - `uint.MaxValue` (the unsigned equivalent of `-1`): loop forever - `0..n`: finite number of repetitions (`0` plays once) 2. **Speed** - Type: `int` - Description: Controls the encoding speed - Range: 1 to 30 (higher values result in faster encoding) - Default: 10 ## Implementation Guide ### Basic Usage Here's a basic example of how to configure and use the GIF encoder: ```csharp using VisioForge.Core.Types.X.VideoEncoders; // Create and configure GIF encoder settings var settings = new GIFEncoderSettings { Repeat = 0, // Play once Speed = 15 // Set encoding speed to 15 }; ``` ### Advanced Configuration For finer control over GIF encoding, adjust the settings to match your specific needs: ```csharp // Configure for an infinitely looping GIF with maximum encoding speed var settings = new GIFEncoderSettings { Repeat = uint.MaxValue, // Loop forever Speed = 30 // Maximum encoding speed }; // Configure for optimal quality var qualitySettings = new GIFEncoderSettings { Repeat = 1, // Play twice Speed = 1 // Slowest encoding speed for best quality }; ``` ## Best Practices 1. **Speed Selection** - For best quality, use lower speed values (1-5) - For balanced quality and performance, use medium speed values (6-15) - For fastest encoding, use higher speed values (16-30) 2. **Memory Considerations** - Higher speed values consume more memory during encoding - For large videos, consider using lower speed values to manage memory usage 3.
**Loop Configuration** - Use `Repeat = uint.MaxValue` for infinite loops - Set specific repeat counts for presentation-style GIFs - Use `Repeat = 0` for single-play GIFs ## Performance Optimization When encoding videos to GIF format, consider these optimization strategies: ```csharp // Optimize for web delivery var webOptimizedSettings = new GIFEncoderSettings { Repeat = uint.MaxValue, // Infinite loop for web playback Speed = 20 // Fast encoding for web content }; // Optimize for quality var qualityOptimizedSettings = new GIFEncoderSettings { Repeat = 1, // Single repeat Speed = 3 // Slower encoding for better quality }; ``` ### Example Implementation Here's a complete example showing how to set up GIF output in each SDK. Add the GIF output to the Video Capture SDK core instance: ```csharp var core = new VideoCaptureCoreX(); core.Outputs_Add(gifOutput, true); ``` Set the output format for the Video Edit SDK core instance: ```csharp var core = new VideoEditCoreX(); core.Output_Format = gifOutput; ``` Create a Media Blocks GIF output instance: ```csharp var gifSettings = new GIFEncoderSettings(); var gifOutput = new GIFEncoderBlock(gifSettings, "output.gif"); ``` ## Windows-only GIF output [!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="VideoEditCore"] The `AnimatedGIFOutput` class is a specialized configuration class within the `VisioForge.Core.Types.Output` namespace that handles settings for generating animated GIF files. This class is designed to work with both video capture and video editing operations, implementing both `IVideoEditBaseOutput` and `IVideoCaptureBaseOutput` interfaces. The primary purpose of this class is to provide a configuration container for controlling how video content is converted into animated GIF format. It allows users to specify key parameters such as frame rate and output dimensions, which are crucial for creating optimized animated GIFs from video sources.
### Properties #### ForcedVideoHeight - Type: `int` - Purpose: Specifies a forced height for the output GIF - Usage: Set this property when you need to resize the output GIF to a specific height, regardless of the input video dimensions - Example: `gifOutput.ForcedVideoHeight = 480;` #### ForcedVideoWidth - Type: `int` - Purpose: Specifies a forced width for the output GIF - Usage: Set this property when you need to resize the output GIF to a specific width, regardless of the input video dimensions - Example: `gifOutput.ForcedVideoWidth = 640;` #### FrameRate - Type: `VideoFrameRate` - Default Value: 2 frames per second - Purpose: Controls how many frames per second the output GIF will contain - Note: The default value of 2 fps is chosen to balance file size and animation smoothness for typical GIF usage ### Constructor ```csharp public AnimatedGIFOutput() ``` The constructor initializes a new instance with default settings: - Sets the frame rate to 2 fps using `new VideoFrameRate(2)` - All other properties are initialized to their default values ### Serialization Methods #### Save() - Returns: `string` - Purpose: Serializes the current configuration to JSON format - Usage: Use this method when you need to save or transfer the configuration - Example: ```csharp var gifOutput = new AnimatedGIFOutput(); gifOutput.ForcedVideoWidth = 800; string jsonConfig = gifOutput.Save(); ``` #### Load(string json) - Parameters: `json` - A JSON string containing serialized configuration - Returns: `AnimatedGIFOutput` - Purpose: Creates a new instance from a JSON configuration string - Usage: Use this method to restore a previously saved configuration - Example: ```csharp string jsonConfig = "..."; // Your saved JSON configuration var gifOutput = AnimatedGIFOutput.Load(jsonConfig); ``` ### Best Practices and Usage Guidelines 1. 
Frame Rate Considerations - The default 2 fps is suitable for most basic animations - Increase the frame rate for smoother animations, but be aware of file size implications - Consider using higher frame rates (e.g., 10-15 fps) for complex motion 2. Resolution Settings - Only set ForcedVideoWidth/Height when you specifically need to resize - Maintain aspect ratio by setting width and height proportionally - Consider target platform limitations when choosing dimensions 3. Performance Optimization - Lower frame rates result in smaller file sizes - Consider the balance between quality and file size based on your use case - Test different configurations to find the optimal settings for your needs ### Example Usage Here's a complete example of configuring and using the AnimatedGIFOutput class: ```csharp // Create a new instance with default settings var gifOutput = new AnimatedGIFOutput(); // Configure the output settings gifOutput.ForcedVideoWidth = 800; gifOutput.ForcedVideoHeight = 600; gifOutput.FrameRate = new VideoFrameRate(5); // 5 fps // Apply the settings to the core core.Output_Format = gifOutput; // core is an instance of VideoCaptureCore or VideoEditCore core.Output_Filename = "output.gif"; ``` ### Common Scenarios and Solutions #### Creating Web-Optimized GIFs ```csharp var webGifOutput = new AnimatedGIFOutput { ForcedVideoWidth = 480, ForcedVideoHeight = 270, FrameRate = new VideoFrameRate(5) }; ``` #### High-Quality Animation Settings ```csharp var highQualityGif = new AnimatedGIFOutput { FrameRate = new VideoFrameRate(15) }; ``` ---END OF PAGE--- # Local File: .\dotnet\general\output-formats\index.md --- title: Video & Audio Format Guide for .NET Development description: Learn about video and audio formats for .NET applications - from MP4 and WebM to AVI and MKV. Includes practical implementation examples, codec comparisons, and a detailed compatibility matrix for developers. 
sidebar_label: Output Formats order: 17 --- # Output Formats for .NET Media SDKs [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Introduction The VisioForge .NET SDKs support a wide range of output formats for video, audio, and media projects. Selecting the right format is crucial for ensuring compatibility, optimizing file size, and maintaining quality appropriate for your target platform. This guide covers all available formats, their technical specifications, use cases, and implementation details to help developers make informed decisions. ## Choosing the Right Format When selecting an output format, consider these key factors: - **Target platform** - Some formats work better on specific devices or browsers - **Quality requirements** - Different codecs provide varying levels of quality at different bitrates - **File size constraints** - Some formats offer better compression than others - **Processing overhead** - Encoding complexity varies between formats - **Streaming requirements** - Certain formats are optimized for streaming scenarios ## Video Container Formats ### AVI (Audio Video Interleave) [AVI](avi.md) is a classic container format developed by Microsoft that supports various video and audio codecs. **Key features:** - Wide compatibility with Windows applications - Supports virtually any DirectShow-compatible video and audio codec - Simple structure makes it reliable for video editing workflows - Better suited for archiving than streaming ### MP4 (MPEG-4 Part 14) [MP4](mp4.md) is one of the most versatile and widely used container formats in modern applications. 
**Key features:** - Excellent compatibility across devices and platforms - Supports advanced codecs including H.264, H.265/HEVC, and AAC - Optimized for streaming and progressive download - Efficient storage with good quality-to-size ratio **Supported video codecs:** - H.264 (AVC) - Balance of quality and compatibility - H.265 (HEVC) - Better compression but higher encoding overhead - MPEG-4 Part 2 - Older codec with wider compatibility **Supported audio codecs:** - AAC - Industry standard for digital audio compression - MP3 - Widely supported legacy format ### WebM [WebM](webm.md) is an open-source container format designed specifically for web use. **Key features:** - Royalty-free format ideal for web applications - Native support in most modern browsers - Excellent for streaming video content - Supports VP8, VP9, and AV1 video codecs **Technical considerations:** - VP9 offers ~50% bitrate reduction compared to H.264 at similar quality - AV1 provides even better compression but with significantly higher encoding complexity - Works well with HTML5 video elements without plugins ### MKV (Matroska) [MKV](mkv.md) is a flexible container format that can hold virtually any type of audio or video. **Key features:** - Supports multiple audio, video, and subtitle tracks - Can contain almost any codec - Great for archiving and high-quality storage - Supports chapters and attachments **Best uses:** - Media archives requiring multiple tracks - High-quality video storage - Projects requiring complex chapter structures ### Additional Container Formats - [MOV](mov.md) - Apple's QuickTime container format - [MPEG-TS](mpegts.md) - Transport Stream format optimized for broadcasting - [MXF](mxf.md) - Material Exchange Format used in professional video production - [Windows Media Video](wmv.md) - Microsoft's proprietary format ## Audio-Only Formats ### MP3 (MPEG-1 Audio Layer III) [MP3](../audio-encoders/mp3.md) remains one of the most widely supported audio formats. 
**Key features:** - Near-universal compatibility - Configurable bitrate for quality vs. size tradeoffs - VBR (Variable Bit Rate) option for optimized file sizes ### AAC in M4A Container [M4A](../audio-encoders/aac.md) provides better audio quality than MP3 at the same bitrate. **Key features:** - Better compression efficiency than MP3 - Good compatibility with modern devices - Supports advanced audio features like multichannel audio ### Other Audio Formats - [FLAC](../audio-encoders/flac.md) - Lossless audio format for high-quality archiving - [OGG Vorbis](../audio-encoders/vorbis.md) - Open-source alternative to MP3 with better quality at lower bitrates ## Specialized Formats ### GIF (Graphics Interchange Format) [GIF](gif.md) is useful for creating short, silent animations. **Key features:** - Wide web compatibility - Limited to 256 colors per frame - Support for transparency - Ideal for short, looping animations ### Custom Output Format [Custom output format](custom.md) allows integration with third-party DirectShow filters. **Key features:** - Maximum flexibility for specialized requirements - Integration with commercial or custom codecs - Support for proprietary formats ## Advanced Output Options ### FFMPEG Integration [FFMPEG EXE](ffmpeg-exe.md) integration provides access to FFMPEG's extensive codec library. **Key features:** - Support for virtually any format FFMPEG can handle - Advanced encoding options - Custom command line arguments for fine-tuned control ## Performance Optimization Tips When working with video output formats, consider these optimization strategies: 1. **Match format to use case** - Use streaming-optimized formats for web delivery 2. **Consider hardware acceleration** - Many modern codecs support GPU acceleration 3. **Use appropriate bitrates** - Higher isn't always better; find the sweet spot for your content 4. **Test on target devices** - Ensure compatibility before finalizing format choice 5. 
**Enable multi-threading** - Take advantage of multiple CPU cores for faster encoding ## Implementation Best Practices - Configure proper keyframe intervals for streaming formats - Set appropriate bitrate constraints for target platforms - Use two-pass encoding for highest quality output when time permits - Consider audio quality requirements alongside video format decisions ## Format Compatibility Matrix | Format | Windows | macOS | iOS | Android | Web Browsers | |--------|---------|-------|-----|---------|--------------| | MP4 (H.264) | ✓ | ✓ | ✓ | ✓ | ✓ | | WebM (VP9) | ✓ | ✓ | Partial | ✓ | ✓ | | MKV | ✓ | With players | With players | With players | ✗ | | AVI | ✓ | With players | Limited | Limited | ✗ | | MP3 | ✓ | ✓ | ✓ | ✓ | ✓ | --- Visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples) for more code samples and implementation examples. Our documentation is continuously updated to reflect new features and optimizations available in the latest SDK releases. ---END OF PAGE--- # Local File: .\dotnet\general\output-formats\mkv.md --- title: MKV Container Format for .NET Video Applications description: Learn how to implement MKV output in .NET applications with hardware-accelerated encoding, multiple audio tracks, and custom video processing. Master video and audio encoding options for high-performance multimedia applications. 
sidebar_label: MKV (Matroska) --- # MKV Output in VisioForge .NET SDKs [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] ## Introduction to MKV Format MKV (Matroska Video) is a flexible, open-standard container format that can hold an unlimited number of video, audio, and subtitle tracks in one file. The VisioForge SDKs provide robust support for MKV output with various encoding options to meet diverse development requirements. This format is particularly valuable for developers working on applications that require: - Multiple audio tracks or languages - High-quality video with multiple codec options - Cross-platform compatibility - Support for metadata and chapters ## Getting Started with MKV Output The `MKVOutput` class serves as the primary interface for generating MKV files in VisioForge SDKs. You can initialize it with default settings or specify custom encoders to match your application's needs. ### Basic Implementation ```csharp // Create MKV output with default encoders var mkvOutput = new MKVOutput("output.mkv"); // Or specify custom encoders during initialization var videoEncoder = new NVENCH264EncoderSettings(); var audioEncoder = new MFAACEncoderSettings(); var mkvOutput = new MKVOutput("output.mkv", videoEncoder, audioEncoder); ``` ## Video Encoding Options The MKV format supports multiple video codecs, giving developers flexibility in balancing quality, performance, and compatibility. VisioForge SDKs offer both software and hardware-accelerated encoders. 
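Because hardware encoders are only usable on machines with the matching GPU, a common pattern is to probe availability at runtime and fall back to software encoding. Here is a minimal sketch, assuming the `IsAvailable()` pattern shown in the MP4 and MOV sections of this documentation also applies to these encoder settings classes:

```csharp
// Probe for a hardware H.264 encoder and fall back to OpenH264 (software).
// NVENCH264EncoderSettings, QSVH264EncoderSettings, and OpenH264EncoderSettings
// are the encoder settings classes described in the sections below;
// IsAvailable() is assumed to report whether the matching hardware is present.
var mkvOutput = new MKVOutput("output.mkv");

if (NVENCH264EncoderSettings.IsAvailable())
{
    mkvOutput.Video = new NVENCH264EncoderSettings(); // NVIDIA GPU
}
else if (QSVH264EncoderSettings.IsAvailable())
{
    mkvOutput.Video = new QSVH264EncoderSettings(); // Intel Quick Sync
}
else
{
    mkvOutput.Video = new OpenH264EncoderSettings(); // CPU fallback
}
```

This keeps the same `MKVOutput` configuration working across machines with and without GPU acceleration.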
### H.264 Encoder Options H.264 remains one of the most widely supported video codecs, providing excellent compression and quality: - **OpenH264**: Software-based encoder, used as default when hardware acceleration isn't available - **NVENC H.264**: NVIDIA GPU-accelerated encoding for superior performance - **QSV H.264**: Intel Quick Sync Video technology for hardware acceleration - **AMF H.264**: AMD GPU-accelerated encoding option ### HEVC (H.265) Encoder Options For applications requiring higher compression efficiency or 4K content: - **MF HEVC**: Windows Media Foundation implementation (Windows-only) - **NVENC HEVC**: NVIDIA GPU acceleration for H.265 - **QSV HEVC**: Intel Quick Sync implementation for H.265 - **AMF HEVC**: AMD GPU acceleration for H.265 encoding ### Setting a Video Encoder ```csharp mkvOutput.Video = new NVENCH264EncoderSettings(); ``` ## Audio Encoding Options Audio quality is equally important for most applications. VisioForge SDKs provide several audio encoder options for MKV output: ### Supported Audio Codecs - **AAC Encoders**: - **VO AAC**: Default choice for non-Windows platforms - **AVENC AAC**: FFMPEG AAC implementation - **MF AAC**: Windows Media Foundation implementation (default on Windows) - **Alternative Audio Formats**: - **MP3**: Common format with wide compatibility - **Vorbis**: Open source audio codec - **OPUS**: Modern codec with excellent quality-to-size ratio ### Configuring Audio Encoding ```csharp // Platform-specific audio encoder selection #if NET_WINDOWS var aacSettings = new MFAACEncoderSettings { Bitrate = 192, SampleRate = 48000 }; mkvOutput.Audio = aacSettings; #else var aacSettings = new VOAACEncoderSettings { Bitrate = 192, SampleRate = 44100 }; mkvOutput.Audio = aacSettings; #endif // Or use OPUS for better quality at lower bitrates var opusSettings = new OPUSEncoderSettings { Bitrate = 128, Channels = 2 }; mkvOutput.Audio = opusSettings; ``` ## Advanced MKV Configuration ### Custom Video and Audio Processing 
For applications that require special processing, you can integrate custom MediaBlock processors: ```csharp // Add a video processor for effects or transformations var textOverlayBlock = new TextOverlayBlock(new TextOverlaySettings("Hello world!")); mkvOutput.CustomVideoProcessor = textOverlayBlock; // Add audio processing var volumeBlock = new VolumeBlock() { Level = 1.2 }; // Boost volume by 20% mkvOutput.CustomAudioProcessor = volumeBlock; ``` ### Sink Settings Management Control output file properties through the sink settings: ```csharp // Change output filename mkvOutput.Sink.Filename = "processed_output.mkv"; // Get current filename string currentFile = mkvOutput.GetFilename(); // Update filename with timestamp string timestamp = DateTime.Now.ToString("yyyyMMdd_HHmmss"); mkvOutput.SetFilename($"recording_{timestamp}.mkv"); ``` ## Integration with VisioForge SDK Components ### With Video Capture SDK ```csharp // Initialize capture core var captureCore = new VideoCaptureCoreX(); // Configure video and audio source // ... // Add MKV output to recording pipeline var mkvOutput = new MKVOutput("capture.mkv"); captureCore.Outputs_Add(mkvOutput, true); // Start recording await captureCore.StartAsync(); ``` ### With Video Edit SDK ```csharp // Initialize editing core var editCore = new VideoEditCoreX(); // Add input sources // ... 
// Configure MKV output with hardware acceleration var h265Encoder = new NVENCHEVCEncoderSettings { Bitrate = 10000 }; var mkvOutput = new MKVOutput("edited.mkv", h265Encoder); editCore.Output_Format = mkvOutput; // Process the file await editCore.StartAsync(); ``` ### With Media Blocks SDK ```csharp // Create a pipeline var pipeline = new MediaBlocksPipeline(); // Add a source block (file, camera, etc.) // var sourceBlock = ...; // Configure MKV output var aacEncoder = new VOAACEncoderSettings(); var h264Encoder = new OpenH264EncoderSettings(); var mkvSinkSettings = new MKVSinkSettings("processed.mkv"); var mkvOutput = new MKVOutputBlock(mkvSinkSettings, h264Encoder, aacEncoder); // Connect blocks and run the pipeline pipeline.Connect(sourceBlock.VideoOutput, h264Encoder.Input); pipeline.Connect(h264Encoder.Output, mkvOutput.CreateNewInput(MediaBlockPadMediaType.Video)); pipeline.Connect(sourceBlock.AudioOutput, aacEncoder.Input); pipeline.Connect(aacEncoder.Output, mkvOutput.CreateNewInput(MediaBlockPadMediaType.Audio)); pipeline.Connect(mkvOutput.Output, pipeline.Sink); // Start the pipeline await pipeline.StartAsync(); ``` ## Hardware Acceleration Benefits Hardware-accelerated encoding offers significant advantages for developers building real-time or batch processing applications: 1. **Reduced CPU Load**: Offloads encoding to dedicated hardware 2. **Faster Processing**: Up to 5-10x performance improvement 3. **Power Efficiency**: Lower energy consumption, important for mobile apps 4. **Higher Quality**: Some hardware encoders provide better quality-per-bitrate ## Best Practices for Developers When implementing MKV output in your applications, consider these recommendations: 1. **Always check hardware availability** before using GPU-accelerated encoders 2. **Select appropriate bitrates** based on content type and resolution 3. **Use platform-specific encoders** where possible for optimal performance 4.
**Test on target platforms** to ensure compatibility 5. **Consider quality-size trade-offs** based on your application's needs ## Conclusion The MKV format provides developers with a flexible, robust container for video content in .NET applications. With VisioForge SDKs, you can leverage hardware acceleration, advanced encoding options, and custom processing to create high-performance video applications. By understanding the available encoders and configuration options, you can optimize your implementation for specific hardware platforms while maintaining cross-platform compatibility where needed. ---END OF PAGE--- # Local File: .\dotnet\general\output-formats\mov.md --- title: MOV File Encoding with VisioForge .NET SDKs description: Learn how to implement high-performance MOV file output in your .NET applications using VisioForge SDKs. This developer guide covers hardware-accelerated encoding options, cross-platform implementation, audio/video configuration, and integration workflows for professional video applications. sidebar_label: MOV --- # MOV File Output for .NET Video Applications [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] ## Introduction to MOV Output in VisioForge The MOV container format is widely used for video storage in professional environments and Apple ecosystems. VisioForge's .NET SDKs provide robust cross-platform support for generating MOV files with customizable encoding options. 
The `MOVOutput` class serves as the primary interface for configuring and generating these files across Windows, macOS, and Linux environments. MOV files created with VisioForge SDKs can leverage hardware acceleration through NVIDIA, Intel, and AMD encoders, making them ideal for performance-critical applications. This guide walks through the essential steps for implementing MOV output in .NET video applications. ### When to Use MOV Format MOV is particularly well-suited for: - Video editing workflows - Projects requiring Apple ecosystem compatibility - Professional video production pipelines - Applications needing metadata preservation - High-quality archival purposes ## Getting Started with MOV Output The `MOVOutput` class ([API reference](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.X.Output.MOVOutput.html)) provides the foundation for MOV file generation with VisioForge SDKs. It encapsulates the configuration of video and audio encoders, processing parameters, and sink settings. ### Basic Implementation Creating a MOV output requires minimal code: ```csharp // Create a MOV output targeting the specified filename var movOutput = new MOVOutput("output.mov"); ``` This simple implementation automatically: - Selects NVENC H264 encoder if available (falls back to OpenH264) - Chooses the appropriate AAC encoder for your platform (MF AAC on Windows, VO-AAC elsewhere) - Configures MOV container settings for broad compatibility ### Default Configuration Behavior The default configuration delivers balanced performance and compatibility across platforms. However, for specialized use cases, you'll likely need to customize encoder settings, which we'll cover in the following sections. ## Video Encoder Options for MOV Files MOV output supports a variety of video encoders to accommodate different performance, quality, and compatibility requirements. The choice of encoder significantly impacts processing speed, resource consumption, and output quality. 
### Supported Video Encoders The MOV output supports these video encoders: | Encoder | Technology | Platform | Best For | |---------|------------|----------|----------| | OpenH264 | Software | Cross-platform | Compatibility | | NVENC H264 | NVIDIA GPU | Cross-platform | Performance | | QSV H264 | Intel GPU | Cross-platform | Efficiency | | AMF H264 | AMD GPU | Cross-platform | Performance | | MF HEVC | Software | Windows only | Quality | | NVENC HEVC | NVIDIA GPU | Cross-platform | Quality/Performance | | QSV HEVC | Intel GPU | Cross-platform | Efficiency | | AMF H265 | AMD GPU | Cross-platform | Quality/Performance | ### Configuring Video Encoders Set a specific video encoder with code like this: ```csharp // For NVIDIA hardware-accelerated encoding movOutput.Video = new NVENCH264EncoderSettings() { Bitrate = 5000000, // 5 Mbps }; // For software-based encoding with OpenH264 movOutput.Video = new OpenH264EncoderSettings() { RateControl = RateControlMode.VBR, Bitrate = 2500000 // 2.5 Mbps }; ``` ### Encoder Selection Strategy When implementing MOV output, consider these factors for encoder selection: 1. **Hardware availability** - Check if GPU acceleration is available 2. **Quality requirements** - HEVC offers better quality at lower bitrates 3. **Processing speed** - Hardware encoders provide significant speed advantages 4. **Platform compatibility** - Some encoders are Windows-specific A multi-tier approach often works best, checking for the fastest available encoder and falling back as needed: ```csharp // Try NVIDIA, then Intel, then software encoding if (NVENCH264EncoderSettings.IsAvailable()) { movOutput.Video = new NVENCH264EncoderSettings(); } else if (QSVH264EncoderSettings.IsAvailable()) { movOutput.Video = new QSVH264EncoderSettings(); } else { movOutput.Video = new OpenH264EncoderSettings(); } ``` ## Audio Encoder Options Audio quality is critical for most video applications. The SDK provides several audio encoders optimized for different use cases. 
### Supported Audio Encoders | Encoder | Type | Platform | Quality | Use Case | |---------|------|----------|---------|----------| | MP3 | Software | Cross-platform | Good | Web distribution | | VO-AAC | Software | Cross-platform | Excellent | Professional use | | AVENC AAC | Software | Cross-platform | Very good | General purpose | | MF AAC | Hardware-accelerated | Windows only | Excellent | Windows apps | ### Audio Encoder Configuration Implementing audio encoding requires minimal code: ```csharp // MP3 configuration movOutput.Audio = new MP3EncoderSettings() { Bitrate = 320000, // 320 kbps high quality Channels = 2 // Stereo }; // Or AAC for better quality (Windows) movOutput.Audio = new MFAACEncoderSettings() { Bitrate = 192000 // 192 kbps }; // Cross-platform AAC implementation movOutput.Audio = new VOAACEncoderSettings() { Bitrate = 192000, SampleRate = 48000 }; ``` ### Platform-Specific Audio Considerations To handle platform differences elegantly, use conditional compilation: ```csharp // Select appropriate encoder based on platform #if NET_WINDOWS movOutput.Audio = new MFAACEncoderSettings(); #else movOutput.Audio = new VOAACEncoderSettings(); #endif ``` ## Advanced MOV Output Customization Beyond basic configuration, VisioForge SDKs enable powerful customization of MOV output through media processing blocks and sink settings. 
### Custom Processing Pipeline For specialized video processing needs, the SDK provides media block integration: ```csharp // Add custom video processing movOutput.CustomVideoProcessor = new SomeMediaBlock(); // Add custom audio processing movOutput.CustomAudioProcessor = new SomeMediaBlock(); ``` ### MOV Sink Configuration Fine-tune the MOV container settings for specialized requirements: ```csharp // Configure sink settings movOutput.Sink.Filename = "new_output.mov"; ``` ### Dynamic Encoder Detection Your application can intelligently select encoders based on system capabilities: ```csharp // Get available video encoders var videoEncoders = movOutput.GetVideoEncoders(); // Get available audio encoders var audioEncoders = movOutput.GetAudioEncoders(); // Display available options to users or auto-select foreach (var encoder in videoEncoders) { Console.WriteLine($"Available encoder: {encoder.Name}"); } ``` ## Integration with VisioForge SDK Core Components The MOV output integrates seamlessly with the core SDK components for video capture, editing, and processing. ### Video Capture Integration Add MOV output to a capture workflow: ```csharp // Create and configure capture core var core = new VideoCaptureCoreX(); // Add capture devices // .. // Add configured MOV output core.Outputs_Add(movOutput, true); // Start capture await core.StartAsync(); ``` ### Video Edit SDK Integration Incorporate MOV output in video editing: ```csharp // Create edit core and configure project var core = new VideoEditCoreX(); // Add input file // ... 
// Set MOV as output format core.Output_Format = movOutput; // Process the video await core.StartAsync(); ``` ### Media Blocks SDK Implementation For direct media pipeline control: ```csharp // Create encoder instances var aac = new VOAACEncoderSettings(); var h264 = new OpenH264EncoderSettings(); // Configure MOV sink var movSinkSettings = new MOVSinkSettings("output.mov"); // Create output block // Note: MP4OutputBlock also handles MOV output (the MP4 container is derived // from QuickTime MOV, so both share the same underlying structure) var movOutput = new MP4OutputBlock(movSinkSettings, h264, aac); // Add to pipeline pipeline.AddBlock(movOutput); ``` ## Platform Compatibility Notes While VisioForge's MOV implementation is cross-platform, some features are platform-specific: ### Windows-Specific Features - MF HEVC video encoder provides optimized encoding on Windows - MF AAC audio encoder offers hardware acceleration on compatible systems ### Cross-Platform Features - OpenH264, NVENC, QSV, and AMF encoders work across operating systems - VO-AAC and AVENC AAC provide consistent audio encoding everywhere ## Conclusion The MOV output capability in VisioForge .NET SDKs provides a powerful and flexible solution for creating high-quality video files. By leveraging hardware acceleration where available and falling back to optimized software implementations when needed, the SDK ensures excellent performance across platforms. For more information, refer to the [VisioForge API documentation](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.X.Output.MOVOutput.html) or explore other output formats in our documentation. ---END OF PAGE--- # Local File: .\dotnet\general\output-formats\mp4.md --- title: MP4 Video Output Integration for .NET description: Learn how to implement MP4 video output in .NET applications using hardware-accelerated encoders. Guide covers H.264/HEVC encoding, audio configuration, and best practices for optimal video processing performance.
sidebar_label: MP4 --- # MP4 file output [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) MP4 (MPEG-4 Part 14), introduced in 2001, is a digital multimedia container format most commonly used to store video and audio. It also supports subtitles and images. MP4 is known for its high compression and compatibility across various devices and platforms, making it a popular choice for streaming and sharing. Capturing videos from a webcam and saving them to a file is a common requirement in many applications. One way to achieve this is by using a software development kit (SDK) like VisioForge Video Capture SDK .Net, which provides an easy-to-use API for capturing and processing videos in C#. To capture video in MP4 format using Video Capture SDK, you need to configure video output format using one of the classes for MP4 output. You can use several available software and hardware video encoders, including Intel QuickSync, Nvidia NVENC, and AMD/ATI APU. ## Cross-platform MP4 output [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] The [MP4Output](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.X.Output.MP4Output.html?q=MP4Output) class provides a flexible and powerful way to configure MP4 video output settings for video capture and editing operations. This guide will walk you through how to use the MP4Output class effectively, covering its key features and common usage patterns. 
MP4Output implements several important interfaces: - IVideoEditXBaseOutput - IVideoCaptureXBaseOutput - Media Block creation This makes it suitable for both video editing and capture scenarios while providing extensive control over video and audio processing. ### Basic Usage The simplest way to create an MP4Output instance is using the constructor with a filename: ```csharp var output = new MP4Output("output.mp4"); ``` This creates an MP4Output with default video and audio encoder settings. On Windows, it will use OpenH264 for video encoding and Media Foundation AAC for audio encoding by default. ### Video Encoder Configuration The MP4Output class supports multiple video encoders through its `Video` property. Here are the supported video encoders: **[H.264 Encoders](../video-encoders/h264.md)** - OpenH264EncoderSettings (Default, CPU) - AMFH264EncoderSettings (AMD) - NVENCH264EncoderSettings (NVIDIA) - QSVH264EncoderSettings (Intel Quick Sync) **[HEVC (H.265) Encoders](../video-encoders/hevc.md)** - MFHEVCEncoderSettings (Windows only) - AMFH265EncoderSettings (AMD) - NVENCHEVCEncoderSettings (NVIDIA) - QSVHEVCEncoderSettings (Intel Quick Sync) You can check the availability of specific encoders using the `IsAvailable` method: ```csharp if (NVENCH264EncoderSettings.IsAvailable()) { output.Video = new NVENCH264EncoderSettings(); } ``` Example of configuring a specific video encoder: ```csharp var output = new MP4Output("output.mp4"); output.Video = new NVENCH264EncoderSettings(); // Use NVIDIA encoder ``` ### Audio Encoder Configuration The `Audio` property allows you to specify the audio encoder. 
Supported audio encoders include: - [VOAACEncoderSettings](../audio-encoders/aac.md) - [AVENCAACEncoderSettings](../audio-encoders/aac.md) - [MFAACEncoderSettings](../audio-encoders/aac.md) (Windows only) - [MP3EncoderSettings](../audio-encoders/mp3.md) Example of setting a custom audio encoder: ```csharp var output = new MP4Output("output.mp4"); output.Audio = new MP3EncoderSettings(); ``` The MP4Output class automatically selects appropriate default encoders based on the platform. ### Sample code Add the MP4 output to the Video Capture SDK core instance: ```csharp var core = new VideoCaptureCoreX(); core.Outputs_Add(output, true); ``` Set the output format for the Video Edit SDK core instance: ```csharp var core = new VideoEditCoreX(); core.Output_Format = output; ``` Create a Media Blocks MP4 output instance: ```csharp var aac = new VOAACEncoderSettings(); var h264 = new OpenH264EncoderSettings(); var mp4SinkSettings = new MP4SinkSettings("output.mp4"); var mp4Output = new MP4OutputBlock(mp4SinkSettings, h264, aac); ``` ### Best Practices **Hardware Acceleration**: When possible, use hardware-accelerated encoders (NVENC, AMF, QSV) for better performance: ```csharp var output = new MP4Output("output.mp4"); if (NVENCH264EncoderSettings.IsAvailable()) { output.Video = new NVENCH264EncoderSettings(); } ``` **Encoder Selection**: Use the provided methods to enumerate available encoders: ```csharp var output = new MP4Output("output.mp4"); var availableVideoEncoders = output.GetVideoEncoders(); var availableAudioEncoders = output.GetAudioEncoders(); ``` ### Common Issues and Solutions 1. **File Access**: The MP4Output constructor attempts to verify write access by creating and immediately deleting a test file. Ensure the application has proper permissions to the output directory. 2. **Encoder Availability**: Hardware encoders might not be available on all systems. 
Always provide a fallback: ```csharp var output = new MP4Output("output.mp4"); if (!NVENCH264EncoderSettings.IsAvailable()) { output.Video = new OpenH264EncoderSettings(); // Fallback to software encoder } ``` 3. **Platform Compatibility**: Some encoders are platform-specific. Use conditional compilation or runtime checks when targeting multiple platforms: ```csharp #if NET_WINDOWS output.Audio = new MFAACEncoderSettings(); #else output.Audio = new MP3EncoderSettings(); #endif ``` ## Windows-only MP4 output [!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="VideoEditCore"] *The same sample code can be used for Video Edit SDK .Net. Use the VideoEditCore class instead of VideoCaptureCore.* ### CPU encoder or Intel QuickSync GPU encoder Create an `MP4Output` object for MP4 output. ```cs var mp4Output = new MP4Output(); ``` Set MP4 mode to `CPU_QSV`. ```cs mp4Output.MP4Mode = MP4Mode.CPU_QSV; ``` Set video settings. ```cs mp4Output.Video.Profile = H264Profile.ProfileMain; // H264 profile mp4Output.Video.Level = H264Level.Level4; // H264 level mp4Output.Video.Bitrate = 2000; // bitrate // optional parameters mp4Output.Video.MBEncoding = H264MBEncoding.CABAC; //CABAC / CAVLC mp4Output.Video.BitrateAuto = false; // true to use auto bitrate mp4Output.Video.RateControl = H264RateControl.VBR; // rate control - CBR or VBR ``` Set AAC audio settings. ```cs mp4Output.Audio_AAC.Bitrate = 192; mp4Output.Audio_AAC.Version = AACVersion.MPEG4; // MPEG-4 / MPEG-2 mp4Output.Audio_AAC.Output = AACOutput.RAW; // RAW or ADTS mp4Output.Audio_AAC.Object = AACObject.Low; // type of AAC ``` ### Nvidia NVENC encoder Create the `MP4Output` object for MP4 output. ```cs var mp4Output = new MP4Output(); ``` Set MP4 mode to `NVENC`. ```cs mp4Output.MP4Mode = MP4Mode.NVENC; ``` Set the video settings.
```cs mp4Output.Video_NVENC.Profile = NVENCVideoEncoderProfile.H264_Main; // H264 profile mp4Output.Video_NVENC.Level = NVENCEncoderLevel.H264_4; // H264 level mp4Output.Video_NVENC.Bitrate = 2000; // bitrate // optional parameters mp4Output.Video_NVENC.RateControl = NVENCRateControlMode.VBR; // rate control - CBR or VBR ``` Set the audio settings. ```cs mp4Output.Audio_AAC.Bitrate = 192; mp4Output.Audio_AAC.Version = AACVersion.MPEG4; // MPEG-4 / MPEG-2 mp4Output.Audio_AAC.Output = AACOutput.RAW; // RAW or ADTS mp4Output.Audio_AAC.Object = AACObject.Low; // type of AAC ``` ### CPU/GPU encoders Using MP4 HW output, you can use hardware-accelerated encoders by Intel (QuickSync), Nvidia (NVENC), and AMD/ATI. Create `MP4HWOutput` object for MP4 HW output. ```cs var mp4Output = new MP4HWOutput(); ``` Get available encoders. ```cs var availableEncoders = VideoCaptureCore.HWEncodersAvailable(); // or var availableEncoders = VideoEditCore.HWEncodersAvailable(); ``` Depending on available encoders, select video codec. ```cs mp4Output.Video.Codec = MFVideoEncoder.MS_H264; // Microsoft H264 mp4Output.Video.Profile = MFH264Profile.Main; // H264 profile mp4Output.Video.Level = MFH264Level.Level4; // H264 level mp4Output.Video.AvgBitrate = 2000; // bitrate // optional parameters mp4Output.Video.CABAC = true; // CABAC / CAVLC mp4Output.Video.RateControl = MFCommonRateControlMode.CBR; // rate control // many other parameters are available ``` Set audio settings. ```cs mp4Output.Audio.Bitrate = 192; mp4Output.Audio.Version = AACVersion.MPEG4; // MPEG-4 / MPEG-2 mp4Output.Audio.Output = AACOutput.RAW; // RAW or ADTS mp4Output.Audio.Object = AACObject.Low; // type of AAC ``` Now, we can apply MP4 output settings to the core class (VideoCaptureCore or VideoEditCore) and start video capture or editing. ### Apply video capture settings Set MP4 format settings for output. 
```cs
core.Output_Format = mp4Output;
```

Set a video capture mode (or video convert mode if you use Video Edit SDK).

```cs
core.Mode = VideoCaptureMode.VideoCapture;
```

Set a file name (ensure you have write access rights).

```cs
core.Output_Filename = "output.mp4";
```

Start video capture (convert) to a file.

```cs
await VideoCapture1.StartAsync();
```

Finally, when capturing is complete, stop the capture and release its resources by calling the `StopAsync` method of the `VideoCaptureCore` class.

### Required redists

- Video Capture SDK redist [x86](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.VideoCapture.x86/) [x64](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.VideoCapture.x64/)
- Video Edit SDK redist [x86](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.VideoEdit.x86/) [x64](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.VideoEdit.x64/)
- MP4 redist [x86](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.MP4.x86/) [x64](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.MP4.x64/)

---

Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples.

---END OF PAGE---

# Local File: .\dotnet\general\output-formats\mpegts.md

---
title: MPEG-TS File Output Guide for .NET
description: Learn how to implement MPEG Transport Stream (MPEG-TS) file output in .NET applications. Covers video and audio encoding options, hardware acceleration, cross-platform considerations, and best practices for developers working with media streaming.
sidebar_label: MPEG-TS
---

# MPEG-TS Output

[!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The MPEG-TS (Transport Stream) output module in the VisioForge SDKs provides functionality for creating MPEG transport stream files with various video and audio encoding options. This guide explains how to configure and use the `MPEGTSOutput` class effectively.

## Cross-platform MPEG-TS output

[!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"]

To create a new MPEG-TS output, use the following constructor:

```csharp
// Initialize with AAC audio (recommended)
var output = new MPEGTSOutput("output.ts", useAAC: true);
```

You can also use MP3 audio instead of AAC:

```csharp
// Initialize with MP3 audio instead of AAC
var output = new MPEGTSOutput("output.ts", useAAC: false);
```

### Video Encoding Options

The [MPEGTSOutput](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.X.Output.MPEGTSOutput.html) supports multiple video encoders through the `Video` property.
Available encoders include: **[H.264 Encoders](../video-encoders/h264.md)** - OpenH264 (Software-based) - NVENC H.264 (NVIDIA GPU acceleration) - QSV H.264 (Intel Quick Sync) - AMF H.264 (AMD GPU acceleration) **[H.265/HEVC Encoders](../video-encoders/hevc.md)** - MF HEVC (Windows Media Foundation, Windows only) - NVENC HEVC (NVIDIA GPU acceleration) - QSV HEVC (Intel Quick Sync) - AMF H.265 (AMD GPU acceleration) Example of setting a specific video encoder: ```csharp // Check if NVIDIA encoder is available if (NVENCH264EncoderSettings.IsAvailable()) { output.Video = new NVENCH264EncoderSettings(); } else { // Fall back to OpenH264 output.Video = new OpenH264EncoderSettings(); } ``` ### Audio Encoding Options The following audio encoders are supported through the `Audio` property: **[AAC Encoders](../audio-encoders/aac.md)** - VO-AAC (Cross-platform) - AVENC AAC - MF AAC (Windows only) **[MP3 Encoder](../audio-encoders/mp3.md)**: - MP3EncoderSettings Example of setting an audio encoder: ```csharp // For Windows platforms output.Audio = new MFAACEncoderSettings(); ``` ```csharp // For cross-platform compatibility output.Audio = new VOAACEncoderSettings(); ``` ```csharp // Using MP3 instead of AAC output.Audio = new MP3EncoderSettings(); ``` ### File Management You can get or set the output filename after initialization: ```csharp // Get current filename string currentFile = output.GetFilename(); // Change output filename output.SetFilename("new_output.ts"); ``` ### Advanced Features #### Custom Processing The MPEGTSOutput supports custom video and audio processing through MediaBlocks: ```csharp // Add custom video processing output.CustomVideoProcessor = new YourCustomVideoProcessor(); // Add custom audio processing output.CustomAudioProcessor = new YourCustomAudioProcessor(); ``` #### Sink Settings The output uses MP4SinkSettings for configuration: ```csharp // Access sink settings output.Sink.Filename = "modified_output.ts"; ``` ### Platform Considerations - Some 
encoders (MF AAC, MF HEVC) are only available on Windows platforms
- Cross-platform applications should use platform-agnostic encoders like VO-AAC for audio

### Best Practices

1. **Hardware Acceleration**: When available, prefer hardware-accelerated encoders (NVENC, QSV, AMF) over software encoders for better performance.
2. **Audio Codec Selection**: Use AAC for better compatibility and quality unless you have specific requirements for MP3.
3. **Error Handling**: Always check for encoder availability before using hardware-accelerated options:

```csharp
if (NVENCH264EncoderSettings.IsAvailable())
{
    // Use NVIDIA encoder
}
else if (QSVH264EncoderSettings.IsAvailable())
{
    // Fall back to Intel Quick Sync
}
else
{
    // Fall back to software encoding
}
```

4. **Cross-Platform Compatibility**: For cross-platform applications, ensure you're using encoders available on all target platforms or implement appropriate fallbacks.

### Implementation Example

Here's a complete example showing how to create and configure an MPEG-TS output:

```csharp
var output = new MPEGTSOutput("output.ts", useAAC: true);

// Configure video encoder
if (NVENCH264EncoderSettings.IsAvailable())
{
    output.Video = new NVENCH264EncoderSettings();
}
else if (QSVH264EncoderSettings.IsAvailable())
{
    output.Video = new QSVH264EncoderSettings();
}
else
{
    output.Video = new OpenH264EncoderSettings();
}

// Configure audio encoder based on platform
#if NET_WINDOWS
output.Audio = new MFAACEncoderSettings();
#else
output.Audio = new VOAACEncoderSettings();
#endif

// Optional: Add custom processing
output.CustomVideoProcessor = new YourCustomVideoProcessor();
output.CustomAudioProcessor = new YourCustomAudioProcessor();
```

## Windows-only MPEG-TS output

[!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="VideoEditCore"]

The `MPEGTSOutput` class provides configuration settings for MPEG Transport Stream (MPEG-TS) output in the VisioForge video processing framework.
This class inherits from `MFBaseOutput` and implements the `IVideoCaptureBaseOutput` interface, enabling it to be used specifically for video capture scenarios with MPEG-TS formatting. ### Class Hierarchy ```text MFBaseOutput └── MPEGTSOutput ``` ### Inherited Video Settings The [MPEGTSOutput](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.Output.MPEGTSOutput.html) class inherits video encoding capabilities from [MFBaseOutput](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.Output.MFBaseOutput.html), which includes: **Video Encoding Configuration**: Through the `Video` property of type [MFVideoEncoderSettings](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.Output.MFVideoEncoderSettings.html), supporting: - Multiple codec options (H.264/H.265) with hardware acceleration support - Bitrate control (CBR/VBR) - Quality settings - Frame type and GOP structure configuration - Interlacing options - Resolution and aspect ratio controls ### Inherited Audio Settings Audio configuration is handled through the inherited `Audio` property of type [M4AOutput](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.Output.M4AOutput.html), which includes: AAC audio encoding with configurable: - Version (default: MPEG-4) - Object type (default: AAC-LC) - Bitrate (default: 128 kbps) - Output format (default: RAW) ### Usage #### Basic Implementation ```csharp // Create VideoCaptureCore instance var core = new VideoCaptureCore(); // Set output filename core.Output_Filename = "output.ts"; // Create MPEG-TS output var mpegtsOutput = new MPEGTSOutput(); // Configure video settings mpegtsOutput.Video.Codec = MFVideoEncoder.MS_H264; mpegtsOutput.Video.AvgBitrate = 2000; // 2 Mbps mpegtsOutput.Video.RateControl = MFCommonRateControlMode.CBR; // Configure audio settings mpegtsOutput.Audio.Bitrate = 128; // 128 kbps mpegtsOutput.Audio.Version = AACVersion.MPEG4; core.Output_Format = mpegtsOutput; ``` #### Serialization Support The class provides 
built-in JSON serialization support for saving and loading configurations: ```csharp // Save configuration string jsonConfig = mpegtsOutput.Save(); // Load configuration MPEGTSOutput loadedConfig = MPEGTSOutput.Load(jsonConfig); ``` ### Default Configuration The `MPEGTSOutput` class initializes with these default settings: #### Video Defaults (inherited from MFBaseOutput) - Average Bitrate: 2000 kbps - Codec: Microsoft H.264 - Profile: Main - Level: 4.2 - Rate Control: CBR - Quality vs Speed: 85 - Maximum Reference Frames: 2 - GOP Size: 50 frames - B-Picture Count: 0 - Low Latency Mode: Disabled - CABAC: Disabled - Interlace Mode: Progressive #### Audio Defaults - Bitrate: 128 kbps - AAC Version: MPEG-4 - AAC Object Type: Low Complexity (LC) - Output Format: RAW ### Best Practices 1. **Bitrate Configuration**: - For streaming applications, ensure the combined video and audio bitrates are within your target bandwidth - Consider using VBR for storage scenarios and CBR for streaming 2. **Hardware Acceleration**: - When available, use hardware-accelerated encoders (QSV, NVENC, AMD) for better performance - Fall back to MS_H264/MS_H265 when hardware acceleration is unavailable 3. **Quality Optimization**: - For higher quality at the cost of performance, increase the `QualityVsSpeed` value - Enable CABAC for better compression efficiency in non-low-latency scenarios - Adjust `MaxKeyFrameSpacing` based on your specific use case (lower values for better seeking, higher values for better compression) ### Technical Notes 1. **MPEG-TS Characteristics**: - Suitable for streaming and broadcasting applications - Provides error resilience through packet-based structure - Supports multiple programs and elementary streams 2. 
**Performance Considerations**: - Low latency mode trades quality for reduced encoding delay - B-frames improve compression but increase latency - Hardware acceleration can significantly reduce CPU usage ### Required redists - Video Capture SDK redist [x86](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.VideoCapture.x86/) [x64](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.VideoCapture.x64/) - Video Edit SDK redist [x86](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.VideoEdit.x86/) [x64](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.VideoEdit.x64/) - MP4 redist [x86](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.MP4.x86/) [x64](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.MP4.x64/) --- Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples. ---END OF PAGE--- # Local File: .\dotnet\general\output-formats\mxf.md --- title: Professional MXF Integration for .NET Applications description: Master MXF output implementation in VisioForge SDKs with detailed code samples for professional video workflows. Learn hardware acceleration, codec optimization, cross-platform considerations, and best practices for broadcast-ready MXF files in your .NET applications. 
sidebar_label: MXF --- # MXF Output in VisioForge .NET SDKs [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] Material Exchange Format (MXF) is an industry-standard container format designed for professional video applications. It's widely adopted in broadcast environments, post-production workflows, and archival systems. VisioForge SDKs provide robust cross-platform MXF output capabilities that allow developers to integrate this professional format into their applications. ## Understanding MXF Format MXF serves as a wrapper that can contain various types of video and audio data along with metadata. The format was designed to address interoperability issues in professional video workflows: - **Industry Standard**: Adopted by major broadcasters worldwide - **Professional Metadata**: Supports extensive technical and descriptive metadata - **Versatile Container**: Compatible with numerous audio and video codecs - **Cross-Platform**: Supported across Windows, macOS, and Linux ## Getting Started with MXF Output Implementing MXF output in VisioForge SDKs requires just a few steps. The basic setup involves: 1. Creating an MXF output object 2. Specifying video and audio stream types 3. Configuring encoder settings 4. 
Adding the output to your pipeline

### Basic Implementation

Here's the foundational code to create an MXF output:

```csharp
var mxfOutput = new MXFOutput(
    filename: "output.mxf",
    videoStreamType: MXFVideoStreamType.H264,
    audioStreamType: MXFAudioStreamType.MPEG
);
```

This minimal implementation creates a valid MXF file with default encoding settings. For professional applications, you'll typically want to customize the encoding parameters further.

## Video Encoding Options for MXF

The quality and compatibility of your MXF output largely depend on your choice of video encoder. VisioForge SDKs support multiple encoder options to balance performance, quality, and compatibility.

### Hardware-Accelerated Encoders

For optimal performance in real-time applications, hardware-accelerated encoders are recommended:

#### NVIDIA NVENC Encoders

```csharp
// Check availability first
if (NVENCH264EncoderSettings.IsAvailable())
{
    var nvencSettings = new NVENCH264EncoderSettings
    {
        Bitrate = 8000000, // 8 Mbps
    };

    mxfOutput.Video = nvencSettings;
}
```

#### Intel Quick Sync Video (QSV) Encoders

```csharp
if (QSVH264EncoderSettings.IsAvailable())
{
    var qsvSettings = new QSVH264EncoderSettings
    {
        Bitrate = 8000000,
    };

    mxfOutput.Video = qsvSettings;
}
```

#### AMD Advanced Media Framework (AMF) Encoders

```csharp
if (AMFH264EncoderSettings.IsAvailable())
{
    var amfSettings = new AMFH264EncoderSettings
    {
        Bitrate = 8000000,
    };

    mxfOutput.Video = amfSettings;
}
```

### Software-Based Encoders

When hardware acceleration isn't available, software encoders provide reliable alternatives:

#### OpenH264 Encoder

```csharp
var openH264Settings = new OpenH264EncoderSettings
{
    Bitrate = 8000000,
};

mxfOutput.Video = openH264Settings;
```

### High-Efficiency Video Coding (HEVC/H.265)

For applications requiring higher compression efficiency:

```csharp
// NVIDIA HEVC encoder
if (NVENCHEVCEncoderSettings.IsAvailable())
{
    var nvencHevcSettings = new NVENCHEVCEncoderSettings
    {
        Bitrate = 5000000, // Lower
bitrate possible with HEVC }; mxfOutput.Video = nvencHevcSettings; } ``` ## Audio Encoding for MXF Files While video often gets the most attention, proper audio encoding is crucial for professional MXF outputs. VisioForge SDKs offer multiple audio encoder options: ### AAC Encoders AAC is the preferred codec for most professional applications: ```csharp // Media Foundation AAC (Windows-only) #if NET_WINDOWS var mfAacSettings = new MFAACEncoderSettings { Bitrate = 192000, // 192 kbps SampleRate = 48000 // Professional standard }; mxfOutput.Audio = mfAacSettings; #else // Cross-platform AAC alternative var voAacSettings = new VOAACEncoderSettings { Bitrate = 192000, SampleRate = 48000 }; mxfOutput.Audio = voAacSettings; #endif ``` ### MP3 Encoder For maximum compatibility: ```csharp var mp3Settings = new MP3EncoderSettings { Bitrate = 320000, // 320 kbps SampleRate = 48000, ChannelMode = MP3ChannelMode.Stereo }; mxfOutput.Audio = mp3Settings; ``` ## Advanced MXF Configuration ### Custom Processing Pipelines One of the powerful features of VisioForge SDKs is the ability to add custom processing to your MXF output chain: ```csharp // Add custom video processing mxfOutput.CustomVideoProcessor = yourVideoProcessingBlock; // Add custom audio processing mxfOutput.CustomAudioProcessor = yourAudioProcessingBlock; ``` ### Sink Configuration Fine-tune your MXF output with sink settings: ```csharp // Access sink settings mxfOutput.Sink.Filename = "new_output.mxf"; ``` ## Cross-Platform Considerations Building applications that work across different platforms requires careful planning: ```csharp // Platform-specific encoder selection var mxfOutput = new MXFOutput( filename: "output.mxf", videoStreamType: MXFVideoStreamType.H264, audioStreamType: MXFAudioStreamType.MPEG ); #if NET_WINDOWS // Windows-specific settings if (QSVH264EncoderSettings.IsAvailable()) { mxfOutput.Video = new QSVH264EncoderSettings(); mxfOutput.Audio = new MFAACEncoderSettings(); } #elif NET_MACOS // 
macOS-specific settings mxfOutput.Video = new OpenH264EncoderSettings(); mxfOutput.Audio = new VOAACEncoderSettings(); #else // Linux fallback mxfOutput.Video = new OpenH264EncoderSettings(); mxfOutput.Audio = new MP3EncoderSettings(); #endif ``` ## Error Handling and Validation Robust MXF implementations require proper error handling: ```csharp try { // Create MXF output var mxfOutput = new MXFOutput( filename: Path.Combine(outputDirectory, "output.mxf"), videoStreamType: MXFVideoStreamType.H264, audioStreamType: MXFAudioStreamType.MPEG ); // Validate encoder availability if (!OpenH264EncoderSettings.IsAvailable()) { throw new ApplicationException("No compatible H.264 encoder found"); } // Validate output directory var directoryInfo = new DirectoryInfo(Path.GetDirectoryName(mxfOutput.Sink.Filename)); if (!directoryInfo.Exists) { Directory.CreateDirectory(directoryInfo.FullName); } pipeline.AddBlock(mxfOutput); // Connect blocks // ... } catch (Exception ex) { logger.LogError($"MXF output error: {ex.Message}"); // Implement fallback strategy } ``` ## Performance Optimization For optimal MXF output performance: 1. **Prioritize Hardware Acceleration**: Always check for and use hardware encoders first 2. **Buffer Management**: Adjust buffer sizes based on system capabilities 3. **Parallel Processing**: Utilize multi-threading where appropriate 4. **Preset Selection**: Choose encoder presets based on quality vs. 
speed requirements ## Complete Implementation Example Here's a full example demonstrating MXF implementation with fallback options: ```csharp // Create MXF output with specific stream types var mxfOutput = new MXFOutput( filename: "output.mxf", videoStreamType: MXFVideoStreamType.H264, audioStreamType: MXFAudioStreamType.MPEG ); // Configure video encoder with prioritized fallback chain if (NVENCH264EncoderSettings.IsAvailable()) { var nvencSettings = new NVENCH264EncoderSettings { Bitrate = 8000000, }; mxfOutput.Video = nvencSettings; } else if (QSVH264EncoderSettings.IsAvailable()) { var qsvSettings = new QSVH264EncoderSettings { Bitrate = 8000000, }; mxfOutput.Video = qsvSettings; } else if (AMFH264EncoderSettings.IsAvailable()) { var amfSettings = new AMFH264EncoderSettings { Bitrate = 8000000, }; mxfOutput.Video = amfSettings; } else { // Software fallback var openH264Settings = new OpenH264EncoderSettings { Bitrate = 8000000, }; mxfOutput.Video = openH264Settings; } // Configure platform-optimized audio #if NET_WINDOWS mxfOutput.Audio = new MFAACEncoderSettings { Bitrate = 192000, SampleRate = 48000 }; #else mxfOutput.Audio = new VOAACEncoderSettings { Bitrate = 192000, SampleRate = 48000 }; #endif // Add to pipeline and start pipeline.AddBlock(mxfOutput); // Connect blocks // ... // Start the pipeline await pipeline.StartAsync(); ``` By following this guide, you can implement professional-grade MXF output in your applications using VisioForge .NET SDKs, ensuring compatibility with broadcast workflows and post-production systems. ---END OF PAGE--- # Local File: .\dotnet\general\output-formats\webm.md --- title: WebM Video Output for .NET - Developer Guide description: Master WebM video implementation in .NET with detailed code examples for VP8, VP9, and AV1 codecs. Learn optimization strategies for quality, performance, and file size to create efficient web-ready videos across Windows and cross-platform applications. 
sidebar_label: WebM --- # WebM Video Output in VisioForge .NET SDKs [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## What is WebM? WebM is an open-source, royalty-free media file format optimized for web delivery. Developed to provide efficient video streaming with minimal processing requirements, WebM has become a standard for HTML5 video content. The format supports modern codecs including VP8 and VP9 for video compression, along with Vorbis and Opus for audio encoding. The key advantages of WebM include: - **Web-optimized performance** with fast loading times - **Broad browser support** across major platforms - **High-quality video** at smaller file sizes - **Open-source licensing** without royalty costs - **Efficient streaming** capabilities for media applications ## Windows Implementation [!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="VideoEditCore"] On Windows platforms, VisioForge's implementation leverages the [WebMOutput](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.Output.WebMOutput.html) class from the `VisioForge.Core.Types.Output` namespace. 
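The `WebMOutput` configuration that follows expresses video bitrate in kbps (e.g., `Video_Bitrate = 2000` targets 2 Mbps). If you are unsure what bitrate to start from, a common rule of thumb — illustrative only, not part of the VisioForge API — is to derive an initial figure from resolution, frame rate, and a bits-per-pixel factor (around 0.1 bpp is a typical VP8 starting point; VP9 and AV1 usually need less):

```csharp
using System;

// Illustrative helper (hypothetical, NOT a VisioForge API): suggests a
// starting video bitrate in kbps from resolution and frame rate using a
// bits-per-pixel heuristic. Tune the result with quantizer and rate-control
// settings afterwards.
static int SuggestBitrateKbps(int width, int height, double fps, double bitsPerPixel = 0.1)
{
    // bits per second = pixels per frame * frames per second * bits per pixel
    double bitsPerSecond = width * height * fps * bitsPerPixel;
    return (int)Math.Round(bitsPerSecond / 1000.0); // Video_Bitrate expects kbps
}

Console.WriteLine(SuggestBitrateKbps(1280, 720, 30));  // 720p30  -> 2765 kbps
Console.WriteLine(SuggestBitrateKbps(1920, 1080, 30)); // 1080p30 -> 6221 kbps
```

Treat the result as a starting point only; content complexity matters far more than any formula, so validate against real footage.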
### Basic Configuration To quickly implement WebM output in your Windows application: ```csharp using VisioForge.Core.Types.Output; // Initialize WebM output settings var webmOutput = new WebMOutput(); // Configure essential parameters webmOutput.Video_Mode = VP8QualityMode.Realtime; webmOutput.Video_EndUsage = VP8EndUsageMode.VBR; webmOutput.Video_Encoder = WebMVideoEncoder.VP8; webmOutput.Video_Bitrate = 2000; webmOutput.Audio_Quality = 80; // Apply to your core instance var core = new VideoCaptureCore(); // or VideoEditCore core.Output_Format = webmOutput; core.Output_Filename = "output.webm"; ``` ### Video Quality Settings Fine-tuning your WebM video quality involves balancing several parameters: ```csharp var webmOutput = new WebMOutput(); // Quality parameters webmOutput.Video_MinQuantizer = 4; // Lower values = higher quality (range: 0-63) webmOutput.Video_MaxQuantizer = 48; // Upper quality bound (range: 0-63) webmOutput.Video_Bitrate = 2000; // Target bitrate in kbps // Encode with multiple threads for better performance webmOutput.Video_ThreadCount = 4; // Adjust based on available CPU cores ``` ### Keyframe Control Proper keyframe configuration is crucial for efficient streaming and seeking: ```csharp // Keyframe settings webmOutput.Video_Keyframe_MinInterval = 30; // Minimum frames between keyframes webmOutput.Video_Keyframe_MaxInterval = 300; // Maximum frames between keyframes webmOutput.Video_Keyframe_Mode = VP8KeyframeMode.Auto; ``` ### Performance Optimization Balance encoding speed and quality with these parameters: ```csharp // Performance settings webmOutput.Video_CPUUsed = 0; // Range: -16 to 16 (higher = faster encoding, lower quality) webmOutput.Video_LagInFrames = 25; // Frame look-ahead buffer (higher = better quality) webmOutput.Video_ErrorResilient = true; // Enable for streaming applications ``` ### Buffer Management For streaming applications, proper buffer configuration improves playback stability: ```csharp // Buffer settings 
webmOutput.Video_Decoder_Buffer_Size = 6000; // Buffer size in milliseconds webmOutput.Video_Decoder_Buffer_InitialSize = 4000; // Initial buffer fill level webmOutput.Video_Decoder_Buffer_OptimalSize = 5000; // Target buffer level // Rate control fine-tuning webmOutput.Video_UndershootPct = 50; // Allows bitrate to drop below target webmOutput.Video_OvershootPct = 50; // Allows bitrate to exceed target temporarily ``` ## Cross-Platform Implementation [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] For cross-platform applications, VisioForge provides the [WebMOutput](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.X.Output.WebMOutput.html) class from the `VisioForge.Core.Types.X.Output` namespace, offering enhanced codec flexibility. ### Basic Setup ```csharp using VisioForge.Core.Types.X.Output; using VisioForge.Core.Types.X.VideoEncoders; using VisioForge.Core.Types.X.AudioEncoders; // Create WebM output var webmOutput = new WebMOutput("output.webm"); // Configure video encoder (VP8) webmOutput.Video = new VP8EncoderSettings(); // Configure audio encoder (Vorbis) webmOutput.Audio = new VorbisEncoderSettings(); ``` ### Integration with Video Capture SDK ```csharp // Add WebM output to Video Capture SDK var core = new VideoCaptureCoreX(); core.Outputs_Add(webmOutput, true); ``` ### Integration with Video Edit SDK ```csharp // Set WebM as output format for Video Edit SDK var core = new VideoEditCoreX(); core.Output_Format = webmOutput; ``` ### Integration with Media Blocks SDK ```csharp // Create encoders var vorbis = new VorbisEncoderSettings(); var vp9 = new VP9EncoderSettings(); // Configure WebM output block var webmSettings = new WebMSinkSettings("output.webm"); var webmOutput = new WebMOutputBlock(webmSettings, vp9, vorbis); // Add to your pipeline // pipeline.AddBlock(webmOutput); ``` ## Codec Selection Guide ### Video 
Codecs

VisioForge SDKs support multiple video codecs for WebM:

1. **VP8**
   - Faster encoding speed
   - Lower computational requirements
   - Wider compatibility with older browsers
   - Good quality for standard video

2. **VP9**
   - Better compression efficiency (30-50% smaller files vs. VP8)
   - Higher quality at the same bitrate
   - Slower encoding performance
   - Ideal for high-resolution content

3. **AV1**
   - Next-generation codec with superior compression
   - Highest quality per bit
   - Significantly higher encoding complexity
   - Best for situations where encoding time isn't critical

For codec-specific settings, refer to our dedicated documentation pages:

- [VP8/VP9 Configuration](../video-encoders/vp8-vp9.md)
- [AV1 Configuration](../video-encoders/av1.md)

### Audio Codecs

Two primary audio codec options are available:

1. **Vorbis**
   - Established codec with good overall quality
   - Compatible with all WebM-supporting browsers
   - Default choice for most applications

2. **Opus**
   - Superior audio quality, especially at low bitrates
   - Better for voice content and music
   - Lower latency for streaming applications
   - More efficient for bandwidth-constrained scenarios

For detailed audio settings, see:

- [Vorbis Configuration](../audio-encoders/vorbis.md)
- [Opus Configuration](../audio-encoders/opus.md)

## Optimization Strategies

### For Video Quality

To achieve the highest possible video quality:

- Use VP9 or AV1 for video encoding
- Set lower quantizer values (higher quality)
- Increase `LagInFrames` for better lookahead analysis
- Use 2-pass encoding for offline video processing
- Set higher bitrates for complex visual content

```csharp
// Quality-focused VP9 configuration
var vp9 = new VP9EncoderSettings
{
    Bitrate = 3000, // Higher bitrate for better quality
    Speed = 0, // Slowest/highest quality encoding
};
```

### For Real-time Applications

When low latency is critical:

- Choose VP8 for faster encoding
- Use single-pass encoding
- Set `CPUUsed` to higher values
- Use smaller frame
lookahead buffers - Configure shorter keyframe intervals ```csharp // Low-latency VP8 configuration var vp8 = new VP8EncoderSettings { EndUsage = VP8EndUsageMode.CBR, // Constant bitrate for predictable streaming Speed = 8, // Faster encoding Deadline = VP8Deadline.Realtime, // Prioritize speed over quality ErrorResilient = true // Better recovery from packet loss }; ``` ### For File Size Efficiency To minimize storage requirements: - Use VP9 or AV1 for maximum compression - Enable two-pass encoding - Set appropriate target bitrates - Use Variable Bit Rate (VBR) encoding - Avoid unnecessary keyframes ```csharp // Storage-optimized configuration var av1 = new AV1EncoderSettings { EndUsage = AOMEndUsage.VBR, // Variable bitrate for efficiency TwoPass = true, // Enable multi-pass encoding CpuUsed = 2, // Balance between speed and compression KeyframeMaxDistance = 300 // Fewer keyframes = smaller files }; ``` ## Dependencies To implement WebM output, add the appropriate NuGet packages to your project: - For x86 platforms: [VisioForge.DotNet.Core.Redist.WebM.x86](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.WebM.x86) - For x64 platforms: [VisioForge.DotNet.Core.Redist.WebM.x64](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.WebM.x64) ## Learning Resources For additional implementation examples and more advanced scenarios, visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples) containing sample code for all VisioForge SDKs. ---END OF PAGE--- # Local File: .\dotnet\general\output-formats\wmv.md --- title: WMV File Output and Encoding Guide description: Learn how to implement Windows Media Video (WMV) encoding in .NET applications. Covers audio/video configuration, streaming options, profile management, and cross-platform solutions with code examples. 
sidebar_label: Windows Media Video --- # Windows Media Video encoders [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) This documentation covers the Windows Media Video (WMV) encoding capabilities available in VisioForge, including both Windows-specific and cross-platform solutions. ## Windows-only output [!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="VideoEditCore"] The [WMVOutput](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.Output.WMVOutput.html) class provides comprehensive Windows Media encoding capabilities for both audio and video on Windows platforms. ### Audio Encoding Features The `WMVOutput` class offers several audio-specific configuration options: - Custom audio codec selection - Audio format customization - Multiple stream modes - Bitrate control - Quality settings - Language support - Buffer size management ### Rate Control Modes WMV encoding supports four rate control modes through the `WMVStreamMode` enum: 1. CBR (Constant Bitrate) 2. VBRQuality (Variable Bitrate based on quality) 3. VBRBitrate (Variable Bitrate with target bitrate) 4. 
VBRPeakBitrate (Variable Bitrate with peak bitrate constraint) ### Configuration Modes The encoder can be configured in several ways using the `WMVMode` enum: - ExternalProfile: Load settings from a profile file - ExternalProfileFromText: Load settings from a text string - InternalProfile: Use built-in profiles - CustomSettings: Manual configuration - V8SystemProfile: Use Windows Media 8 system profiles ### Sample Code Create new WMV custom output configuration: ```csharp var wmvOutput = new WMVOutput { // Basic configuration Mode = WMVMode.CustomSettings, // Audio settings Custom_Audio_StreamPresent = true, Custom_Audio_Mode = WMVStreamMode.VBRQuality, Custom_Audio_Quality = 98, Custom_Audio_PeakBitrate = 192000, Custom_Audio_PeakBufferSize = 3, // Optional language setting Custom_Audio_LanguageID = "en-US" }; ``` Using an internal profile: ```csharp var profileWmvOutput = new WMVOutput { Mode = WMVMode.InternalProfile, Internal_Profile_Name = "Windows Media Video 9 for Local Network (768 kbps)" }; ``` Network streaming configuration: ```csharp var streamingWmvOutput = new WMVOutput { Mode = WMVMode.CustomSettings, Network_Streaming_WMV_Maximum_Clients = 20, Custom_Audio_Mode = WMVStreamMode.CBR }; ``` ### Custom Profile Configuration Custom profiles give you the most flexibility by allowing you to configure every aspect of the encoding process. 
Here are several examples for different scenarios: High-quality video streaming configuration: ```csharp var highQualityConfig = new WMVOutput { Mode = WMVMode.CustomSettings, // Video settings Custom_Video_StreamPresent = true, Custom_Video_Mode = WMVStreamMode.VBRQuality, Custom_Video_Quality = 95, Custom_Video_Width = 1920, Custom_Video_Height = 1080, Custom_Video_FrameRate = 30.0, Custom_Video_KeyFrameInterval = 4, Custom_Video_Smoothness = 80, Custom_Video_Buffer_UseDefault = false, Custom_Video_Buffer_Size = 4000, // Audio settings Custom_Audio_StreamPresent = true, Custom_Audio_Mode = WMVStreamMode.VBRQuality, Custom_Audio_Quality = 98, Custom_Audio_Format = "48kHz 16bit Stereo", Custom_Audio_PeakBitrate = 320000, Custom_Audio_PeakBufferSize = 3, // Profile metadata Custom_Profile_Name = "High Quality Streaming", Custom_Profile_Description = "1080p streaming profile with high quality audio", Custom_Profile_Language = "en-US" }; ``` Low bandwidth configuration for mobile streaming: ```csharp var mobileLowBandwidthConfig = new WMVOutput { Mode = WMVMode.CustomSettings, // Video settings optimized for mobile Custom_Video_StreamPresent = true, Custom_Video_Mode = WMVStreamMode.CBR, Custom_Video_Bitrate = 800000, // 800 kbps Custom_Video_Width = 854, Custom_Video_Height = 480, Custom_Video_FrameRate = 24.0, Custom_Video_KeyFrameInterval = 5, Custom_Video_Smoothness = 60, // Audio settings for low bandwidth Custom_Audio_StreamPresent = true, Custom_Audio_Mode = WMVStreamMode.CBR, Custom_Audio_PeakBitrate = 64000, // 64 kbps Custom_Audio_Format = "44kHz 16bit Mono", Custom_Profile_Name = "Mobile Low Bandwidth", Custom_Profile_Description = "480p optimized for mobile devices" }; ``` Audio-focused configuration for music content: ```csharp var audioFocusedConfig = new WMVOutput { Mode = WMVMode.CustomSettings, // High quality audio settings Custom_Audio_StreamPresent = true, Custom_Audio_Mode = WMVStreamMode.VBRQuality, Custom_Audio_Quality = 99, Custom_Audio_Format 
= "96kHz 24bit Stereo", Custom_Audio_PeakBitrate = 512000, Custom_Audio_PeakBufferSize = 4, // Minimal video settings Custom_Video_StreamPresent = true, Custom_Video_Mode = WMVStreamMode.VBRBitrate, Custom_Video_Bitrate = 500000, Custom_Video_Width = 1280, Custom_Video_Height = 720, Custom_Video_FrameRate = 25.0, Custom_Profile_Name = "Audio Focus", Custom_Profile_Description = "High quality audio configuration for music content" }; ``` ### Internal Profile Usage Internal profiles provide pre-configured settings optimized for common scenarios. Here are examples of using different internal profiles: Standard broadcast quality profile: ```csharp var broadcastProfile = new WMVOutput { Mode = WMVMode.InternalProfile, Internal_Profile_Name = "Windows Media Video 9 Advanced Profile", Custom_Video_TVSystem = WMVTVSystem.NTSC // Optional TV system override }; ``` Web streaming profile: ```csharp var webStreamingProfile = new WMVOutput { Mode = WMVMode.InternalProfile, Internal_Profile_Name = "Windows Media Video 9 for Broadband (2 Mbps)", Network_Streaming_WMV_Maximum_Clients = 100 // Optional streaming override }; ``` Low latency profile for live streaming: ```csharp var liveStreamingProfile = new WMVOutput { Mode = WMVMode.InternalProfile, Internal_Profile_Name = "Windows Media Video 9 Screen (Low Rate)", Network_Streaming_WMV_Maximum_Clients = 50 }; ``` ### External Profile Configuration External profiles allow you to load encoding settings from files or text. 
This is useful for sharing configurations across different projects or storing multiple configurations: Loading profile from a file: ```csharp var fileBasedProfile = new WMVOutput { Mode = WMVMode.ExternalProfile, External_Profile_FileName = @"C:\Profiles\HighQualityStreaming.prx" }; ``` Loading profile from text configuration: ```csharp var textBasedProfile = new WMVOutput { Mode = WMVMode.ExternalProfileFromText, External_Profile_Text = @" " }; ``` Saving and loading profiles programmatically: ```csharp async Task SaveAndLoadProfile(WMVOutput profile, string filename) { // Save profile configuration to JSON string jsonConfig = profile.Save(); await File.WriteAllTextAsync(filename, jsonConfig); // Load profile configuration from JSON string loadedJson = await File.ReadAllTextAsync(filename); WMVOutput loadedProfile = WMVOutput.Load(loadedJson); } ``` Example usage of profile saving/loading: ```csharp var profile = new WMVOutput { Mode = WMVMode.CustomSettings, // ... configure settings ... 
}; await SaveAndLoadProfile(profile, "encoding_profile.json"); ``` ### Working with Legacy Windows Media 8 Profiles For compatibility with older systems, you can use Windows Media 8 system profiles: Using Windows Media 8 profile: ```csharp var wmv8Profile = new WMVOutput { Mode = WMVMode.V8SystemProfile, V8ProfileName = "Windows Media Video 8 for Dial-up Access (28.8 Kbps)", }; ``` Customizing streaming settings for Windows Media 8 profiles: ```csharp var wmv8StreamingProfile = new WMVOutput { Mode = WMVMode.V8SystemProfile, V8ProfileName = "Windows Media Video 8 for Local Area Network (384 Kbps)", Network_Streaming_WMV_Maximum_Clients = 25, Custom_Video_TVSystem = WMVTVSystem.PAL // Optional TV system override }; ``` ### Apply settings to your core object ```csharp var core = new VideoCaptureCore(); // or VideoEditCore core.Output_Format = wmvOutput; core.Output_Filename = "output.wmv"; ``` ## Cross-platform WMV output [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] The `WMVEncoderSettings` class provides a cross-platform solution for WMV encoding using GStreamer technology. 
### Features

- Platform-independent implementation
- Integration with GStreamer backend
- Simple configuration interface
- Availability checking

### Sample Code

Add the WMV output to the Video Capture SDK core instance:

```csharp
var wmvOutput = new WMVOutput("output.wmv");

var core = new VideoCaptureCoreX();
core.Outputs_Add(wmvOutput, true);
```

Set the output format for the Video Edit SDK core instance:

```csharp
var wmvOutput = new WMVOutput("output.wmv");

var core = new VideoEditCoreX();
core.Output_Format = wmvOutput;
```

Create a Media Blocks WMV output instance:

```csharp
var wma = new WMAEncoderSettings();
var wmv = new WMVEncoderSettings();
var sinkSettings = new ASFSinkSettings("output.wmv");
var wmvOutput = new WMVOutputBlock(sinkSettings, wmv, wma);
```

### Choosing Between Encoders

Consider the following factors when choosing between the Windows-specific `WMVOutput` and the cross-platform `WMVEncoderSettings`:

#### Windows-Specific WMVOutput

- Pros:
  - Full access to Windows Media format features
  - Advanced rate control options
  - Network streaming support
  - Profile-based configuration
- Cons:
  - Windows-only compatibility
  - Requires Windows Media components

#### Cross-Platform WMVEncoderSettings

- Pros:
  - Platform independence
  - Simpler implementation
- Cons:
  - More limited feature set
  - Basic configuration options only

## Best Practices

1. Always check encoder availability before use, especially with cross-platform implementations
2. Use appropriate rate control modes based on your quality and bandwidth requirements
3. Consider using internal profiles for common scenarios when using `WMVOutput`
4. Implement proper error handling for codec availability checks
5. Test encoding performance across different platforms when using cross-platform solutions

---

Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples.
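Best practice 1 above (checking encoder availability before use) can be sketched as follows for the Media Blocks output. This is a hedged sketch, not a definitive implementation: the static `IsAvailable()` members are assumed from the SDK's availability-checking pattern and should be verified against the API reference.

```csharp
// Hypothetical availability check before building the Media Blocks WMV output.
// IsAvailable() is an assumption here; verify the exact member names
// in the WMVEncoderSettings/WMAEncoderSettings API reference.
if (WMVEncoderSettings.IsAvailable() && WMAEncoderSettings.IsAvailable())
{
    var wmv = new WMVEncoderSettings();
    var wma = new WMAEncoderSettings();
    var sinkSettings = new ASFSinkSettings("output.wmv");
    var wmvOutput = new WMVOutputBlock(sinkSettings, wmv, wma);
}
else
{
    // Fall back to another output format or report the missing GStreamer component.
}
```

Wrapping the check in an `if/else` like this keeps the fallback path explicit, which also covers best practice 4 (proper error handling for codec availability checks).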
---END OF PAGE--- # Local File: .\dotnet\general\video-effects\add.md --- title: Implementing Video Effects in .NET Applications description: Master the implementation of video effects in .NET with this detailed tutorial. Learn to add, update, and configure video effect parameters in multiple SDK environments including capture, playback, and editing applications with practical C# code examples. sidebar_label: Implementing Video Effects --- # Implementing Video Effects in .NET SDK Applications Video effects can significantly enhance the visual quality and user experience of your media applications. This guide demonstrates how to properly implement and manage video effects across various .NET SDK environments. [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Implementation Overview When working with video processing in .NET applications, you'll often need to apply various effects to enhance or modify the video content. The following sections explain the process step-by-step. ## C# Code Implementation ### Example: Lightness Effect in Media Player SDK This detailed example demonstrates how to implement a lightness effect, which is a common video enhancement technique. The same implementation approach applies to Video Edit SDK .Net and Video Capture SDK .Net environments. 
### Step 1: Define the Effect Interface

First, declare the appropriate interface for your desired effect:

```cs
IVideoEffectLightness lightness;
```

### Step 2: Retrieve or Create the Effect Instance

Each effect requires a unique identifier. The following code checks whether the effect already exists in the SDK control:

```cs
var effect = MediaPlayer1.Video_Effects_Get("Lightness");
```

### Step 3: Add the Effect if Not Present

If the effect doesn't exist yet, instantiate it and add it to your video processing pipeline:

```cs
if (effect == null)
{
    lightness = new VideoEffectLightness(true, 100);
    MediaPlayer1.Video_Effects_Add(lightness);
}
```

### Step 4: Update Existing Effect Parameters

If the effect is already present, you can modify its parameters to achieve the desired visual outcome:

```cs
else
{
    lightness = effect as IVideoEffectLightness;
    if (lightness != null)
    {
        lightness.Value = 100;
    }
}
```

## Important Implementation Notes

For proper functionality, enable effects processing before starting video playback or capture:

* Set the `Video_Effects_Enable` property to `true` before calling any `Play()` or `Start()` methods
* Effects will not be applied if this property is not enabled
* Changing effect parameters during playback updates the visual output in real time

## System Requirements

To successfully implement video effects in your .NET application, you'll need:

* SDK redistributable packages properly installed
* Sufficient system resources for real-time video processing
* An appropriate .NET framework version

## Additional Resources

For more advanced implementations and examples of video effect techniques, visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) repository for additional code samples and complete projects.
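The four steps above can be consolidated into a single helper. The sketch below uses only the `Video_Effects_Get`/`Video_Effects_Add` calls shown in this guide; the helper name and its `value` parameter are ours, added for illustration.

```cs
// Sketch: add the lightness effect if it is missing, otherwise update it in place.
// Built from the calls demonstrated above; the SetLightness name is hypothetical.
private void SetLightness(int value)
{
    var effect = MediaPlayer1.Video_Effects_Get("Lightness");
    if (effect == null)
    {
        // Not registered yet: create and add it (Step 3)
        MediaPlayer1.Video_Effects_Add(new VideoEffectLightness(true, value));
    }
    else if (effect is IVideoEffectLightness lightness)
    {
        // Already present: just update the parameter (Step 4)
        lightness.Value = value;
    }
}
```

Remember that `Video_Effects_Enable` must still be set to `true` before playback or capture starts for the helper to have any visible effect.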
---END OF PAGE--- # Local File: .\dotnet\general\video-effects\image-overlay.md --- title: Adding Image Overlays to Video Streams description: Learn how to overlay images, animated GIFs, and transparent PNGs on video streams in .NET. Step-by-step guide with code examples for implementing image overlays using different formats and transparency effects. sidebar_label: Image Overlay --- # Image overlay [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) [!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="MediaPlayerCore"] [!badge variant="dark" size="xl" text="VideoEditCore"] ## Introduction This example demonstrates how to overlay an image on a video stream. JPG, PNG, BMP, and GIF images are supported. ## Sample code Most simple image overlay with image added from a file with custom position: ```csharp var effect = new VideoEffectImageLogo(true, "imageoverlay"); effect.Filename = @"logo.png"; effect.Left = 100; effect.Top = 100; VideoCapture1.Video_Effects_Add(effect); ``` ### Transparent image overlay SDK fully supports transparency in PNG images. If you want to set a custom transparency level, you can use the `TransparencyLevel` property with a range (0..255). ```csharp var effect = new VideoEffectImageLogo(true, "imageoverlay"); effect.Filename = @"logo.jpg"; effect.TransparencyLevel = 50; VideoCapture1.Video_Effects_Add(effect); ``` ### Animated GIF overlay You can overlay an animated GIF image on a video stream. The SDK will play the GIF animation in the overlay. 
```csharp var effect = new VideoEffectImageLogo(true, "imageoverlay"); effect.Filename = @"animated.gif"; effect.Animated = true; effect.AnimationEnabled = true; VideoCapture1.Video_Effects_Add(effect); ``` ### Image overlay from `System.Drawing.Bitmap` You can overlay an image from a `System.Drawing.Bitmap` object. ```csharp var effect = new VideoEffectImageLogo(true, "imageoverlay"); effect.MemoryBitmap = new Bitmap("logo.jpg"); VideoCapture1.Video_Effects_Add(effect); ``` ### Image overlay from RGB/RGBA byte array You can overlay an image from RGB/RGBA data. ```csharp // add image logo var effect = new VideoEffectImageLogo(true, "imageoverlay"); // load image from JPG file var bitmap = new Bitmap("logo.jpg"); // lock bitmap data and save to byte data (IntPtr) var bitmapData = bitmap.LockBits(new Rectangle(0, 0, bitmap.Width, bitmap.Height), ImageLockMode.ReadOnly, PixelFormat.Format24bppRgb); var pixels = Marshal.AllocCoTaskMem(bitmapData.Stride * bitmapData.Height); NativeAPI.CopyMemory(pixels, bitmapData.Scan0, bitmapData.Stride * bitmapData.Height); bitmap.UnlockBits(bitmapData); // set data to effect effect.Bitmap = pixels; // set bitmap properties effect.BitmapWidth = bitmap.Width; effect.BitmapHeight = bitmap.Height; effect.BitmapDepth = 3; // RGB24 // free bitmap bitmap.Dispose(); // add effect VideoCapture1.Video_Effects_Add(effect); ``` --- Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples. ---END OF PAGE--- # Local File: .\dotnet\general\video-effects\index.md --- title: Advanced Video Effects & Processing for .Net SDKs description: Enhance your applications with powerful video effects, overlays, and processing capabilities for .Net developers. Learn how to implement professional-grade visual effects, text/image overlays, and custom video processing in your .Net applications. 
sidebar_label: Video Effects And Processing
order: 15
---

# Video Effects and Processing for .Net Applications

[!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

## Introduction

Our .Net SDKs provide developers with an extensive array of video effects and processing capabilities. These powerful tools enable you to transform raw video content into polished, professional-quality media. Whether you need to add dynamic overlays, apply visual effects, or perform advanced video manipulation, these SDKs deliver the functionality required for sophisticated media applications.
## Available Video Effect Categories ### Real-time Effects * Color correction and grading * Blur and sharpening filters * Noise reduction algorithms * Chroma key (green screen) processing ### Video Enhancement * Resolution upscaling * Frame rate conversion * Dynamic contrast adjustment * HDR tone mapping ## Overlay Capabilities * [Text overlay](text-overlay.md) - Add customizable text with control over font, size, color, and animation * [Image overlay](image-overlay.md) - Incorporate logos, watermarks, and graphic elements with transparency support ## Video Processing Features ### Transformation Operations * Rotation, scaling, and cropping * Picture-in-picture effects * Custom aspect ratio conversion * Video composition and layering ### Advanced Processing * Timeline-based editing capabilities * Transition effects between scenes * Audio-video synchronization tools * Performance-optimized processing pipeline * [Video sample grabber](video-sample-grabber.md) - Extract frames and process video data in real-time ## Integration Methods Our SDKs are designed for seamless integration with your .Net applications. The architecture allows for both simple implementations and advanced customizations to meet your specific project requirements. ## More Information Numerous additional video effects and processing features are available in the SDKs. Please refer to the documentation for the specific SDK you are using for detailed implementation examples and API references. --- Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to access more code samples and implementation examples. ---END OF PAGE--- # Local File: .\dotnet\general\video-effects\text-overlay.md --- title: Advanced Text Overlays for .NET Video Processing description: Learn how to implement custom text overlays in video streams with complete control over font, size, color, position, rotation, and animation effects. 
Perfect for adding timestamps, captions, and dynamic text to your .NET video applications. sidebar_label: Text Overlay --- # Implementing Text Overlays in Video Streams [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) [!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="MediaPlayerCore"] [!badge variant="dark" size="xl" text="VideoEditCore"] ## Introduction Text overlays provide a powerful way to enhance video streams with dynamic information, branding, captions, or timestamps. This guide explores how to implement fully customizable text overlays with precise control over appearance, positioning, and animations. ## Classic Engine Implementation Our classic engines (VideoCaptureCore, MediaPlayerCore, VideoEditCore) offer a straightforward API for adding text to video streams. 
### Basic Text Overlay Implementation The following example demonstrates a simple text overlay with custom positioning: ```csharp var effect = new VideoEffectTextLogo(true, "textoverlay"); // set position effect.Left = 20; effect.Top = 20; // set Font (System.Drawing.Font) effect.Font = new Font("Arial", 40); // set text effect.Text = "Hello, world!"; // set text color effect.FontColor = Color.Yellow; MediaPlayer1.Video_Effects_Add(effect); ``` ### Dynamic Information Display Options #### Timestamp and Date Display You can automatically display current date, time, or video timestamp information using specialized modes: ```csharp // set mode and mask effect.Mode = TextLogoMode.DateTime; effect.DateTimeMask = "yyyy-MM-dd. hh:mm:ss"; ``` The SDK supports custom formatting masks for timestamps and dates, allowing precise control over the displayed information format. Frame number display requires no additional configuration. ### Animation and Transition Effects #### Implementing Fade Effects Create smooth text appearances and disappearances with customizable fade effects: ```csharp // add the fade-in effect.FadeIn = true; effect.FadeInDuration = TimeSpan.FromMilliseconds(5000); // add the fade-out effect.FadeOut = true; effect.FadeOutDuration = TimeSpan.FromMilliseconds(5000); ``` ### Text Rotation Options Rotate your text overlay to match your design requirements: ```csharp // set rotation mode effect.RotationMode = TextRotationMode.Rm90; ``` ### Text Flip Transformations Apply mirror effects to your text for creative presentations: ```csharp // set flip mode effect.FlipMode = TextFlipMode.XAndY; ``` ## X-Engine Implementation Our newer X-engines (VideoCaptureCoreX, MediaPlayerCoreX, VideoEditCoreX) provide an enhanced API with additional features. ### Basic X-Engine Text Overlay ```csharp // text overlay var textOverlay = new TextOverlayVideoEffect() { Text = "Hello World!" 
};

// set position
textOverlay.XPad = 20;
textOverlay.YPad = 20;
textOverlay.HorizontalAlignment = TextOverlayHAlign.Left;
textOverlay.VerticalAlignment = TextOverlayVAlign.Top;

// set font (FontSettings)
textOverlay.Font = new FontSettings("Arial", "Bold", 24);

// set text color
textOverlay.Color = SKColors.Yellow;

// add the effect
await videoCapture1.Video_Effects_AddOrUpdateAsync(textOverlay);
```

### Advanced Dynamic Content Display

#### Video Timestamp Integration

Display the current position within the video:

```csharp
// text overlay
var textOverlay = new TextOverlayVideoEffect();

// set text
textOverlay.Text = "Timestamp: ";

// set Timestamp mode
textOverlay.Mode = TextOverlayMode.Timestamp;

// add the effect
await videoCapture1.Video_Effects_AddOrUpdateAsync(textOverlay);
```

#### System Time Integration

Show the current system time alongside your video content:

```csharp
// text overlay
var textOverlay = new TextOverlayVideoEffect();

// set text
textOverlay.Text = "Time: ";

// set System Time mode
textOverlay.Mode = TextOverlayMode.SystemTime;

// add the effect
await videoCapture1.Video_Effects_AddOrUpdateAsync(textOverlay);
```

## Best Practices for Text Overlays

- Consider readability against different backgrounds
- Use appropriate font sizes for the target display resolution
- Implement fade effects for less intrusive overlays
- Test performance impact with complex text effects

---

For more code examples and implementation details, visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples).

---END OF PAGE---

# Local File: .\dotnet\general\video-effects\video-sample-grabber.md

---
title: Video sample grabber usage
description: C# code sample - how to use video sample grabber in Video Capture SDK .Net, Media Player SDK .Net, Video Edit SDK .Net.
sidebar_label: Video Sample Grabber Usage --- # Video sample grabber usage [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Getting RAW video frames as unmanaged memory pointer inside the structure +++ X-engines ```csharp // Subscribe to the video frame buffer event VideoCapture1.OnVideoFrameBuffer += OnVideoFrameBuffer; private void OnVideoFrameBuffer(object sender, VideoFrameXBufferEventArgs e) { // Process the VideoFrameX object ProcessFrame(e.Frame); // If you've modified the frame and want to update the video stream e.UpdateData = true; } // Example of processing a VideoFrameX frame - adjusting brightness private void ProcessFrame(VideoFrameX frame) { // Only process RGB/BGR/RGBA/BGRA formats if (frame.Format != VideoFormatX.RGB && frame.Format != VideoFormatX.BGR && frame.Format != VideoFormatX.RGBA && frame.Format != VideoFormatX.BGRA) { return; } // Get the data as a byte array for manipulation byte[] data = frame.ToArray(); // Determine the pixel size based on format int pixelSize = (frame.Format == VideoFormatX.RGB || frame.Format == VideoFormatX.BGR) ? 
3 : 4; // Brightness factor (1.2 = 20% brighter, 0.8 = 20% darker) float brightnessFactor = 1.2f; // Process each pixel for (int i = 0; i < data.Length; i += pixelSize) { // Adjust R, G, B channels for (int j = 0; j < 3; j++) { int newValue = (int)(data[i + j] * brightnessFactor); data[i + j] = (byte)Math.Min(255, newValue); } } // Copy the modified data back to the frame Marshal.Copy(data, 0, frame.Data, data.Length); } ``` +++ Classic engines ```csharp // Subscribe to the video frame buffer event VideoCapture1.OnVideoFrameBuffer += OnVideoFrameBuffer; private void OnVideoFrameBuffer(object sender, VideoFrameBufferEventArgs e) { // Process the VideoFrame structure ProcessFrame(e.Frame); // If you've modified the frame and want to update the video stream e.UpdateData = true; } // Example of processing a VideoFrame - adjusting brightness private void ProcessFrame(VideoFrame frame) { // Only process RGB format for this example if (frame.Info.Colorspace != RAWVideoColorSpace.RGB24) { return; } // Get the data as a byte array for manipulation byte[] data = frame.ToArray(); // Brightness factor (1.2 = 20% brighter, 0.8 = 20% darker) float brightnessFactor = 1.2f; // Process each pixel (RGB24 format = 3 bytes per pixel) for (int i = 0; i < data.Length; i += 3) { // Adjust R, G, B channels for (int j = 0; j < 3; j++) { int newValue = (int)(data[i + j] * brightnessFactor); data[i + j] = (byte)Math.Min(255, newValue); } } // Copy the modified data back to the frame Marshal.Copy(data, 0, frame.Data, data.Length); } ``` +++ Media Blocks SDK ```csharp // Create and set up video sample grabber block var videoSampleGrabberBlock = new VideoSampleGrabberBlock(VideoFormatX.RGB); videoSampleGrabberBlock.OnVideoFrameBuffer += OnVideoFrameBuffer; private void OnVideoFrameBuffer(object sender, VideoFrameXBufferEventArgs e) { // Process the VideoFrameX object ProcessFrame(e.Frame); // If you've modified the frame and want to update the video stream e.UpdateData = true; } // Example of 
processing a VideoFrameX frame - adjusting brightness private void ProcessFrame(VideoFrameX frame) { if (frame.Format != VideoFormatX.RGB) { return; } // Get the data as a byte array for manipulation byte[] data = frame.ToArray(); // Brightness factor (1.2 = 20% brighter, 0.8 = 20% darker) float brightnessFactor = 1.2f; // Process each pixel (RGB format = 3 bytes per pixel) for (int i = 0; i < data.Length; i += 3) { // Adjust R, G, B channels for (int j = 0; j < 3; j++) { int newValue = (int)(data[i + j] * brightnessFactor); data[i + j] = (byte)Math.Min(255, newValue); } } // Copy the modified data back to the frame Marshal.Copy(data, 0, frame.Data, data.Length); } ``` +++ ## Working with bitmap frames If you need to work with managed Bitmap objects instead of raw memory pointers, you can use the `OnVideoFrameBitmap` event of the `core` classes or the SampleGrabberBlock: ```csharp // Subscribe to the bitmap frame event VideoCapture1.OnVideoFrameBitmap += OnVideoFrameBitmap; private void OnVideoFrameBitmap(object sender, VideoFrameBitmapEventArgs e) { // Process the Bitmap object ProcessBitmap(e.Frame); // If you've modified the bitmap and want to update the video stream e.UpdateData = true; } // Example of processing a Bitmap - adjusting brightness private void ProcessBitmap(Bitmap bitmap) { // Use Bitmap methods or Graphics to manipulate the image // This example uses ColorMatrix for brightness adjustment // Create a graphics object from the bitmap using (Graphics g = Graphics.FromImage(bitmap)) { // Create a color matrix for brightness adjustment float brightnessFactor = 1.2f; // 1.0 = no change, >1.0 = brighter, <1.0 = darker ColorMatrix colorMatrix = new ColorMatrix(new float[][] { new float[] {brightnessFactor, 0, 0, 0, 0}, new float[] {0, brightnessFactor, 0, 0, 0}, new float[] {0, 0, brightnessFactor, 0, 0}, new float[] {0, 0, 0, 1, 0}, new float[] {0, 0, 0, 0, 1} }); // Create an ImageAttributes object and set the color matrix using (ImageAttributes 
attributes = new ImageAttributes()) { attributes.SetColorMatrix(colorMatrix); // Draw the image with the brightness adjustment g.DrawImage(bitmap, new Rectangle(0, 0, bitmap.Width, bitmap.Height), 0, 0, bitmap.Width, bitmap.Height, GraphicsUnit.Pixel, attributes); } } } ``` ## Working with SkiaSharp for cross-platform applications For cross-platform applications, the VideoSampleGrabberBlock provides the ability to work with SkiaSharp, a high-performance 2D graphics API for .NET. This is especially useful for applications targeting multiple platforms including mobile and web. ### Using the OnVideoFrameSKBitmap event ```csharp // First, add the SkiaSharp NuGet package to your project // Install-Package SkiaSharp // Import necessary namespaces using SkiaSharp; using VisioForge.Core.MediaBlocks.VideoProcessing; using VisioForge.Core.Types.X.Events; // Create a VideoSampleGrabberBlock with RGBA or BGRA format // Note: OnVideoFrameSKBitmap event works only with RGBA or BGRA formats var videoSampleGrabberBlock = new VideoSampleGrabberBlock(VideoFormatX.BGRA); // Enable the SaveLastFrame property if you want to take snapshots later videoSampleGrabberBlock.SaveLastFrame = true; // Subscribe to the SkiaSharp bitmap event videoSampleGrabberBlock.OnVideoFrameSKBitmap += OnVideoFrameSKBitmap; // Event handler for SkiaSharp bitmap frames private void OnVideoFrameSKBitmap(object sender, VideoFrameSKBitmapEventArgs e) { // Process the SKBitmap ProcessSKBitmap(e.Frame); // Note: Unlike VideoFrameBitmapEventArgs, VideoFrameSKBitmapEventArgs does not have // an UpdateData property as it's designed for frame viewing/analysis only } // Example of processing an SKBitmap - adjusting brightness private void ProcessSKBitmap(SKBitmap bitmap) { // Create a new bitmap to hold the processed image using (var surface = SKSurface.Create(new SKImageInfo(bitmap.Width, bitmap.Height))) { var canvas = surface.Canvas; // Set up a paint with a color filter for brightness adjustment using (var paint = 
new SKPaint()) { // Create a brightness filter (1.2 = 20% brighter) float brightnessFactor = 1.2f; var colorMatrix = new float[] { brightnessFactor, 0, 0, 0, 0, 0, brightnessFactor, 0, 0, 0, 0, 0, brightnessFactor, 0, 0, 0, 0, 0, 1, 0 }; paint.ColorFilter = SKColorFilter.CreateColorMatrix(colorMatrix); // Draw the original bitmap with the brightness filter applied canvas.DrawBitmap(bitmap, 0, 0, paint); // If you need to get the result as a new SKBitmap: var processedImage = surface.Snapshot(); using (var processedBitmap = SKBitmap.FromImage(processedImage)) { // Use processedBitmap for further operations or display // For example, display it in a SkiaSharp view // mySkiaView.SKBitmap = processedBitmap.Copy(); } } } } ``` ### Taking snapshots with SkiaSharp ```csharp // Create a method to capture and save a snapshot private void CaptureSnapshot(string filePath) { // Make sure SaveLastFrame was enabled on the VideoSampleGrabberBlock if (videoSampleGrabberBlock.SaveLastFrame) { // Get the last frame as an SKBitmap using (var bitmap = videoSampleGrabberBlock.GetLastFrameAsSKBitmap()) { if (bitmap != null) { // Save the bitmap to a file using (var image = SKImage.FromBitmap(bitmap)) using (var data = image.Encode(SKEncodedImageFormat.Png, 100)) using (var stream = File.OpenWrite(filePath)) { data.SaveTo(stream); } } } } } ``` ### Advantages of using SkiaSharp 1. **Cross-platform compatibility**: Works on Windows, macOS, Linux, iOS, Android, and WebAssembly 2. **Performance**: Provides high-performance graphics processing 3. **Modern API**: Offers a comprehensive set of drawing, filtering, and transformation functions 4. **Memory efficiency**: More efficient memory management compared to System.Drawing 5. **No platform dependencies**: No dependency on platform-specific imaging libraries ## Frame processing information You can get video frames from live sources or files using the `OnVideoFrameBuffer` and `OnVideoFrameBitmap` events. 
The `OnVideoFrameBuffer` event is faster and provides the unmanaged memory pointer for the decoded frame. The `OnVideoFrameBitmap` event is slower, but you get the decoded frame as a `Bitmap` object. ### Understanding the frame objects - **VideoFrameX** (X-engines): Contains frame data, dimensions, format, timestamp, and methods for manipulating raw video data - **VideoFrame** (Classic engines): Similar structure but with a different memory layout - **Common properties**: - Width/Height: Frame dimensions - Format/Colorspace: Pixel format (RGB, BGR, RGBA, etc.) - Stride: Number of bytes per scan line - Timestamp: Frame's position in the video timeline - Data: Pointer to unmanaged memory with pixel data ### Important considerations 1. The frame's pixel format affects how you process the data: - RGB/BGR: 3 bytes per pixel - RGBA/BGRA/ARGB: 4 bytes per pixel (with alpha channel) - YUV formats: Different component arrangements 2. Set `e.UpdateData = true` if you've modified the frame data and want the changes to be visible in the video stream. 3. For processing that requires multiple frames or complex operations, consider using a buffer or queue to store frames. 4. When using `OnVideoFrameSKBitmap`, select either RGBA or BGRA as the frame format when creating the VideoSampleGrabberBlock. --- Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page to get more code samples. 
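As a concrete illustration of the stride and bytes-per-pixel considerations above, the sketch below reads one pixel from a BGRA frame via its unmanaged data pointer. The frame members used here (`Data`, `Stride`) follow the property list above; `GetBgraPixel` itself is a hypothetical helper, not an SDK API.

```csharp
using System;
using System.Runtime.InteropServices;

// Illustrative sketch only: reads a single BGRA pixel from an unmanaged
// frame buffer. Assumes a VideoFrameX-style frame exposing Data (IntPtr)
// and Stride, as listed above.
static (byte B, byte G, byte R, byte A) GetBgraPixel(IntPtr data, int stride, int x, int y)
{
    // BGRA = 4 bytes per pixel. Stride is the byte count of one scan line,
    // which may be larger than Width * 4 because of row padding, so always
    // compute offsets from Stride, never from Width alone.
    int offset = y * stride + x * 4;
    byte b = Marshal.ReadByte(data, offset);
    byte g = Marshal.ReadByte(data, offset + 1);
    byte r = Marshal.ReadByte(data, offset + 2);
    byte a = Marshal.ReadByte(data, offset + 3);
    return (b, g, r, a);
}
```

If you write modified bytes back through the same pointer inside an `OnVideoFrameBuffer` handler, remember point 2 above: set `e.UpdateData = true` so the changes propagate into the video stream.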
---END OF PAGE--- # Local File: .\dotnet\general\video-encoders\av1.md --- title: AV1 encoders usage in VisioForge .Net SDKs description: AV1 encoders usage in Video Capture SDK .Net, Video Edit SDK .Net, and Media Blocks SDK .Net sidebar_label: AV1 --- # AV1 Encoders [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] VisioForge supports multiple AV1 encoder implementations, each with its own unique features and capabilities. This document covers the available encoders and their configuration options. Currently, AV1 encoders are supported in the cross-platform engines: `VideoCaptureCoreX`, `VideoEditCoreX`, and `Media Blocks SDK`. ## Available Encoders 1. [AMD AMF AV1 Encoder (AMF)](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.X.VideoEncoders.AMFAV1EncoderSettings.html) 2. [NVIDIA NVENC AV1 Encoder (NVENC)](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.X.VideoEncoders.NVENCAV1EncoderSettings.html) 3. [Intel QuickSync AV1 Encoder (QSV)](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.X.VideoEncoders.QSVAV1EncoderSettings.html) 4. [AOM AV1 Encoder](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.X.VideoEncoders.AOMAV1EncoderSettings.html) 5. [RAV1E Encoder](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.X.VideoEncoders.RAV1EEncoderSettings.html) You can use an AV1 encoder with [WebM output](../output-formats/webm.md) or for network streaming. 
## AMD AMF AV1 Encoder The AMD AMF AV1 encoder provides hardware-accelerated encoding using AMD graphics cards. ### Features - Multiple quality presets - Variable bitrate control modes - GOP size control - QP (Quantization Parameter) control - Smart Access Video support ### Rate Control Modes - `Default`: Depends on Usage - `CQP`: Constant QP - `LCVBR`: Latency Constrained VBR - `VBR`: Peak Constrained VBR - `CBR`: Constant Bitrate ### Sample Usage ```csharp var encoderSettings = new AMFAV1EncoderSettings { Bitrate = 3000, // 3 Mbps GOPSize = 30, // GOP size of 30 frames Preset = AMFAV1EncoderPreset.Quality, // Quality preset RateControl = AMFAV1RateControlMode.VBR, // Variable Bitrate mode Usage = AMFAV1EncoderUsage.Transcoding, // Transcoding usage MaxBitrate = 5000, // 5 Mbps max bitrate QpI = 26, // I-frame QP QpP = 26, // P-frame QP RefFrames = 1, // Number of reference frames SmartAccessVideo = false // Smart Access Video disabled }; ``` ## NVIDIA NVENC AV1 Encoder NVIDIA's NVENC AV1 encoder provides hardware-accelerated encoding using NVIDIA GPUs. 
### Features - Multiple encoding presets - Adaptive B-frame support - Temporal AQ (Adaptive Quantization) - VBV (Video Buffering Verifier) buffer control - Spatial AQ support ### Rate Control Modes - `Default`: Default mode - `ConstQP`: Constant Quantization Parameter - `CBR`: Constant Bitrate - `VBR`: Variable Bitrate - `CBR_LD_HQ`: Low-delay CBR, high quality - `CBR_HQ`: CBR, high quality (slower) - `VBR_HQ`: VBR, high quality (slower) ### Sample Usage ```csharp var encoderSettings = new NVENCAV1EncoderSettings { Bitrate = 3000, // 3 Mbps Preset = NVENCPreset.HighQuality, // High quality preset RateControl = NVENCRateControl.VBR, // Variable Bitrate mode GOPSize = 75, // GOP size of 75 frames MaxBitrate = 5000, // 5 Mbps max bitrate BFrames = 2, // 2 B-frames between I and P RCLookahead = 8, // 8 frames lookahead TemporalAQ = true, // Enable temporal AQ Tune = NVENCTune.HighQuality, // High quality tuning VBVBufferSize = 6000 // 6000k VBV buffer }; ``` ## Intel QuickSync AV1 Encoder Intel's QuickSync AV1 encoder provides hardware-accelerated encoding using Intel GPUs. ### Features - Low latency mode support - Configurable target usage - Reference frame control - Flexible GOP size settings ### Rate Control Modes - `CBR`: Constant Bitrate - `VBR`: Variable Bitrate - `CQP`: Constant Quantizer ### Sample Usage ```csharp var encoderSettings = new QSVAV1EncoderSettings { Bitrate = 2000, // 2 Mbps LowLatency = false, // Standard latency mode TargetUsage = 4, // Balanced quality/speed GOPSize = 30, // GOP size of 30 frames MaxBitrate = 4000, // 4 Mbps max bitrate QPI = 26, // I-frame QP QPP = 28, // P-frame QP RateControl = QSVAV1EncRateControl.VBR, // Variable Bitrate mode RefFrames = 1 // Number of reference frames }; ``` ## AOM AV1 Encoder The Alliance for Open Media (AOM) AV1 encoder is a software-based reference implementation. 
### Features - Buffer control settings - CPU usage optimization - Frame dropping support - Multi-threading capabilities - Super-resolution support ### Rate Control Modes - `VBR`: Variable Bit Rate Mode - `CBR`: Constant Bit Rate Mode - `CQ`: Constrained Quality Mode - `Q`: Constant Quality Mode ### Sample Usage ```csharp var encoderSettings = new AOMAV1EncoderSettings { BufferInitialSize = TimeSpan.FromMilliseconds(4000), BufferOptimalSize = TimeSpan.FromMilliseconds(5000), BufferSize = TimeSpan.FromMilliseconds(6000), CPUUsed = 4, // CPU usage level DropFrame = 0, // Disable frame dropping RateControl = AOMAV1EncoderEndUsageMode.VBR, // Variable Bitrate mode TargetBitrate = 256, // 256 Kbps Threads = 0, // Auto thread count UseRowMT = true, // Enable row-based threading SuperResMode = AOMAV1SuperResolutionMode.None // No super-resolution }; ``` ## RAV1E Encoder RAV1E is a fast and safe AV1 encoder written in Rust. ### Features - Speed preset control - Quantizer settings - Key frame interval control - Low latency mode - Psychovisual tuning ### Sample Usage ```csharp var encoderSettings = new RAV1EEncoderSettings { Bitrate = 3000, // 3 Mbps LowLatency = false, // Standard latency mode MaxKeyFrameInterval = 240, // Maximum keyframe interval MinKeyFrameInterval = 12, // Minimum keyframe interval MinQuantizer = 0, // Minimum quantizer value Quantizer = 100, // Base quantizer value SpeedPreset = 6, // Speed preset (0-10) Tune = RAV1EEncoderTune.Psychovisual // Psychovisual tuning }; ``` ## General Usage Notes 1. All encoders implement the `IAV1EncoderSettings` interface, providing a consistent way to create encoder blocks. 2. Each encoder has its own specific set of optimizations and trade-offs. 3. Hardware encoders (AMF, NVENC, QSV) generally provide better performance but may have specific hardware requirements. 4. Software encoders (AOM, RAV1E) offer more flexibility but may require more CPU resources. 
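Putting the usage notes above together, hardware-first encoder selection with a software fallback can be sketched as follows. This assumes each AV1 settings class exposes a static `IsAvailable()` check, mirroring the pattern shown for the HEVC and OMX encoder settings elsewhere in these docs; verify the exact method names against the API reference.

```csharp
// Hedged sketch: pick the best available AV1 encoder for the current system.
// All settings classes implement IAV1EncoderSettings (note 1 above), so the
// caller can treat the result uniformly when creating an encoder block.
IAV1EncoderSettings SelectAV1Encoder()
{
    if (NVENCAV1EncoderSettings.IsAvailable())
    {
        return new NVENCAV1EncoderSettings(); // NVIDIA GPU encoder
    }

    if (AMFAV1EncoderSettings.IsAvailable())
    {
        return new AMFAV1EncoderSettings(); // AMD GPU encoder
    }

    if (QSVAV1EncoderSettings.IsAvailable())
    {
        return new QSVAV1EncoderSettings(); // Intel GPU encoder
    }

    // Software fallback (note 4 above): always usable, but CPU-intensive.
    return new AOMAV1EncoderSettings();
}
```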
## Recommendations - For AMD GPUs: Use AMF encoder - For NVIDIA GPUs: Use NVENC encoder - For Intel GPUs: Use QSV encoder - For maximum quality: Use AOM encoder - For CPU-efficient encoding: Use RAV1E encoder ## Best Practices 1. Always check encoder availability before using it 2. Set appropriate bitrates based on your target resolution and framerate 3. Use appropriate GOP sizes based on your content type 4. Consider the trade-off between quality and encoding speed 5. Test different rate control modes to find the best fit for your use case ---END OF PAGE--- # Local File: .\dotnet\general\video-encoders\h264.md --- title: H264 encoders usage in VisioForge .Net SDKs description: H264 encoders usage in Video Capture SDK .Net, Video Edit SDK .Net, and Media Blocks SDK .Net sidebar_label: H264 --- # H264 Encoders [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] This document provides detailed information about available H264 encoders, their features, rate control options, and usage examples. For Windows-only engines check the [MP4 output](../output-formats/mp4.md) page. ## Overview The following H264 encoders are available: 1. AMD AMF H264 Encoder (GPU-accelerated) 2. NVIDIA NVENC H264 Encoder (GPU-accelerated) 3. Intel QSV H264 Encoder (GPU-accelerated) 4. OpenH264 Encoder (Software) 5. Apple Media H264 Encoder (Hardware-accelerated for Apple devices) 6. VAAPI H264 Encoder (Linux hardware acceleration) 7. 
Various OMX-based encoders (Platform-specific) ## AMD AMF H264 Encoder AMD's Advanced Media Framework (AMF) provides hardware-accelerated encoding on AMD GPUs. ### Key Features - Hardware-accelerated encoding - Multiple preset options (Balanced, Speed, Quality) - Configurable GOP size - CABAC entropy coding support - Various rate control methods ### Rate Control Options ```csharp public enum AMFH264EncoderRateControl { Default = -1, // Default, depends on usage CQP = 0, // Constant QP CBR = 1, // Constant bitrate VBR = 2, // Peak constrained VBR LCVBR = 3 // Latency Constrained VBR } ``` ### Sample Usage ```csharp var settings = new AMFH264EncoderSettings { Bitrate = 5000, // 5 Mbps CABAC = true, RateControl = AMFH264EncoderRateControl.CBR, Preset = AMFH264EncoderPreset.Quality, Profile = AMFH264EncoderProfile.Main, Level = AMFH264EncoderLevel.Level4_2, GOPSize = 30 }; var encoder = new H264EncoderBlock(settings); ``` ## NVIDIA NVENC H264 Encoder NVIDIA's hardware-based video encoder provides efficient H264 encoding on NVIDIA GPUs. ### Key Features - Hardware-accelerated encoding - B-frame support - Adaptive quantization - Multiple reference frames - Weighted prediction - Look-ahead support ### Rate Control Options Inherited from NVENCBaseEncoderSettings with additional H264-specific options: - Constant Bitrate (CBR) - Variable Bitrate (VBR) - Constant QP (CQP) - Quality-based VBR ### Sample Usage ```csharp var settings = new NVENCH264EncoderSettings { Bitrate = 5000, MaxBitrate = 8000, RCLookahead = 20, BFrames = 2, Profile = NVENCH264Profile.High, Level = NVENCH264Level.Level4_2, TemporalAQ = true }; var encoder = new H264EncoderBlock(settings); ``` ## Intel Quick Sync Video (QSV) H264 Encoder Intel's hardware-based video encoder available on Intel processors with integrated graphics. 
### Key Features - Hardware-accelerated encoding - Low latency mode - Multiple rate control methods - B-frame support - Intelligent rate control options ### Rate Control Options ```csharp public enum QSVH264EncRateControl { CBR = 1, // Constant Bitrate VBR = 2, // Variable Bitrate CQP = 3, // Constant Quantizer AVBR = 4, // Average Variable Bitrate LA_VBR = 8, // Look Ahead VBR ICQ = 9, // Intelligent CQP VCM = 10, // Video Conferencing Mode LA_ICQ = 11, // Look Ahead ICQ LA_HRD = 13, // HRD compliant LA QVBR = 14 // Quality-defined VBR } ``` ### Sample Usage ```csharp var settings = new QSVH264EncoderSettings { Bitrate = 5000, MaxBitrate = 8000, RateControl = QSVH264EncRateControl.VBR, Profile = QSVH264EncProfile.High, Level = QSVH264EncLevel.Level4_2, LowLatency = true, BFrames = 2 }; var encoder = new H264EncoderBlock(settings); ``` ## OpenH264 Encoder Cisco's open-source H264 software encoder. ### Key Features - Software-based encoding - Multiple complexity levels - Scene change detection - Adaptive quantization - Denoising support ### Rate Control Options ```csharp public enum OpenH264RCMode { Quality = 0, // Quality mode Bitrate = 1, // Bitrate mode Buffer = 2, // Buffer based Off = -1 // Rate control off } ``` ### Sample Usage ```csharp var settings = new OpenH264EncoderSettings { Bitrate = 5000, RateControl = OpenH264RCMode.Bitrate, Profile = OpenH264Profile.Main, Level = OpenH264Level.Level4_2, Complexity = OpenH264Complexity.Medium, EnableDenoise = true, SceneChangeDetection = true }; var encoder = new H264EncoderBlock(settings); ``` ## Apple Media H264 Encoder Hardware-accelerated encoder for Apple platforms. 
### Key Features - Hardware acceleration on Apple devices - Real-time encoding support - Frame reordering options - Quality-based encoding ### Sample Usage ```csharp var settings = new AppleMediaH264EncoderSettings { Bitrate = 5000, AllowFrameReordering = true, Quality = 0.8, Realtime = true }; var encoder = new H264EncoderBlock(settings); ``` ## VAAPI H264 Encoder Video Acceleration API encoder for Linux systems. ### Key Features - Hardware acceleration on Linux - Multiple profile support - Trellis quantization - B-frame support - Various rate control methods ### Rate Control Options ```csharp public enum VAAPIH264RateControl { CQP = 1, // Constant QP CBR = 2, // Constant bitrate VBR = 4, // Variable bitrate VBRConstrained = 5, // Constrained VBR ICQ = 7, // Intelligent CQP QVBR = 8 // Quality-defined VBR } ``` ### Sample Usage ```csharp var settings = new VAAPIH264EncoderSettings { Bitrate = 5000, RateControl = VAAPIH264RateControl.CBR, Profile = VAAPIH264EncoderProfile.Main, MaxBFrames = 2, Trellis = true, CABAC = true }; var encoder = new H264EncoderBlock(settings); ``` ## OpenMAX (OMX) H264 Encoders Guide OpenMAX (OMX) is a royalty-free cross-platform API that provides comprehensive streaming media codec and application portability by enabling accelerated multimedia components to be developed, integrated and programmatically accessed across multiple operating systems and silicon platforms. ### OMX Google H264 Encoder This is a baseline implementation primarily targeted at Android platforms. 
```csharp var settings = new OMXGoogleH264EncoderSettings(); // Configure via Properties dictionary settings.Properties["some_key"] = "value"; settings.ParseStream = true; // Enable stream parsing (disable for SRT) ``` Key characteristics: - Generic implementation - Suitable for most Android devices - Configurable through properties dictionary - Minimal direct parameter exposure for maximum compatibility ### OMX Qualcomm H264 Encoder Optimized for Qualcomm Snapdragon platforms, this encoder leverages hardware acceleration capabilities. ```csharp var settings = new OMXQualcommH264EncoderSettings { Bitrate = 6_000, // 6 Mbps IFrameInterval = 2, // Keyframe every 2 seconds ParseStream = true // Enable stream parsing }; ``` Key features: - Direct bitrate control - I-frame interval management - Hardware acceleration on Qualcomm platforms - Additional properties available through dictionary ### OMX Exynos H264 Encoder Specifically designed for Samsung Exynos platforms: ```csharp var settings = new OMXExynosH264EncoderSettings(); // Configure platform-specific options settings.Properties["quality_level"] = "high"; settings.Properties["hardware_acceleration"] = "true"; ``` Characteristics: - Samsung hardware optimization - Flexible configuration through properties - Hardware acceleration support - Platform-specific optimizations ### OMX SPRD H264 Encoder Designed for Spreadtrum (UNISOC) platforms: ```csharp var settings = new OMXSPRDH264EncoderSettings { Bitrate = 6_000, // Target bitrate IFrameInterval = 2, // GOP size in seconds ParseStream = true // Stream parsing flag }; ``` Features: - Hardware acceleration for SPRD chips - Direct bitrate control - Keyframe interval management - Additional platform-specific properties ## Common Properties and Usage All OMX encoders share some common characteristics: ```csharp // Common interface implementation public interface IH264EncoderSettings { bool ParseStream { get; set; } KeyFrameDetectedDelegate KeyFrameDetected { get; set; } 
H264EncoderType GetEncoderType(); MediaBlock CreateBlock(); } ``` Properties dictionary usage: ```csharp // Generic way to set platform-specific options settings.Properties["hardware_acceleration"] = "true"; settings.Properties["quality_preset"] = "balanced"; settings.Properties["thread_count"] = "4"; ``` ## Best Practices 1. **Encoder Selection** - Use hardware encoders (AMD, NVIDIA, Intel) when available for better performance - Fall back to OpenH264 when hardware encoding is not available - Use platform-specific encoders (Apple Media, VAAPI) when targeting specific platforms 2. **Rate Control Selection** - Use CBR for streaming applications where consistent bitrate is important - Use VBR for offline encoding where quality is more important than bitrate consistency - Use CQP for highest quality when bitrate is not a concern - Consider using look-ahead options for better quality when latency is not critical 3. **Performance Optimization** - Adjust GOP size based on content type (smaller for high motion, larger for static content) - Enable CABAC for better compression efficiency when latency is not critical - Use appropriate profile and level for target devices - Consider B-frames for better compression but be aware of latency impact 4. **Platform Detection**: ```csharp if (OMXSPRDH264EncoderSettings.IsAvailable()) { // Use SPRD encoder } else if (OMXQualcommH264EncoderSettings.IsAvailable()) { // Fall back to Qualcomm } else { // Fall back to Google implementation } ``` ## Platform-Specific Considerations 1. **Qualcomm Platforms**: - Best performance with native bitrate settings - Optimal for streaming when I-frame interval is 2-3 seconds - Hardware acceleration should be enabled when possible 2. **Exynos Platforms**: - Properties dictionary offers more fine-grained control - Consider using platform-specific quality presets - Monitor hardware acceleration status 3. 
**SPRD Platforms**: - Keep bitrate within platform capabilities - Use I-frame interval appropriate for content type - Consider memory constraints when setting properties 4. **General OMX**: - Always test on target hardware - Monitor encoder performance metrics - Have fallback options ready - Consider power consumption impact ---END OF PAGE--- # Local File: .\dotnet\general\video-encoders\hevc.md --- title: HEVC Encoding with VisioForge .Net SDKs description: Learn how to implement hardware HEVC encoding with AMD, NVIDIA, and Intel GPUs in your .NET applications sidebar_label: HEVC --- # HEVC Hardware Encoding in .NET Applications [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] This guide explores hardware-accelerated HEVC (H.265) encoding options available in VisioForge .NET SDKs. We'll cover implementation details for AMD, NVIDIA, and Intel GPU encoders, helping you choose the right solution for your video processing needs. For Windows-specific output formats, refer to our [MP4 output documentation](../output-formats/mp4.md). ## Hardware HEVC Encoders Overview Modern GPUs offer powerful hardware encoding capabilities that significantly outperform software-based solutions. VisioForge SDKs support three major hardware HEVC encoders: - **AMD AMF** - For AMD Radeon GPUs - **NVIDIA NVENC** - For NVIDIA GeForce and professional GPUs - **Intel QuickSync** - For Intel CPUs with integrated graphics Each encoder provides unique features and optimization options. 
Let's explore their capabilities and implementation details. ## AMD AMF HEVC Encoder AMD's Advanced Media Framework (AMF) delivers hardware-accelerated HEVC encoding on compatible Radeon GPUs. It balances encoding speed, quality, and efficiency for various scenarios. ### Key Features and Settings - **Rate Control Methods**: - `CQP` (Constant QP) for fixed quality settings - `LCVBR` (Latency Constrained VBR) for streaming - `VBR` (Variable Bitrate) for offline encoding - `CBR` (Constant Bitrate) for reliable bandwidth usage - **Usage Profiles**: - Transcoding (highest quality) - Ultra Low Latency (for real-time applications) - Low Latency (for interactive streaming) - Web Camera (optimized for webcam sources) - **Quality Presets**: Balance between encoding speed and output quality ### Implementation Example ```csharp var encoder = new AMFHEVCEncoderSettings { Bitrate = 3000, // 3 Mbps target bitrate MaxBitrate = 5000, // 5 Mbps peak bitrate RateControl = AMFHEVCEncoderRateControl.CBR, // Quality optimization Preset = AMFHEVCEncoderPreset.Quality, Usage = AMFHEVCEncoderUsage.Transcoding, // GOP and frame settings GOPSize = 30, // Keyframe interval QP_I = 22, // I-frame quantization parameter QP_P = 22, // P-frame quantization parameter RefFrames = 1 // Reference frames count }; ``` ## NVIDIA NVENC HEVC Encoder NVIDIA's NVENC technology provides dedicated encoding hardware on GeForce and professional GPUs, offering excellent performance and quality across various bitrates. 
### Key Capabilities - **Multiple Profile Support**: - Main (8-bit) - Main10 (10-bit HDR) - Main444 (high color precision) - Extended bit depth options (12-bit) - **Advanced Encoding Features**: - B-frame support with adaptive placement - Temporal Adaptive Quantization - Weighted Prediction - Look-ahead rate control - **Performance Presets**: From quality-focused to ultra-fast encoding ### Implementation Example ```csharp var encoder = new NVENCHEVCEncoderSettings { // Bitrate configuration Bitrate = 3000, // 3 Mbps target MaxBitrate = 5000, // 5 Mbps maximum // Profile settings Profile = NVENCHEVCProfile.Main, Level = NVENCHEVCLevel.Level5_1, // Quality enhancement options BFrames = 2, // Number of B-frames BAdaptive = true, // Adaptive B-frame placement TemporalAQ = true, // Temporal adaptive quantization WeightedPrediction = true, // Improves quality for fades RCLookahead = 20, // Frames to analyze for rate control // Buffer settings VBVBufferSize = 0 // Use default buffer size }; ``` ## Intel QuickSync HEVC Encoder Intel QuickSync leverages the integrated GPU present in modern Intel processors for efficient hardware encoding, making it accessible without a dedicated graphics card. 
### Key Features - **Versatile Rate Control Options**: - `CBR` (Constant Bitrate) - `VBR` (Variable Bitrate) - `CQP` (Constant Quantizer) - `ICQ` (Intelligent Constant Quality) - `VCM` (Video Conferencing Mode) - `QVBR` (Quality-defined VBR) - **Optimization Settings**: - Target Usage parameter (quality vs speed balance) - Low-latency mode for streaming - HDR conformance controls - Closed caption insertion options - **Profile Support**: - Main (8-bit) - Main10 (10-bit HDR) ### Implementation Example ```csharp var encoder = new QSVHEVCEncoderSettings { // Bitrate settings Bitrate = 3000, // 3 Mbps target MaxBitrate = 5000, // 5 Mbps peak RateControl = QSVHEVCEncRateControl.VBR, // Quality tuning TargetUsage = 4, // 1=Best quality, 7=Fastest encoding // Stream structure GOPSize = 30, // Keyframe interval RefFrames = 2, // Reference frames // Feature configuration Profile = QSVHEVCEncProfile.Main, LowLatency = false, // Enable for streaming // Advanced options CCInsertMode = QSVHEVCEncSEIInsertMode.Insert, DisableHRDConformance = false }; ``` ## Quality Presets for Simplified Configuration All encoders support standardized quality presets through the `VideoQuality` enum, providing a simplified configuration approach: - **Low**: 1 Mbps target, 2 Mbps max (for basic streaming) - **Normal**: 3 Mbps target, 5 Mbps max (for standard content) - **High**: 6 Mbps target, 10 Mbps max (for detailed content) - **Very High**: 15 Mbps target, 25 Mbps max (for premium quality) ### Using Quality Presets ```csharp // For AMD AMF var amfEncoder = new AMFHEVCEncoderSettings(VideoQuality.High); // For NVIDIA NVENC var nvencEncoder = new NVENCHEVCEncoderSettings(VideoQuality.High); // For Intel QuickSync var qsvEncoder = new QSVHEVCEncoderSettings(VideoQuality.High); ``` ## Hardware Detection and Fallback Strategy A robust implementation should check for encoder availability and implement appropriate fallbacks: ```csharp // Create the most appropriate encoder for the current system 
IHEVCEncoderSettings GetOptimalHEVCEncoder() { if (AMFHEVCEncoderSettings.IsAvailable()) { return new AMFHEVCEncoderSettings(VideoQuality.High); } else if (NVENCHEVCEncoderSettings.IsAvailable()) { return new NVENCHEVCEncoderSettings(VideoQuality.High); } else if (QSVHEVCEncoderSettings.IsAvailable()) { return new QSVHEVCEncoderSettings(VideoQuality.High); } else { // Fall back to software encoder if no hardware is available return new SoftwareHEVCEncoderSettings(VideoQuality.High); } } ``` ## Best Practices for HEVC Encoding ### 1. Encoder Selection - **AMD GPUs**: Best for applications where you know users have AMD hardware - **NVIDIA GPUs**: Provides consistent quality across generations, ideal for professional applications - **Intel QuickSync**: Great universal option when a dedicated GPU isn't guaranteed ### 2. Rate Control Selection - **Streaming**: Use CBR for consistent bandwidth utilization - **VoD Content**: VBR provides better quality at the same file size - **Archival**: CQP ensures consistent quality regardless of content complexity ### 3. Performance Optimization - Lower the reference frames count for faster encoding - Adjust GOP size based on content type (smaller for high motion, larger for static scenes) - Consider disabling B-frames for ultra-low latency applications ### 4. Quality Enhancement - Enable adaptive quantization features for content with varying complexity - Use weighted prediction for content with fades or gradual transitions - Implement look-ahead when encoding quality is more important than latency ## Common Troubleshooting 1. **Encoder unavailability**: Ensure GPU drivers are up-to-date 2. **Lower than expected quality**: Check if quality presets match your content type 3. **Performance issues**: Monitor GPU utilization and adjust settings accordingly 4. 
**Compatibility problems**: Verify target devices support the selected HEVC profile ## Conclusion Hardware-accelerated HEVC encoding offers significant performance advantages for .NET applications dealing with video processing. By leveraging AMD AMF, NVIDIA NVENC, or Intel QuickSync through VisioForge SDKs, you can achieve optimal balance between quality, speed, and efficiency. Choose the right encoder and settings based on your specific requirements, target audience, and content type to deliver the best possible experience in your applications. Start by detecting available hardware encoders, implementing appropriate quality settings, and testing across various content types to ensure optimal results. ---END OF PAGE--- # Local File: .\dotnet\general\video-encoders\index.md --- title: Complete Guide to Video Encoders in VisioForge .NET SDKs description: Detailed overview of video encoders for .NET developers using Video Capture, Video Edit, and Media Blocks SDKs - features, performance, and implementation sidebar_label: Video Encoders order: 19 --- # Video Encoders in VisioForge .NET SDKs [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Introduction to Video Encoders Video encoders are essential components in multimedia processing applications, responsible for compressing video data while maintaining optimal quality. VisioForge .NET SDKs incorporate multiple advanced encoders to meet diverse development requirements across different platforms and use cases. 
This guide provides detailed information about each encoder's capabilities, performance characteristics, and implementation details to help .NET developers make informed decisions for their multimedia applications. ## Hardware vs. Software Encoding When developing video processing applications, choosing between hardware and software encoders significantly impacts application performance and user experience. ### Hardware-Accelerated Encoders Hardware encoders utilize dedicated processing units (GPUs or specialized hardware): - **Advantages**: Lower CPU usage, higher encoding speeds, improved battery efficiency - **Use cases**: Real-time streaming, live video processing, mobile applications - **Examples in our SDK**: NVIDIA NVENC, AMD AMF, Intel QuickSync ### Software Encoders Software encoders run on the CPU without specialized hardware: - **Advantages**: Greater compatibility, more quality control options, platform independence - **Use cases**: High-quality offline encoding, environments without compatible hardware - **Examples in our SDK**: OpenH264, Software MJPEG encoder ## Available Video Encoders Our SDKs provide extensive encoder options to accommodate various project requirements: ### H.264 (AVC) Encoders H.264 remains one of the most widely used video codecs, offering excellent compression efficiency and broad compatibility. #### Key Features: - Multiple profile support (Baseline, Main, High) - Adjustable bitrate controls (CBR, VBR, CQP) - B-frame and reference frame configuration - Hardware acceleration options from major vendors [Learn more about H.264 encoders →](h264.md) ### HEVC (H.265) Encoders HEVC delivers superior compression efficiency compared to H.264, enabling higher quality video at the same bitrate or comparable quality at lower bitrates. 
#### Key Features: - Approximately 50% better compression than H.264 - 8-bit and 10-bit color depth support - Multiple hardware acceleration options - Advanced rate control mechanisms [Learn more about HEVC encoders →](hevc.md) ### AV1 Encoder AV1 represents the next generation of video codecs, offering superior compression efficiency that is particularly suited for web streaming. #### Key Features: - Royalty-free open standard - Better compression than HEVC - Increasing browser and device support - Optimized for web content delivery [Learn more about AV1 encoder →](av1.md) ### MJPEG Encoders Motion JPEG provides frame-by-frame JPEG compression, useful for specific applications where individual frame access is important. #### Key Features: - Simple implementation - Low encoding latency - Independent frame access - Hardware and software implementations [Learn more about MJPEG encoders →](mjpeg.md) ### VP8 and VP9 Encoders These open codecs developed by Google offer royalty-free alternatives with good compression efficiency. #### Key Features: - Open-source implementation - Competitive quality-to-bitrate ratio - Wide web browser support - Suitable for WebM container format [Learn more about VP8/VP9 encoders →](vp8-vp9.md) ### Windows Media Video Encoder The WMV encoder provides compatibility with the Windows ecosystem and legacy applications. 
#### Key Features: - Native Windows integration - Multiple profile options - Compatible with Windows Media framework - Efficient for Windows-centric deployments [Learn more about WMV encoder →](../output-formats/wmv.md) ## Encoder Selection Guidelines Selecting the optimal encoder depends on various factors: ### Platform Compatibility - **Windows**: All encoders supported - **macOS**: Apple Media encoders, OpenH264, AV1 - **Linux**: VAAPI, OpenH264, software implementations ### Hardware Requirements When using hardware-accelerated encoders, verify system compatibility: ```csharp // Check availability of hardware encoders if (NVENCEncoderSettings.IsAvailable()) { // Use NVIDIA encoder } else if (AMFEncoderSettings.IsAvailable()) { // Use AMD encoder } else if (QSVEncoderSettings.IsAvailable()) { // Use Intel encoder } else { // Fallback to software encoder } ``` ### Quality vs. Performance Tradeoffs Different encoders offer varying balances between quality and encoding speed: | Encoder Type | Quality | Performance | CPU Usage | |--------------|---------|-------------|-----------| | NVENC H.264 | Good | Excellent | Very Low | | NVENC HEVC | Very Good | Very Good | Very Low | | AMF H.264 | Good | Very Good | Very Low | | QSV H.264 | Good | Excellent | Very Low | | OpenH264 | Good-Excellent | Moderate | High | | AV1 | Excellent | Poor-Moderate | Very High | ### Encoding Scenarios - **Live streaming**: Prefer hardware encoders with CBR rate control - **Video recording**: Hardware encoders with VBR for better quality/size balance - **Offline processing**: Quality-focused encoders with VBR or CQP - **Low-latency applications**: Hardware encoders with low-latency presets ## Performance Optimization Maximize encoder efficiency with these best practices: 1. **Match output resolution to content requirements** - Avoid unnecessary upscaling 2. **Select appropriate bitrates** - Higher isn't always better; target your delivery medium 3. 
**Choose encoder presets wisely** - Faster presets use less CPU but may reduce quality 4. **Enable scene detection** for improved quality at scene changes 5. **Use hardware acceleration** when available for real-time applications ## Conclusion VisioForge .NET SDKs provide a comprehensive set of video encoders to meet diverse requirements across different platforms and use cases. By understanding the strengths and configurations of each encoder, developers can create high-performance video applications with optimal quality and efficiency. For specific encoder configuration details, refer to the dedicated documentation pages for each encoder type linked throughout this guide. ---END OF PAGE--- # Local File: .\dotnet\general\video-encoders\mjpeg.md --- title: Motion JPEG (MJPEG) Encoders in VisioForge .NET SDKs description: Complete guide to implementing MJPEG video encoders in .NET applications using VisioForge SDKs, with CPU and GPU acceleration options sidebar_label: Motion JPEG --- # Motion JPEG (MJPEG) Video Encoders for .NET Applications [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] ## Introduction to MJPEG Encoding in VisioForge The VisioForge .NET SDK suite provides robust Motion JPEG (MJPEG) encoder implementations designed for efficient video processing in your applications. MJPEG remains a popular choice for many video applications due to its simplicity, compatibility, and specific use cases where frame-by-frame compression is advantageous. 
This documentation provides a detailed exploration of the two MJPEG encoder options available in the VisioForge library: 1. CPU-based MJPEG encoder - The default implementation utilizing processor resources 2. GPU-accelerated Intel QuickSync MJPEG encoder - Hardware-accelerated option for compatible systems Both implementations offer developers flexible configuration options while maintaining the core MJPEG functionality through the unified `IMJPEGEncoderSettings` interface. ## What is MJPEG and Why Use It? Motion JPEG (MJPEG) is a video compression format where each video frame is compressed separately as a JPEG image. Unlike more modern codecs such as H.264 or H.265 that use temporal compression across frames, MJPEG treats each frame independently. ### Key Advantages of MJPEG - **Frame-by-frame processing**: Each frame maintains independent quality without temporal artifacts - **Lower latency**: Minimal processing delay makes it suitable for real-time applications - **Editing friendly**: Individual frame access simplifies non-linear editing workflows - **Resilience to motion**: Maintains quality during scenes with significant movement - **Universal compatibility**: Works across platforms without specialized hardware decoders - **Simplified development**: Straightforward implementation in various programming environments ### Common Use Cases MJPEG encoding is particularly valuable in scenarios such as: - **Security and surveillance systems**: Where frame quality and reliability are critical - **Video capture applications**: Real-time video recording with minimal latency - **Medical imaging**: When individual frame fidelity is essential - **Industrial vision systems**: For consistent frame-by-frame analysis - **Multimedia editing software**: Where rapid seeking and frame extraction is required - **Streaming in bandwidth-limited environments**: Where consistent quality is preferred over file size ## MJPEG Implementation in VisioForge Both MJPEG encoder 
implementations in VisioForge SDKs derive from the `IMJPEGEncoderSettings` interface, ensuring a consistent approach regardless of which encoder you choose. This design allows for easy switching between implementations based on performance requirements and hardware availability. ### Core Interface and Common Properties The shared interface exposes essential properties and methods: - **Quality**: Integer value from 10-100 controlling compression level - **CreateBlock()**: Factory method to generate the encoder processing block - **IsAvailable()**: Static method to verify encoder support on the current system ## CPU-based MJPEG Encoder The CPU-based encoder serves as the default implementation, providing reliable encoding across virtually all system configurations. It performs all encoding operations using the CPU, making it a universally compatible choice for MJPEG encoding. ### Features and Specifications - **Processing method**: Pure CPU-based encoding - **Quality range**: 10-100 (higher values = better quality, larger files) - **Default quality**: 85 (balances quality and file size) - **Performance characteristics**: Scales with CPU cores and processing power - **Memory usage**: Moderate, dependent on frame resolution and processing settings - **Compatibility**: Works on any system supporting the .NET runtime - **Specialized hardware**: None required ### Detailed Implementation Example ```csharp // Import the necessary VisioForge namespaces using VisioForge.Core.Types.Output; // Create a new instance of the CPU-based encoder settings var mjpegSettings = new MJPEGEncoderSettings(); // Configure quality (10-100) mjpegSettings.Quality = 85; // Default balanced quality // Optional: Verify encoder availability if (MJPEGEncoderSettings.IsAvailable()) { // Create the encoder processing block var encoderBlock = mjpegSettings.CreateBlock(); // Add the encoder block to your processing pipeline pipeline.AddBlock(encoderBlock); // Additional pipeline configuration // ... 
// Start the encoding process await pipeline.StartAsync(); } else { // Handle encoder unavailability Console.WriteLine("CPU-based MJPEG encoder is not available on this system."); } ``` ### Quality-to-Size Relationship The quality setting directly affects both the visual quality and resulting file size: | Quality Setting | Visual Quality | File Size | Recommended Use Case | |----------------|---------------|-----------|----------------------| | 10-30 | Very Low | Smallest | Archival, minimal bandwidth | | 31-60 | Low | Small | Web previews, thumbnails | | 61-80 | Medium | Moderate | Standard recording | | 81-95 | High | Large | Professional applications | | 96-100 | Maximum | Largest | Critical visual analysis | ## Intel QuickSync MJPEG Encoder For systems with compatible Intel hardware, the QuickSync MJPEG encoder offers GPU-accelerated encoding performance. This implementation leverages Intel's QuickSync Video technology to offload encoding operations from the CPU to dedicated media processing hardware. 
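The CPU encoder's quality-to-size tiers described above can be folded into a small selection helper. This is an illustrative sketch, not part of the SDK API: the class, method, and tier names are invented here, and the returned values simply fall inside the ranges from the table.

```csharp
using System;

public static class MjpegQualityAdvisor
{
    // Suggests a Quality value (10-100) for MJPEGEncoderSettings.Quality,
    // following the quality-to-size table above. Tier names are illustrative.
    public static int SuggestQuality(string useCase) => useCase switch
    {
        "archival"     => 25, // 10-30: smallest files, minimal bandwidth
        "preview"      => 50, // 31-60: web previews, thumbnails
        "recording"    => 75, // 61-80: standard recording
        "professional" => 90, // 81-95: professional applications
        "analysis"     => 98, // 96-100: critical visual analysis
        _              => 85  // SDK default: balanced quality and size
    };
}
```

The suggested value can then be assigned to the encoder settings' `Quality` property before calling `CreateBlock()`.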
### Hardware Requirements - Intel CPU with integrated graphics supporting QuickSync Video - Supported processor families: - Intel Core i3/i5/i7/i9 (6th generation or newer recommended) - Intel Xeon with compatible graphics - Select Intel Pentium and Celeron processors with HD Graphics ### Features and Advantages - **Hardware acceleration**: Dedicated media processing engines - **Quality range**: 10-100 (same as CPU-based encoder) - **Default quality**: 85 - **Preset profiles**: Four predefined quality configurations - **Reduced CPU load**: Frees processor resources for other tasks - **Power efficiency**: Lower energy consumption during encoding - **Performance gain**: Up to 3x faster than CPU-based encoding (hardware dependent) ### Implementation Examples #### Basic Implementation ```csharp // Import required namespaces using VisioForge.Core.Types.Output; // Create QuickSync MJPEG encoder with default settings var qsvEncoder = new QSVMJPEGEncoderSettings(); // Verify hardware support if (QSVMJPEGEncoderSettings.IsAvailable()) { // Set custom quality value qsvEncoder.Quality = 90; // Higher quality setting // Create and add encoder block var encoderBlock = qsvEncoder.CreateBlock(); pipeline.AddBlock(encoderBlock); // Continue pipeline setup } else { // Fall back to CPU-based encoder Console.WriteLine("QuickSync hardware not detected. 
Falling back to CPU encoder."); var cpuEncoder = new MJPEGEncoderSettings(); pipeline.AddBlock(cpuEncoder.CreateBlock()); } ``` #### Using Preset Quality Profiles ```csharp // Create encoder with preset quality profile var highQualityEncoder = new QSVMJPEGEncoderSettings(VideoQuality.High); // Or select other preset profiles var lowQualityEncoder = new QSVMJPEGEncoderSettings(VideoQuality.Low); var normalQualityEncoder = new QSVMJPEGEncoderSettings(VideoQuality.Normal); var veryHighQualityEncoder = new QSVMJPEGEncoderSettings(VideoQuality.VeryHigh); // Check availability and create encoder block if (QSVMJPEGEncoderSettings.IsAvailable()) { var encoderBlock = highQualityEncoder.CreateBlock(); // Use encoder in pipeline } ``` ### Quality Preset Mapping The QuickSync implementation provides convenient preset quality profiles that map to specific quality values: | Preset Profile | Quality Value | Suitable Applications | |---------------|--------------|----------------------| | Low | 60 | Surveillance, monitoring, archiving | | Normal | 75 | Standard recording, web content | | High | 85 | Default for most applications | | VeryHigh | 95 | Professional video production | ## Performance Optimization Guidelines Achieving optimal MJPEG encoding performance requires careful consideration of several factors: ### System Configuration Recommendations 1. **Memory allocation**: Ensure sufficient RAM for frame buffering (minimum 8GB recommended) 2. **Storage throughput**: Use SSD storage for best write performance during encoding 3. **CPU considerations**: Multi-core processors benefit the CPU-based encoder 4. **GPU drivers**: Keep Intel graphics drivers updated for QuickSync performance 5. **Background processes**: Minimize competing system processes during encoding ### Code-Level Optimization Techniques 1. **Frame size selection**: Consider downscaling before encoding for better performance 2. **Quality selection**: Balance visual requirements against performance needs 3. 
**Pipeline design**: Minimize unnecessary processing stages before encoding 4. **Error handling**: Implement graceful fallback between encoder types 5. **Threading model**: Respect the threading model of the VisioForge pipeline ## Best Practices for MJPEG Implementation To ensure reliable and efficient MJPEG encoding in your applications: 1. **Always check availability**: Use the `IsAvailable()` method before creating encoder instances 2. **Implement encoder fallback**: Have CPU-based encoding as a backup when QuickSync is unavailable 3. **Quality testing**: Test different quality settings with your specific video content 4. **Performance monitoring**: Monitor CPU/GPU usage during encoding to identify bottlenecks 5. **Exception handling**: Handle potential encoder initialization failures gracefully 6. **Version compatibility**: Ensure SDK version compatibility with your development environment 7. **License validation**: Verify proper licensing for your production environment ## Troubleshooting Common Issues ### QuickSync Availability Problems - Ensure Intel drivers are up-to-date - Verify BIOS settings haven't disabled integrated graphics - Check for competing GPU-accelerated applications ### Performance Issues - Monitor system resource usage during encoding - Reduce input frame resolution or frame rate if necessary - Consider quality setting adjustments ### Quality Problems - Increase quality settings for better visual results - Examine source material for pre-existing quality issues - Consider frame pre-processing for problematic source material ## Conclusion The VisioForge .NET SDK provides flexible MJPEG encoding options suitable for a wide range of development scenarios. By understanding the characteristics and configuration options of both the CPU-based and QuickSync implementations, developers can make informed decisions about which encoder best fits their application requirements. 
Whether prioritizing universal compatibility with the CPU-based encoder or leveraging hardware acceleration with the QuickSync implementation, the consistent interface and comprehensive feature set enable efficient video processing while maintaining the frame-independent nature of MJPEG encoding that makes it valuable for specific video processing applications. ---END OF PAGE--- # Local File: .\dotnet\general\video-encoders\vp8-vp9.md --- title: Implementing VP8 and VP9 Encoders in VisioForge .Net SDK description: Learn how to configure VP8 and VP9 video encoders in VisioForge SDK for optimal streaming, recording and processing performance sidebar_label: VP8/VP9 --- # VP8 and VP9 Video Encoders Guide [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) This guide shows you how to implement VP8 and VP9 video encoding in VisioForge .NET SDKs. You'll learn about the available encoder options and how to optimize them for your specific application needs. 
## Encoder Options Overview VisioForge SDK provides multiple encoder implementations based on your platform requirements: ### Windows Platform Encoders [!badge variant="dark" size="xl" text="VideoCaptureCore"] [!badge variant="dark" size="xl" text="VideoEditCore"] - Software-based VP8 and VP9 encoders configured through the [WebMOutput](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.Output.WebMOutput.html) class ### Cross-Platform X-Engine Options [!badge variant="dark" size="xl" text="VideoCaptureCoreX"] [!badge variant="dark" size="xl" text="VideoEditCoreX"] [!badge variant="dark" size="xl" text="MediaBlocksPipeline"] - VP8 software encoder via [VP8EncoderSettings](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.X.VideoEncoders.VP8EncoderSettings.html) - VP9 software encoder via [VP9EncoderSettings](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.X.VideoEncoders.VP9EncoderSettings.html) - Hardware-accelerated Intel GPU VP9 encoder via [QSVVP9EncoderSettings](https://api.visioforge.org/dotnet/api/VisioForge.Core.Types.X.VideoEncoders.QSVVP9EncoderSettings.html) for integrated GPUs ## Bitrate Control Strategies All VP8 and VP9 encoders support different bitrate control modes to match your application requirements: ### Constant Bitrate (CBR) CBR maintains consistent bitrate throughout the encoding process, making it ideal for: - Live streaming applications - Scenarios with bandwidth limitations - Real-time video communication **Implementation Examples:** With `WebMOutput` (Windows): ```csharp var webmOutput = new WebMOutput(); webmOutput.Video_EndUsage = VP8EndUsageMode.CBR; webmOutput.Video_Encoder = WebMVideoEncoder.VP8; webmOutput.Video_Bitrate = 2000; // 2 Mbps ``` With `VP8EncoderSettings`: ```csharp var vp8 = new VP8EncoderSettings(); vp8.RateControl = VPXRateControl.CBR; vp8.TargetBitrate = 2000; // 2 Mbps ``` With `VP9EncoderSettings`: ```csharp var vp9 = new VP9EncoderSettings(); vp9.RateControl = VPXRateControl.CBR; 
vp9.TargetBitrate = 2000; // 2 Mbps ``` With Intel GPU encoder: ```csharp var vp9qsv = new QSVVP9EncoderSettings(); vp9qsv.RateControl = QSVVP9EncRateControl.CBR; vp9qsv.Bitrate = 2000; // 2 Mbps ``` ### Variable Bitrate (VBR) VBR dynamically adjusts bitrate based on content complexity, best for: - Non-live video encoding - Scenarios prioritizing visual quality over file size - Content with varying visual complexity **Implementation Examples:** With `WebMOutput` (Windows): ```csharp var webmOutput = new WebMOutput(); webmOutput.Video_EndUsage = VP8EndUsageMode.VBR; webmOutput.Video_Encoder = WebMVideoEncoder.VP8; webmOutput.Video_Bitrate = 3000; // 3 Mbps target ``` With `VP8EncoderSettings`: ```csharp var vp8 = new VP8EncoderSettings(); vp8.RateControl = VPXRateControl.VBR; vp8.TargetBitrate = 3000; ``` With `VP9EncoderSettings`: ```csharp var vp9 = new VP9EncoderSettings(); vp9.RateControl = VPXRateControl.VBR; vp9.TargetBitrate = 3000; ``` With Intel GPU encoder: ```csharp var vp9qsv = new QSVVP9EncoderSettings(); vp9qsv.RateControl = QSVVP9EncRateControl.VBR; vp9qsv.Bitrate = 3000; ``` ## Quality-Focused Encoding Modes These modes prioritize consistent visual quality over specific bitrate targets: ### Constant Quality (CQ) Mode Available for software VP8 and VP9 encoders: ```csharp var vp8 = new VP8EncoderSettings(); vp8.RateControl = VPXRateControl.CQ; vp8.CQLevel = 20; // Quality level (0-63, lower values = better quality) ``` ```csharp var vp9 = new VP9EncoderSettings(); vp9.RateControl = VPXRateControl.CQ; vp9.CQLevel = 20; ``` ### Intel QSV Quality Modes Intel's hardware encoder supports two quality-focused modes: **Intelligent Constant Quality (ICQ):** ```csharp var vp9qsv = new QSVVP9EncoderSettings(); vp9qsv.RateControl = QSVVP9EncRateControl.ICQ; vp9qsv.ICQQuality = 25; // 20-27 recommended for balanced quality ``` **Constant Quantization Parameter (CQP):** ```csharp var vp9qsv = new QSVVP9EncoderSettings(); vp9qsv.RateControl = 
QSVVP9EncRateControl.CQP; vp9qsv.QPI = 26; // I-frame QP vp9qsv.QPP = 28; // P-frame QP ``` ## VP9 Performance Optimization VP9 encoders offer additional features for enhanced performance: ### Adaptive Quantization Improves visual quality by allocating more bits to complex areas: ```csharp var vp9 = new VP9EncoderSettings(); vp9.AQMode = VPXAdaptiveQuantizationMode.Variance; // Enable variance-based AQ ``` ### Parallel Processing Speeds up encoding through multi-threading and tile-based processing: ```csharp var vp9 = new VP9EncoderSettings(); vp9.FrameParallelDecoding = true; // Enable parallel frame processing vp9.RowMultithread = true; // Enable row-based multithreading vp9.TileColumns = 6; // Set number of tile columns (log2) vp9.TileRows = 0; // Set number of tile rows (log2) ``` ## Error Resilience Settings Both VP8 and VP9 support error resilience for robust streaming over unreliable networks: Using `WebMOutput` (Windows): ```csharp var webmOutput = new WebMOutput(); webmOutput.Video_ErrorResilient = true; // Enable error resilience ``` Using software encoders: ```csharp var vpx = new VP8EncoderSettings(); // or VP9EncoderSettings vpx.ErrorResilient = VPXErrorResilientFlags.Default | VPXErrorResilientFlags.Partitions; ``` ## Performance Tuning Options Optimize encoding performance with these settings: ```csharp var vpx = new VP8EncoderSettings(); // or VP9EncoderSettings vpx.CPUUsed = 0; // Range: -16 to 16, higher values favor speed over quality vpx.NumOfThreads = 4; // Specify number of encoding threads vpx.TokenPartitions = VPXTokenPartitions.Eight; // Enable parallel token processing ``` ## Best Practices for VP8/VP9 Encoding ### Rate Control Selection Choose the appropriate rate control mode based on your application: - **CBR** for live streaming and real-time communication - **VBR** for offline encoding where quality is the priority - **Quality-based modes** (CQ, ICQ, CQP) for highest possible quality regardless of bitrate ### Performance Optimization 
- Adjust `CPUUsed` to balance quality and encoding speed - Enable multithreading for faster encoding on multi-core systems - Use tile-based parallelism in VP9 for better hardware utilization ### Error Recovery - Enable error resilience when streaming over unreliable networks - Configure token partitioning for improved error recovery - Consider frame reordering limitations for low-latency applications ### Quality Optimization - Use adaptive quantization in VP9 for better quality distribution - Consider two-pass encoding for offline encoding scenarios - Adjust quantizer settings based on content type and target quality By following this guide, you'll be able to effectively implement and configure VP8 and VP9 encoders in your VisioForge .NET applications for optimal performance and quality. ---END OF PAGE--- # Local File: .\dotnet\install\avalonia.md --- title: Integrate Media SDKs with Avalonia Applications description: Learn how to implement powerful video and media capabilities in cross-platform Avalonia projects. This guide covers setup, configuration, and optimization across Windows, macOS, Linux, Android, and iOS platforms, with platform-specific requirements and best practices for seamless integration. sidebar_label: Avalonia order: 14 --- # Building Media-Rich Avalonia Applications with VisioForge ## Framework Overview Avalonia UI stands out as a versatile, truly cross-platform .NET UI framework with support spanning desktop environments (Windows, macOS, Linux) and mobile platforms (iOS and Android). VisioForge enhances this ecosystem through the specialized `VisioForge.DotNet.Core.UI.Avalonia` package, which delivers high-performance multimedia controls tailored for Avalonia's architecture. 
Our suite of SDKs empowers Avalonia developers with extensive multimedia capabilities: [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net)

## Setup and Configuration

### Essential Package Installation

Creating an Avalonia application with VisioForge multimedia capabilities requires installing several key NuGet components:

1. Avalonia-specific UI layer: `VisioForge.DotNet.Core.UI.Avalonia`
2. Core functionality package: `VisioForge.DotNet.Core` (or specialized SDK variant)
3. Platform-specific native bindings (covered in detail in later sections)

Add these to your project manifest (`.csproj`):

```xml
<ItemGroup>
  <!-- Replace "..." with the SDK version you target -->
  <PackageReference Include="VisioForge.DotNet.Core" Version="..." />
  <PackageReference Include="VisioForge.DotNet.Core.UI.Avalonia" Version="..." />
</ItemGroup>
```

### Avalonia Initialization Architecture

A key advantage of VisioForge's Avalonia integration is its seamless initialization model. Unlike some frameworks requiring explicit global setup, the Avalonia controls become available immediately once the core package is referenced. Your standard Avalonia bootstrap code in `Program.cs` remains unchanged:

```csharp
using Avalonia;
using System;

namespace YourAppNamespace;

class Program
{
    [STAThread]
    public static void Main(string[] args) => BuildAvaloniaApp()
        .StartWithClassicDesktopLifetime(args);

    public static AppBuilder BuildAvaloniaApp() => AppBuilder.Configure<App>()
        .UsePlatformDetect()
        .LogToTrace();
}
```

### Implementing the VideoView Component

The `VideoView` control serves as the central rendering element. Integrate it into your `.axaml` files using:

1.
First, declare the VisioForge namespace:

```xml
xmlns:vf="clr-namespace:VisioForge.Core.UI.Avalonia;assembly=VisioForge.Core.UI.Avalonia"
```

2. Then, implement the control in your layout structure:

```xml
<vf:VideoView x:Name="videoView" />
```

This control adapts automatically to the platform-specific rendering pipeline while maintaining a consistent API surface.

## Desktop Platform Integration

### Windows Implementation Guide

Windows deployment requires specific native components packaged as NuGet references.

#### Core Windows Components

Add the following Windows-specific packages to your desktop project:

```xml
```

#### Advanced Media Format Support

For extended codec compatibility, include the size-optimized UPX variant of the libAV libraries:

```xml
```

The UPX variant delivers significant size optimization while maintaining full codec compatibility.

### macOS Integration

For macOS deployment:

#### Native Binding Package

Include the macOS-specific native components:

```xml
```

#### Framework Configuration

Configure your project with the appropriate macOS framework target:

```xml
<PropertyGroup>
  <TargetFramework>net8.0-macos14.0</TargetFramework>
  <OutputType>Exe</OutputType>
</PropertyGroup>
```

### Linux Deployment

Linux support includes:

#### Framework Configuration

Set up the appropriate target framework for Linux environments:

```xml
<PropertyGroup>
  <TargetFramework>net8.0</TargetFramework>
  <OutputType>Exe</OutputType>
</PropertyGroup>
```

#### System Dependencies

For Linux deployment, ensure required system libraries are available on the target system. Unlike Windows and macOS, which use NuGet packages, Linux may require system-level dependencies. Consult the VisioForge Linux documentation for specific platform requirements.

## Mobile Development

### Android Configuration

Android implementation requires additional steps unique to Avalonia's Android integration model:

#### Java Interoperability Layer

The VisioForge Android implementation requires a binding bridge between .NET and Android native APIs:

1. Obtain the Java binding project from the [VisioForge samples repository](https://github.com/visioforge/.Net-SDK-s-samples) in the `AndroidDependency` directory
2.
Add the appropriate binding project to your solution: - Use `VisioForge.Core.Android.X8.csproj` for .NET 8 applications 3. Reference this project in your Android head project: ```xml ``` #### Android-Specific Package Add the Android redistributable package: ```xml ``` #### Runtime Permissions Configure the `AndroidManifest.xml` with appropriate permissions: - `android.permission.CAMERA` - `android.permission.RECORD_AUDIO` - `android.permission.READ_EXTERNAL_STORAGE` - `android.permission.WRITE_EXTERNAL_STORAGE` - `android.permission.INTERNET` ### iOS Development iOS integration with Avalonia requires: #### Native Components Add the iOS-specific redistributable to your iOS head project: ```xml ``` #### Important Implementation Notes - Physical device testing is essential, as simulator support is limited - Update your `Info.plist` with privacy descriptions: - `NSCameraUsageDescription` for camera access - `NSMicrophoneUsageDescription` for audio recording ## Performance Engineering Maximize application performance with these Avalonia-specific optimizations: 1. Enable hardware acceleration when supported by the underlying platform 2. Implement adaptive resolution scaling based on device capabilities 3. Optimize memory usage patterns, especially for mobile targets 4. 
Utilize Avalonia's compositing model effectively by minimizing visual tree complexity around the `VideoView`

## Troubleshooting Guide

### Media Format Problems

- **Playback failures**:
  - Ensure all platform packages are correctly referenced
  - Verify codec availability for the target media format
  - Check for platform-specific format restrictions

### Performance Concerns

- **Slow playback or rendering**:
  - Enable hardware acceleration where available
  - Reduce processing resolution when appropriate
  - Utilize Avalonia's threading model correctly

### Deployment Challenges

- **Platform-specific runtime errors**:
  - Validate target framework specifications
  - Verify native dependency availability
  - Ensure proper provisioning for mobile targets

## Multi-Platform Project Architecture

VisioForge's Avalonia integration excels with a specialized multi-headed project structure. The `SimplePlayerMVVM` sample demonstrates this architecture:

- **Core shared project** (`SimplePlayerMVVM.csproj`): Contains cross-platform views, view models, and shared logic with conditional multi-targeting:

```xml
<PropertyGroup>
  <Nullable>enable</Nullable>
  <LangVersion>latest</LangVersion>
  <ImplicitUsings>true</ImplicitUsings>
  <TargetFrameworks Condition="$([MSBuild]::IsOSPlatform('Windows'))">net8.0-android;net8.0-ios;net8.0-windows</TargetFrameworks>
  <TargetFrameworks Condition="$([MSBuild]::IsOSPlatform('OSX'))">net8.0-android;net8.0-ios;net8.0-macos14.0</TargetFrameworks>
  <TargetFrameworks Condition="$([MSBuild]::IsOSPlatform('Linux'))">net8.0-android;net8.0</TargetFrameworks>
</PropertyGroup>
```

- **Platform-specific head projects**:
  - `SimplePlayerMVVM.Android.csproj`: Contains Android-specific configuration and binding references
  - `SimplePlayerMVVM.iOS.csproj`: Handles iOS initialization and dependencies
  - `SimplePlayerMVVM.Desktop.csproj`: Manages desktop platform detection and appropriate redistributable loading

For simpler desktop-only applications, `SimpleVideoCaptureA.csproj` provides a streamlined model with platform detection occurring within a single project file.

## Conclusion

VisioForge's Avalonia integration offers a sophisticated approach to cross-platform multimedia development that leverages Avalonia's unique architectural advantages.
Through carefully structured platform-specific components and a unified API, developers can build rich media applications that span desktop and mobile platforms without compromising on performance or capabilities. For complete code examples and sample applications, visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples), which contains specialized Avalonia demonstrations in the Video Capture SDK X and Media Player SDK X sections. ---END OF PAGE--- # Local File: .\dotnet\install\index.md --- title: .NET SDKs Installation Guide for Developers description: Complete guide for installing multimedia .NET SDKs in Visual Studio, Rider, and other IDEs. Learn step-by-step installation methods, platform-specific configuration, framework support, and troubleshooting for Windows, macOS, iOS, Android, and Linux environments. sidebar_label: Installation order: 21 --- # VisioForge .NET SDKs Installation Guide [!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) VisioForge offers powerful multimedia SDKs for .NET developers that enable advanced video capture, editing, playback, and media processing capabilities in your applications. This guide covers everything you need to know to properly install and configure our SDKs in your development environment. 
## Available .NET SDKs VisioForge provides several specialized SDKs to address different multimedia needs: - [Video Capture SDK .Net](https://www.visioforge.com/video-capture-sdk-net) - For capturing video from cameras, screen recording, and streaming - [Video Edit SDK .Net](https://www.visioforge.com/video-edit-sdk-net) - For video editing, processing, and format conversion - [Media Blocks SDK .Net](https://www.visioforge.com/media-blocks-sdk-net) - For building custom media processing pipelines - [Media Player SDK .Net](https://www.visioforge.com/media-player-sdk-net) - For creating custom media players with advanced features ## Installation Methods You can install our SDKs using two primary methods: ### Using Setup Files The setup file installation method is recommended for Windows development environments. This approach: 1. Automatically installs all required dependencies 2. Configures Visual Studio integration 3. Includes sample projects to help you get started quickly 4. Provides documentation and additional resources Setup files can be downloaded from the respective SDK product pages on our website. 
### Using NuGet Packages

For cross-platform development or CI/CD pipelines, our NuGet packages offer flexibility and easy integration:

```cmd
Install-Package VisioForge.DotNet.Core
```

Additional UI-specific packages may be required depending on your target platform:

```cmd
Install-Package VisioForge.DotNet.Core.UI.MAUI
Install-Package VisioForge.DotNet.Core.UI.WinUI
Install-Package VisioForge.DotNet.Core.UI.Avalonia
```

## IDE Integration and Setup

Our SDKs integrate with popular .NET development environments:

### Visual Studio Integration

[Visual Studio](visual-studio.md) offers the most complete experience with our SDKs:

- Full IntelliSense support for SDK components
- Built-in debugging for media processing components
- Designer support for visual controls
- NuGet package management

For detailed Visual Studio setup instructions, see our [Visual Studio integration guide](visual-studio.md).

### JetBrains Rider Integration

[Rider](rider.md) provides excellent cross-platform development support:

- Full code completion for SDK APIs
- Smart navigation features for exploring SDK classes
- Integrated NuGet package management
- Cross-platform debugging capabilities

For Rider-specific instructions, visit our [Rider integration documentation](rider.md).

### Visual Studio for Mac

[Visual Studio for Mac](visual-studio-mac.md) users can develop applications for macOS, iOS, and Android:

- Built-in NuGet package manager for installing SDK components
- Project templates for quick setup
- Integrated debugging tools

Learn more in our [Visual Studio for Mac setup guide](visual-studio-mac.md).

## Platform-Specific Configuration

### Target Framework Configuration

Each operating system requires specific target framework settings for optimal compatibility:

#### Windows Applications

Windows applications must use the `-windows` target framework suffix:

```xml
<TargetFramework>net8.0-windows</TargetFramework>
```

This enables access to Windows-specific APIs and UI frameworks like WPF and Windows Forms.
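Putting these pieces together, a minimal SDK-style project file for a Windows WPF application might look like the following sketch (the floating `Version="*"` is illustrative — pin to the latest published version in practice):

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>WinExe</OutputType>
    <!-- Windows-specific suffix enables WPF/WinForms and Windows APIs -->
    <TargetFramework>net8.0-windows</TargetFramework>
    <UseWPF>true</UseWPF>
  </PropertyGroup>

  <ItemGroup>
    <!-- Core SDK package; add a UI package here if you target WinUI, MAUI, or Avalonia -->
    <PackageReference Include="VisioForge.DotNet.Core" Version="*" />
  </ItemGroup>

</Project>
```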
#### Android Development

Android projects require the `-android` framework suffix:

```xml
<TargetFramework>net8.0-android</TargetFramework>
```

Ensure that the Android workload is installed in your development environment:

```
dotnet workload install android
```

#### iOS Development

iOS applications must use the `-ios` target framework:

```xml
<TargetFramework>net8.0-ios</TargetFramework>
```

iOS development requires a Mac with Xcode installed, even when using Visual Studio on Windows.

#### macOS Applications

macOS native applications use either the `-macos` or `-maccatalyst` framework:

```xml
<TargetFramework>net8.0-macos</TargetFramework>
```

For .NET MAUI applications targeting macOS, use:

```xml
<TargetFramework>net8.0-maccatalyst</TargetFramework>
```

#### Linux Development

Linux applications use the standard target framework without a platform suffix:

```xml
<TargetFramework>net8.0</TargetFramework>
```

No additional .NET workload is required for Linux (you can verify your installed workloads with `dotnet workload list`), though some distributions may need extra native system packages — see the deployment guide.

## Special Framework Support

### .NET MAUI Applications

[MAUI projects](maui.md) require special configuration:

- Add the `VisioForge.DotNet.Core.UI.MAUI` NuGet package
- Configure platform-specific permissions in your project
- Use MAUI-specific video view controls

See our [detailed MAUI guide](maui.md) for complete instructions.

### Avalonia UI Framework

[Avalonia projects](avalonia.md) provide a cross-platform UI alternative:

- Install the `VisioForge.DotNet.Core.UI.Avalonia` package
- Use Avalonia-specific video rendering controls
- Configure platform-specific dependencies

Our [Avalonia integration guide](avalonia.md) provides complete setup instructions.

## SDK Initialization for Cross-Platform Engines

Our SDKs include both Windows-specific DirectShow engines (like `VideoCaptureCore`) and cross-platform X-engines (like `VideoCaptureCoreX`). The X-engines require explicit initialization and cleanup.
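As a concrete example, in a WPF desktop application the X-engine initialization and cleanup calls can be tied to the `Application` lifecycle overrides. This is a sketch — `App` is the default WPF application class generated by the project template:

```csharp
using System.Windows;

public partial class App : Application
{
    protected override void OnStartup(StartupEventArgs e)
    {
        base.OnStartup(e);

        // Initialize the cross-platform X-engines once, before any SDK component is created
        VisioForge.Core.VisioForgeX.InitSDK();
    }

    protected override void OnExit(ExitEventArgs e)
    {
        // Release native resources on shutdown to avoid leaks
        VisioForge.Core.VisioForgeX.DestroySDK();

        base.OnExit(e);
    }
}
```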
### Initializing the SDK

Before using any X-engine components, initialize the SDK:

```csharp
// Initialize at application startup
VisioForge.Core.VisioForgeX.InitSDK();

// Or use the async version
await VisioForge.Core.VisioForgeX.InitSDKAsync();
```

### Cleaning Up Resources

When your application exits, properly release resources:

```csharp
// Clean up at application exit
VisioForge.Core.VisioForgeX.DestroySDK();

// Or use the async version
await VisioForge.Core.VisioForgeX.DestroySDKAsync();
```

Failing to initialize or clean up properly may result in memory leaks or unstable behavior.

## Video Rendering Controls

Each UI framework requires specific video view controls to display media content:

### Windows Forms

```csharp
// Add reference to VisioForge.DotNet.Core
using VisioForge.Core.UI.WinForms;

// In your form
var videoView = new VideoView();
this.Controls.Add(videoView);
```

### WPF Applications

```csharp
// Add reference to VisioForge.DotNet.Core
using VisioForge.Core.UI.WPF;

// In your XAML, add the VideoView control from this namespace, e.g.:
// <wpf:VideoView x:Name="videoView" />
```

### MAUI Applications

```csharp
// Add reference to VisioForge.DotNet.Core.UI.MAUI
using VisioForge.Core.UI.MAUI;

// In your XAML, add the VideoView control from this namespace, e.g.:
// <vf:VideoView x:Name="videoView" />
```

### Avalonia UI

```csharp
// Add reference to VisioForge.DotNet.Core.UI.Avalonia
using VisioForge.Core.UI.Avalonia;

// In your XAML, add the VideoView control from this namespace, e.g.:
// <avalonia:VideoView x:Name="videoView" />
```

## Native Dependencies Management

Our SDKs leverage native libraries for optimal performance. These dependencies must be properly managed for deployment:

- Windows: Included automatically with setup installation or NuGet packages
- macOS/iOS: Bundled with NuGet packages but require proper app signing
- Android: Included in NuGet packages with proper architecture support
- Linux: May require additional system packages depending on distribution

For detailed deployment instructions, see our [deployment guide](../deployment-x/index.md).

## Troubleshooting Common Installation Issues

If you encounter issues during installation:

1. Verify target framework compatibility with your project type
2. Ensure all required workloads are installed (`dotnet workload list`)
3. Check for dependency conflicts in your project
4. Confirm proper SDK initialization for X-engines
5. Review platform-specific requirements in our documentation

## Sample Code and Resources

We maintain an extensive collection of sample applications on our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples) to help you get started quickly with our SDKs. These examples cover common scenarios like:

- Video capture from cameras and screens
- Media playback with custom controls
- Video editing and processing
- Cross-platform development

Visit our repository for the latest code examples and best practices for using our SDKs.

---

For additional support or questions, please contact our technical support team or visit our documentation portal.

---END OF PAGE---

# Local File: .\dotnet\install\maui.md

---
title: Integrate Media SDKs with .NET MAUI Applications
description: Learn how to implement powerful video and media capabilities in cross-platform .NET MAUI projects. This guide covers setup, configuration, and optimization across Windows, Android, iOS, and macOS platforms, with platform-specific requirements and best practices for seamless integration.
sidebar_label: MAUI
order: 15
---

# Integrating VisioForge SDKs with .NET MAUI Applications

## Overview

.NET Multi-platform App UI (MAUI) enables developers to build cross-platform applications for mobile and desktop from a single codebase. VisioForge provides comprehensive support for MAUI applications through the `VisioForge.Core.UI.MAUI` package, which contains specialized UI controls designed specifically for the .NET MAUI platform.
Our SDKs enable powerful multimedia capabilities across all MAUI-supported platforms:

[!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net)

## Getting Started

### Installation

To begin using VisioForge with your MAUI project, install the required NuGet packages:

1. The core UI package: `VisioForge.DotNet.Core.UI.MAUI`
2. Platform-specific redistributable (detailed in platform sections below)

### SDK Initialization

Proper initialization is essential for the VisioForge SDKs to function correctly within your MAUI application. This process must be completed in your `MauiProgram.cs` file.

```csharp
using SkiaSharp.Views.Maui.Controls.Hosting;
using VisioForge.Core.UI.MAUI;

public static class MauiProgram
{
    public static MauiApp CreateMauiApp()
    {
        var builder = MauiApp.CreateBuilder();
        builder
            .UseMauiApp<App>()
            // Initialize the SkiaSharp package by adding the below line of code
            .UseSkiaSharp()
            // Initialize the VisioForge MAUI package by adding the below line of code
            .ConfigureMauiHandlers(handlers => handlers.AddVisioForgeHandlers())
            // After initializing the VisioForge MAUI package, optionally add additional fonts
            .ConfigureFonts(fonts =>
            {
                fonts.AddFont("OpenSans-Regular.ttf", "OpenSansRegular");
                fonts.AddFont("OpenSans-Semibold.ttf", "OpenSansSemibold");
            });

        // Continue initializing your .NET MAUI App here
        return builder.Build();
    }
}
```

## Using VisioForge Controls in XAML

The `VideoView` control is the primary interface for displaying video content in your MAUI application. To use VisioForge controls in your XAML files:

1. Add the VisioForge namespace to your XAML file:

```xaml
xmlns:vf="clr-namespace:VisioForge.Core.UI.MAUI;assembly=VisioForge.Core.UI.MAUI"
```

2. Add the VideoView control to your layout, for example:

```xaml
<vf:VideoView x:Name="videoView" />
```

The VideoView control adapts to the native rendering capabilities of each platform while providing a consistent API for your application code.

## Platform-Specific Configuration

### Android Implementation

Android requires additional configuration steps to ensure proper operation:

#### 1. Add Java Bindings Library

The VisioForge SDK relies on native Android functionality that requires a custom Java bindings library:

1. Clone the binding library from our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/AndroidDependency)
2. Add the appropriate project to your solution:
   - Use `VisioForge.Core.Android.X8.csproj` for .NET 8
   - Use `VisioForge.Core.Android.X9.csproj` for .NET 9
3. Add the reference to your project file:

```xml
<!-- Path is illustrative; point it at the binding project you added to the solution -->
<ProjectReference Include="..\AndroidDependency\VisioForge.Core.Android.X8.csproj" />
```

#### 2. Add Android Redistributable Package

Include the Android-specific redistributable package as a `PackageReference` in your project file; see the deployment documentation for the exact package ID and current version.

#### 3. Android Permissions

Ensure your AndroidManifest.xml includes the necessary permissions for camera, microphone, and storage access depending on your application's functionality. Common required permissions include:

- `android.permission.CAMERA`
- `android.permission.RECORD_AUDIO`
- `android.permission.READ_EXTERNAL_STORAGE`
- `android.permission.WRITE_EXTERNAL_STORAGE`

### iOS Configuration

iOS integration requires fewer steps but has some important considerations:

#### 1. Add iOS Redistributable

Add the iOS-specific redistributable package as a `PackageReference` in your project file; see the deployment documentation for the exact package ID and current version.

#### 2. Important Notes for iOS Development

- **Use physical devices**: The SDK requires testing on physical iOS devices rather than simulators for full functionality.
- **Privacy descriptions**: Add the necessary usage description strings in your Info.plist file for camera and microphone access:
  - `NSCameraUsageDescription`
  - `NSMicrophoneUsageDescription`

### macOS Configuration

For macOS Catalyst applications:

#### 1. Configure Runtime Identifiers

To ensure your application works correctly on both Intel and Apple Silicon Macs, specify the appropriate runtime identifiers:

```xml
<RuntimeIdentifiers>maccatalyst-x64;maccatalyst-arm64</RuntimeIdentifiers>
```

#### 2. Enable Trimming

For optimal performance on macOS, enable the PublishTrimmed option:

```xml
<PublishTrimmed>true</PublishTrimmed>
```

For more detailed information about macOS deployment, refer to our [macOS](../deployment-x/macOS.md) documentation page.

### Windows Configuration

For Windows applications, you need to include several redistributable packages:

#### 1. Add Base Windows Redistributables

Include the core Windows redistributable packages as `PackageReference` entries in your project file; see the deployment documentation for the exact package IDs.

#### 2. Add Extended Codec Support (Optional but Recommended)

For enhanced media format support, include the libAV (FFMPEG) redistributable package; see the deployment documentation for the exact package ID.

### Performance Optimization

For optimal performance across platforms:

1. Use hardware acceleration when available
2. Adjust video resolution based on the target device capabilities
3. Consider memory constraints on mobile devices when processing large media files

## Troubleshooting Common Issues

- **Blank video display**: Ensure proper permissions are granted on mobile platforms
- **Missing codecs**: Verify all platform-specific redistributable packages are correctly installed
- **Performance issues**: Check that hardware acceleration is enabled when available
- **Deployment errors**: Confirm runtime identifiers are correctly specified for the target platform

## Conclusion

The VisioForge SDK provides a comprehensive solution for adding powerful multimedia capabilities to your .NET MAUI applications.
By following the platform-specific setup instructions and best practices outlined in this guide, you can create rich cross-platform applications with advanced video and audio features.

For additional examples and sample code, visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples).

---END OF PAGE---

# Local File: .\dotnet\install\rider.md

---
title: Integrate .Net SDKs into JetBrains Rider | Tutorial
description: Learn how to integrate .Net SDKs with JetBrains Rider in this step-by-step tutorial. From project setup to adding NuGet packages, UI components, and platform dependencies - master cross-platform development with WPF, MAUI, WinUI, and Avalonia integration for Windows, macOS, iOS and Android apps.
sidebar_label: JetBrains Rider
order: 12
---

# .Net SDKs Integration with JetBrains Rider

## Introduction

[!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net)

This comprehensive guide walks you through the process of installing and configuring VisioForge .Net SDKs within JetBrains Rider, a powerful cross-platform IDE for .NET development. While we'll use a Windows application with WPF as our primary example, these installation steps can be readily adapted for macOS, iOS, or Android applications as well.

JetBrains Rider provides a consistent development experience across Windows, macOS, and Linux platforms, making it an excellent choice for cross-platform .NET development.
## Creating Your Project

### Setting Up a Modern Project Structure

Begin by launching JetBrains Rider and creating a new project. For this tutorial, we'll use WPF (Windows Presentation Foundation) as our framework. It's crucial to utilize the modern project format, which provides enhanced compatibility with VisioForge SDKs and offers a more streamlined development experience.

1. Open JetBrains Rider
2. Select "Create New Solution" from the welcome screen
3. Choose "WPF Application" from the available templates
4. Configure your project settings, ensuring you select the modern project format
5. Click "Create" to generate your project structure

![Project creation screen in Rider](rider1.png)

## Adding Required NuGet Packages

### Installing the Main SDK Package

Each VisioForge SDK has a corresponding main package that provides core functionality. You'll need to select the appropriate package based on which SDK you're working with.

1. Right-click on your project in the Solution Explorer
2. Select the "Manage NuGet Packages" menu item
3. In the NuGet Package Manager, search for the VisioForge package that corresponds to your desired SDK
4. Select the latest stable version and click "Install"

![Adding the main SDK package through NuGet](rider2.png)

### Available Main SDK Packages

Choose from the following main packages based on your development needs:

- [VisioForge.DotNet.VideoCapture](https://www.nuget.org/packages/VisioForge.DotNet.VideoCapture) - For applications requiring video capture functionality
- [VisioForge.DotNet.VideoEdit](https://www.nuget.org/packages/VisioForge.DotNet.VideoEdit) - For video editing and processing applications
- [VisioForge.DotNet.MediaPlayer](https://www.nuget.org/packages/VisioForge.DotNet.MediaPlayer) - For media playback applications
- [VisioForge.DotNet.MediaBlocks](https://www.nuget.org/packages/VisioForge.DotNet.MediaBlocks) - For applications requiring modular media processing capabilities

### Adding the UI Package (If Needed)

The main SDK package already contains the core UI components for WinForms, WPF, Android, and Apple platforms. For other platforms, install the UI package that corresponds to your chosen UI framework.

### Available UI Packages

Depending on your target platform and UI framework, choose from these UI packages:

- The core package covers WinForms, WPF, and Apple platforms — no extra UI package is needed
- [VisioForge.DotNet.Core.UI.WinUI](https://www.nuget.org/packages/VisioForge.DotNet.Core.UI.WinUI) - For Windows applications using the modern WinUI framework
- [VisioForge.DotNet.Core.UI.MAUI](https://www.nuget.org/packages/VisioForge.DotNet.Core.UI.MAUI) - For cross-platform applications using .NET MAUI
- [VisioForge.DotNet.Core.UI.Avalonia](https://www.nuget.org/packages/VisioForge.DotNet.Core.UI.Avalonia) - For cross-platform applications using Avalonia UI

## Integrating VideoView Control (Optional)

### Adding Video Preview Capabilities

If your application requires video preview functionality, you'll need to add the VideoView control to your user interface. This can be accomplished either through XAML markup or programmatically in your code-behind file.
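For the programmatic route, the control can be created and added to a layout container in code-behind. This is a sketch under assumptions — `mainGrid` is a hypothetical `Grid` defined in the window's XAML, and the `VideoView` constructor is assumed to be parameterless:

```csharp
using System.Windows;
using VisioForge.Core.UI.WPF;

public partial class MainWindow : Window
{
    private VideoView _videoView;

    public MainWindow()
    {
        InitializeComponent();

        // Create the video view and add it to an existing container
        // ("mainGrid" is an assumed Grid declared in this window's XAML)
        _videoView = new VideoView();
        mainGrid.Children.Add(_videoView);
    }
}
```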
Below, we'll demonstrate how to add it via XAML.

#### Step 1: Add the WPF Namespace

First, add the necessary namespace reference to your XAML file:

```xml
xmlns:wpf="clr-namespace:VisioForge.Core.UI.WPF;assembly=VisioForge.Core"
```

#### Step 2: Add the VideoView Control

Then, add the VideoView control to your layout, for example:

```xml
<wpf:VideoView x:Name="videoView" />
```

This control provides a canvas where video content can be displayed in real-time, essential for applications that involve video capture, editing, or playback.

## Adding Required Redistribution Packages

### Platform-Specific Dependencies

Depending on your target platform, chosen product, and the specific engine you're utilizing, additional redistribution packages may be needed to ensure proper functionality across all deployment environments.

For comprehensive information about which redistribution packages are required for your specific scenario, please consult the Deployment documentation page for your selected VisioForge product. These resources provide detailed guidance on:

- Required system dependencies
- Platform-specific considerations
- Deployment optimization strategies
- Runtime requirements

Following these deployment guidelines will ensure your application functions correctly on end-user systems without missing dependencies or runtime errors.

## Additional Resources

For more examples and detailed implementation guides, visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples), which contains numerous code samples demonstrating various features and integration scenarios.

Our documentation portal also offers comprehensive API references, detailed tutorials, and best practice guides to help you make the most of VisioForge SDKs in your JetBrains Rider projects.

## Conclusion

By following this installation guide, you've successfully integrated VisioForge .Net SDKs with JetBrains Rider, setting the foundation for developing powerful media applications.
The combination of VisioForge's robust media processing capabilities and JetBrains Rider's intelligent development environment provides an ideal platform for creating sophisticated media applications across multiple platforms.

---END OF PAGE---

# Local File: .\dotnet\install\visual-studio-mac.md

---
title: Integrate .NET SDKs with Visual Studio for Mac
description: Learn how to install, configure, and implement .NET SDKs in Visual Studio for Mac for macOS and iOS development. This step-by-step guide covers environment setup, package installation, UI component configuration, and troubleshooting to help you build powerful multimedia applications for Apple platforms.
sidebar_label: Visual Studio for Mac
order: 13
---

# Complete Guide to Integrating VisioForge .NET SDKs with Visual Studio for Mac

[!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net)

## Introduction to VisioForge SDKs on macOS

VisioForge provides powerful multimedia SDKs for .NET developers working on macOS and iOS platforms. This detailed guide will walk you through the entire process of integrating these SDKs into your Visual Studio for Mac projects. While this tutorial primarily focuses on macOS application development, the same principles apply to iOS applications with minimal adaptations.

By following this guide, you'll learn how to properly set up your development environment, install the necessary packages, configure UI components, and prepare your application for deployment.
This knowledge will serve as a solid foundation for building sophisticated multimedia applications using VisioForge technology.

## Prerequisites for Development

Before starting the integration process, ensure you have:

- Visual Studio for Mac (latest version recommended)
- .NET SDK installed (minimum version 6.0)
- Basic knowledge of C# and .NET development
- Administrative access to your macOS system
- Active internet connection for NuGet package downloads
- Optional: Xcode for storyboard editing

Having these prerequisites in place will ensure a smooth installation process and prevent common setup issues.

## Setting Up a New macOS Project

Let's begin by creating a new macOS project in Visual Studio for Mac. This will serve as the foundation for our VisioForge SDK integration.

### Creating the Project Structure

1. Launch Visual Studio for Mac.
2. Select **File > New Solution** from the menu bar.
3. In the template selection dialog, navigate to **.NET > App**.
4. Choose **macOS Application** as your project template.
5. Configure your project settings, including:
   - Project name (choose something descriptive)
   - Organization identifier (typically in reverse domain format)
   - Target framework (.NET 6.0 or later recommended)
   - Solution name (can match your project name)
6. Click **Create** to generate your project template.

This creates a basic macOS application with the standard project structure required for VisioForge SDK integration.

![Creating a new macOS project in Visual Studio for Mac](vsmac1.png)

## Installing VisioForge SDK Packages

After creating your project, the next step is to install the necessary VisioForge SDK packages via NuGet. These packages contain the core functionality and UI components required for multimedia operations.

### Adding the Main SDK Package

Each VisioForge product line has a dedicated main package that contains the core functionality. You'll need to choose the appropriate package based on your development requirements.

1.
Right-click on your project in the Solution Explorer.
2. Select **Manage NuGet Packages** from the context menu.
3. Click on the **Browse** tab in the NuGet Package Manager.
4. In the search box, type "VisioForge" to find all available packages.
5. Select one of the following packages based on your requirements:

   Available NuGet packages:

   - [VisioForge.DotNet.VideoCapture](https://www.nuget.org/packages/VisioForge.DotNet.VideoCapture) - For video capture, webcam, and screen recording functionality
   - [VisioForge.DotNet.VideoEdit](https://www.nuget.org/packages/VisioForge.DotNet.VideoEdit) - For video editing, processing, and conversion
   - [VisioForge.DotNet.MediaPlayer](https://www.nuget.org/packages/VisioForge.DotNet.MediaPlayer) - For media playback and streaming
   - [VisioForge.DotNet.MediaBlocks](https://www.nuget.org/packages/VisioForge.DotNet.MediaBlocks) - For advanced media processing workflows

6. Click **Add Package** to install your selected package.
7. Accept any license agreements that appear.

The installation process will automatically resolve dependencies and add references to your project.

![Installing the main SDK package via NuGet](vsmac2.png)

### Adding the Apple UI Package

For macOS and iOS applications, you'll need the Apple-specific UI components that allow VisioForge SDKs to integrate with native UI elements.

1. In the NuGet Package Manager, search for "VisioForge.DotNet.UI.Apple".
2. Select the package from the results list.
3. Click **Add Package** to install.

This package includes specialized controls designed specifically for Apple platforms, ensuring proper visual integration and performance optimization.

![Installing the Apple UI package via NuGet](vsmac3.png)

## Integrating Video Preview Capabilities

Most multimedia applications require video preview functionality. VisioForge SDKs provide specialized controls for this purpose that integrate seamlessly with macOS applications.
### Adding the VideoView Control

The VideoView control is the primary component for displaying video content in your application. Here's how to add it to your interface:

1. Open your application's main storyboard file by double-clicking it in the Solution Explorer.
2. Visual Studio for Mac will open Xcode Interface Builder for storyboard editing.
3. From the Object Library, find the **Custom View** control.
4. Drag the Custom View control onto your window where you want the video to appear.
5. Set appropriate constraints to ensure proper sizing and positioning.
6. Using the Identity Inspector, set a descriptive name for your Custom View (e.g., "videoViewHost").
7. Save your changes and return to Visual Studio for Mac.

This Custom View will serve as a container for the VisioForge VideoView control, which will be added programmatically.

![Adding a Custom View in Xcode Interface Builder](vsmac4.png)

![Setting properties for the Custom View](vsmac5.png)

### Initializing the VideoView in Code

After adding the container Custom View, you need to initialize the VideoView control programmatically:

1. Open your ViewController.cs file.
2. Add the necessary using directives at the top of the file:

```csharp
using AppKit;
using CoreGraphics;
using VisioForge.Core.UI.Apple;
```

3. Add a private field to your ViewController class to hold the VideoView reference:

```csharp
private VideoViewGL _videoView;
```

4.
Modify the ViewDidLoad method to initialize and add the VideoView:

```csharp
public override void ViewDidLoad()
{
    base.ViewDidLoad();

    // Create and add VideoView
    _videoView = new VideoViewGL(new CGRect(0, 0, videoViewHost.Bounds.Width, videoViewHost.Bounds.Height));
    this.videoViewHost.AddSubview(_videoView);

    // Configure VideoView properties
    _videoView.AutoresizingMask = AppKit.NSViewResizingMask.WidthSizable | AppKit.NSViewResizingMask.HeightSizable;
    _videoView.BackgroundColor = AppKit.NSColor.Black;

    // Additional initialization code
    InitializeMediaComponents();
}

private void InitializeMediaComponents()
{
    // Initialize your VisioForge SDK components here
    // For example, for MediaPlayer:
    // var player = new MediaPlayer();
    // player.VideoView = _videoView;
    // Additional configuration...
}
```

This code creates a new VideoViewGL instance (optimized for hardware acceleration), sizes it to match your container view, and adds it as a subview. The AutoresizingMask property ensures that the video view resizes properly when the window size changes.

## Adding Required Redistribution Packages

VisioForge SDKs rely on various native libraries and components that must be included in your application bundle. These dependencies vary based on the specific SDK you're using and your target platform.

Check the [deployment documentation](../deployment-x/index.md) for detailed information on which redistribution packages are required for your specific scenario.

## Troubleshooting Common Issues

If you encounter issues during installation or integration, consider these common solutions:

1. **Missing dependencies**: Ensure all required redistribution packages are installed
2. **Build errors**: Verify that your project targets a compatible .NET version
3. **Runtime crashes**: Check for platform-specific initialization issues
4. **Black video display**: Verify that the VideoView is properly initialized and added to the view hierarchy
5.
**Performance issues**: Consider enabling hardware acceleration where available

For more specific troubleshooting guidance, refer to the VisioForge documentation or contact their support team.

## Next Steps and Resources

Now that you've successfully integrated VisioForge SDKs into your Visual Studio for Mac project, you can explore more advanced features and capabilities:

- Create custom video processing workflows
- Implement recording and capture functionality
- Develop sophisticated media editing features
- Build streaming media applications

### Additional Resources

- Visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples) for code samples and example projects
- Join the [developer forum](https://support.visioforge.com/) to connect with other developers
- Subscribe to our newsletter for updates on new features and best practices

By following this guide, you've established a solid foundation for developing powerful multimedia applications on macOS and iOS using VisioForge SDKs and Visual Studio for Mac.

---END OF PAGE---

# Local File: .\dotnet\install\visual-studio.md

---
title: Integrating .NET SDKs with Visual Studio
description: Learn how to properly install and configure multimedia .NET SDKs in Microsoft Visual Studio with this detailed step-by-step guide. Covers NuGet package installation, manual setup methods, UI framework integration, and best practices for professional video capture and editing applications.
sidebar_label: Visual Studio
order: 14
---

# Comprehensive Guide to Integrating .NET SDKs with Visual Studio

[!badge size="xl" target="blank" variant="info" text="Video Capture SDK .Net"](https://www.visioforge.com/video-capture-sdk-net) [!badge size="xl" target="blank" variant="info" text="Video Edit SDK .Net"](https://www.visioforge.com/video-edit-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net)

## Introduction to VisioForge .NET SDKs

VisioForge offers a powerful suite of multimedia SDKs for .NET developers, enabling you to build feature-rich applications with advanced video capture, editing, playback, and media processing capabilities. This comprehensive guide will walk you through the process of integrating these SDKs into your Visual Studio projects, ensuring a smooth development experience.

For professional developers working on multimedia applications, properly integrating these SDKs is crucial for optimal performance and functionality. Our recommended approach is to use NuGet packages, which simplifies dependency management and ensures you're always using the latest features and bug fixes.

## Installation Methods Overview

There are two primary methods to install VisioForge .NET SDKs:

1. **NuGet Package Installation** (Recommended): The modern, streamlined approach that handles dependencies automatically and simplifies updates.
2. **Manual Installation**: A traditional approach for specialized scenarios, though generally not recommended for most projects.

We'll cover both methods in detail, but strongly encourage the NuGet approach for most development scenarios.
## NuGet Package Installation (Recommended Method) NuGet is the package manager for .NET, providing a centralized way to incorporate libraries into your projects without the hassle of manual file management. Here's a detailed walkthrough of integrating VisioForge SDKs using NuGet. ### Step 1: Create or Open Your .NET Project First, you'll need a WinForms, WPF, or other .NET project. We recommend using the modern SDK-style project format for optimal compatibility. #### Creating a New Project 1. Launch Visual Studio (2019 or 2022 recommended) 2. Select "Create a new project" 3. Filter templates by "C#" and either "WPF" or "Windows Forms" 4. Choose "WPF Application" or "Windows Forms Application" with the .NET Core/5/6+ framework 5. Ensure you select the modern SDK-style project format (this is the default in newer Visual Studio versions) ![Creating a new WPF project with modern SDK project format](vs1.png) #### Configuring the Project After creating a new project, you'll need to configure basic settings: 1. Enter your project name (use a descriptive name relevant to your application) 2. Choose an appropriate location and solution name 3. Select your target framework (.NET 6 or newer recommended for best performance and features) 4. Click "Create" to generate the project structure ![Selecting project name and configuration options](vs2.png) ### Step 2: Access NuGet Package Manager Once your project is open in Visual Studio: 1. Right-click on your project in Solution Explorer 2. Select "Manage NuGet Packages..." from the context menu 3. The NuGet Package Manager will open in the center pane This interface provides search functionality and package browsing to easily find and install the VisioForge components you need. ![Accessing the NuGet Package Manager](vs3.png) ### Step 3: Install the UI Package for Your Framework VisioForge SDKs offer specialized UI components for different .NET frameworks. 
You'll need to select the appropriate UI package based on your project type. 1. In the NuGet Package Manager, switch to the "Browse" tab 2. Search for "VisioForge.DotNet.Core.UI" 3. Select the appropriate UI package for your project type from the search results ![Adding the WPF UI package through NuGet](vs4.png) #### Available UI Packages VisioForge supports a wide range of UI frameworks. Choose the one that matches your project: - **[VisioForge.DotNet.Core.UI.WinUI](https://www.nuget.org/packages/VisioForge.DotNet.Core.UI.WinUI)**: For modern Windows UI applications - **[VisioForge.DotNet.Core.UI.MAUI](https://www.nuget.org/packages/VisioForge.DotNet.Core.UI.MAUI)**: For cross-platform applications using .NET MAUI - **[VisioForge.DotNet.Core.UI.Avalonia](https://www.nuget.org/packages/VisioForge.DotNet.Core.UI.Avalonia)**: For cross-platform desktop applications using Avalonia UI These UI packages provide the necessary controls and components specifically designed for video rendering and interaction within your chosen framework. ### Step 4: Install the Core SDK Package After installing the UI package, you'll need to add the main SDK package for your specific multimedia needs: 1. Return to the NuGet Package Manager "Browse" tab 2. Search for the specific VisioForge SDK you need (e.g., "VisioForge.DotNet.VideoCapture") 3. 
Click "Install" on the appropriate package ![Installing the main SDK package](vs5.png) #### Available Core SDK Packages Choose the SDK that aligns with your application's requirements: - **[VisioForge.DotNet.VideoCapture](https://www.nuget.org/packages/VisioForge.DotNet.VideoCapture)**: For applications that need to capture video from cameras, screen recording, or other sources - **[VisioForge.DotNet.VideoEdit](https://www.nuget.org/packages/VisioForge.DotNet.VideoEdit)**: For video editing, processing, and conversion applications - **[VisioForge.DotNet.MediaPlayer](https://www.nuget.org/packages/VisioForge.DotNet.MediaPlayer)**: For creating media players with advanced playback controls - **[VisioForge.DotNet.MediaBlocks](https://www.nuget.org/packages/VisioForge.DotNet.MediaBlocks)**: For building complex media processing pipelines Each package includes comprehensive documentation, and you can install multiple packages if your application requires different multimedia capabilities. ### Step 5: Implementing the VideoView Control (Optional) The VideoView control is crucial for applications that need to display video content. You can add it to your UI using XAML (for WPF) or through the designer (for WinForms). #### For WPF Applications Add the required namespace to your XAML file: ```xml xmlns:wpf="clr-namespace:VisioForge.Core.UI.WPF;assembly=VisioForge.Core" ``` Then add the VideoView control to your layout: ```xml <!-- Adjust the name and size to suit your layout --> <wpf:VideoView x:Name="VideoView1" Width="640" Height="480" /> ``` ![XAML code for adding VideoView](vs6.png) The VideoView control will appear in your designer: ![VideoView control in the application window](vs7.png) #### For WinForms Applications 1. Open the form in designer mode 2. Locate the VisioForge controls in the toolbox (if they don't appear, right-click the toolbox and select "Choose Items...") 3. Drag and drop the VideoView control onto your form 4. 
Adjust the size and position properties as needed. ### Step 6: Install Required Redistribution Packages Depending on your specific implementation, you may need additional redistribution packages: 1. Return to the NuGet Package Manager 2. Search for "VisioForge.DotNet.Redist" to see available redistribution packages 3. Install the ones relevant to your platform and SDK choice ![Installing redistribution packages](vs8.png) The required redistribution packages vary based on: - Target operating system (Windows, macOS, Linux) - Hardware acceleration requirements - Specific codecs and formats your application will use - Backend engine configuration Consult the specific Deployment documentation page for your selected product to determine which redistribution packages are necessary for your application. ## Manual Installation (Alternative Method) While we generally don't recommend manual installation due to its complexity and potential for configuration issues, there are specific scenarios where it might be necessary. Follow these steps if NuGet isn't an option for your project: 1. Download the [complete SDK installer](https://files.visioforge.com/trials/visioforge_sdks_installer_dotnet_setup.exe) from our website 2. Run the installer with administrator privileges and follow the on-screen instructions 3. Create your WinForms or WPF project in Visual Studio 4. Add references to the installed SDK libraries: - Right-click "References" in Solution Explorer - Select "Add Reference" - Navigate to the installed SDK location - Select the required DLL files 5. Configure the Visual Studio Toolbox: - Right-click the Toolbox and select "Add Tab" - Name the new tab "VisioForge" - Right-click the tab and select "Choose Items..." - Browse to the SDK installation directory - Select `VisioForge.Core.dll` 6. Drag and drop the VideoView control onto your form or window This manual approach requires additional configuration for deployment, and updates must be managed manually. 
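With either installation method, it can help to know what the project file ends up containing. The sketch below shows a minimal SDK-style `.csproj`: the active `ItemGroup` reflects the NuGet route, while the commented-out `Reference` entry reflects the manual route. The version number and install path are placeholders, not exact values:

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>WinExe</OutputType>
    <TargetFramework>net6.0-windows</TargetFramework>
    <UseWPF>true</UseWPF>
  </PropertyGroup>

  <ItemGroup>
    <!-- NuGet route: the version is a placeholder; use the latest from nuget.org -->
    <PackageReference Include="VisioForge.DotNet.MediaPlayer" Version="x.y.z" />
  </ItemGroup>

  <!-- Manual route: reference the installed DLL directly instead
  <ItemGroup>
    <Reference Include="VisioForge.Core">
      <HintPath>...\VisioForge.Core.dll</HintPath>
    </Reference>
  </ItemGroup>
  -->
</Project>
```

The NuGet route keeps native dependencies and updates managed for you, which is why the `PackageReference` form is preferred for new projects.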
## Advanced Configuration and Best Practices For production applications, consider these additional implementation details: - **License Management**: Implement proper license validation at application startup - **Error Handling**: Add comprehensive error handling around SDK initialization and operation - **Performance Optimization**: Configure hardware acceleration and threading based on your target devices - **Resource Management**: Implement proper disposal of SDK resources to prevent memory leaks ## Troubleshooting Common Issues If you encounter problems during installation or implementation: - Verify your project targets a supported .NET version - Ensure all required redistributable packages are installed - Check for NuGet package version compatibility - Review the SDK documentation for platform-specific requirements ## Conclusion and Next Steps With the VisioForge .NET SDKs properly installed in your Visual Studio project, you're now ready to leverage their powerful multimedia capabilities. The NuGet installation method ensures you have the correct dependencies and simplifies future updates. To deepen your understanding and maximize the potential of these SDKs: - Explore our [comprehensive code samples on GitHub](https://github.com/visioforge/.Net-SDK-s-samples) - Review the product-specific documentation for advanced features - Join our developer community forums for support and best practices By following this guide, you've established a solid foundation for developing sophisticated multimedia applications with VisioForge and Visual Studio. ---END OF PAGE--- # Local File: .\dotnet\mediablocks\index.md --- title: Media Blocks SDK for .NET Integration Guide description: Learn how to leverage Media Blocks SDK for .NET to build powerful multimedia applications. Discover how to play, edit, and capture video content with our modular SDK designed for developers. Explore our extensive guide to video encoding, processing, and rendering features. 
sidebar_label: Media Blocks SDK .Net order: 14 --- # Media Blocks SDK for .NET Development Platform [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## What is Media Blocks SDK? Media Blocks SDK for .NET empowers developers to engineer sophisticated multimedia applications with precision and flexibility. This powerful toolkit provides everything needed to implement professional-grade video playback, non-linear editing systems, and multi-source camera capture solutions. The modular architecture allows developers to select and combine only the specific components required for each project, optimizing both performance and resource usage in your applications. ## Why Choose Media Blocks for Your Project? Our component-based approach gives you granular control over your media pipeline. Each specialized block handles a distinct function within the multimedia processing chain: - High-performance H264/H265 video encoding - Professional-grade logo and watermark insertion - Multi-stream mixing and composition - Hardware-accelerated video rendering - Cross-platform compatibility This modular design enables you to construct precisely the multimedia processing workflow your application requires, without unnecessary overhead. 
[Get Started with Media Blocks SDK](GettingStarted/index.md) ## Core SDK Components and Capabilities ### Audio Processing Components - [Audio Encoders](AudioEncoders/index.md) - Convert raw audio streams to AAC, MP3, and other compressed formats with customizable quality settings - [Audio Processing](AudioProcessing/index.md) - Apply dynamic filters, enhance sound quality, and manipulate audio characteristics in real-time - [Audio Rendering](AudioRendering/index.md) - Output processed audio to physical devices with precise timing and synchronization ### Video Processing Components - [Video Encoders](VideoEncoders/index.md) - Generate optimized video streams with support for multiple codecs and container formats - [Video Processing](VideoProcessing/index.md) - Transform, filter and enhance video content with effects, color correction, and image adjustments - [Video Rendering](VideoRendering/index.md) - Display video content across different output technologies with hardware acceleration - [Live Video Compositor](LiveVideoCompositor/index.md) - Combine multiple video sources in real-time with transitions and effects ### Input/Output System Components - [Bridges](Bridge/index.md) - Connect and synchronize different component types within your processing pipeline - [Decklink](Decklink/index.md) - Integrate with professional Blackmagic Design video capture and playback hardware - [Sinks](Sinks/index.md) - Direct processed media to files, streams, network destinations, and other output targets - [Sources](Sources/index.md) - Ingest media from cameras, files, network streams, and other input devices - [Special](Special/index.md) - Implement specialized functionality with our extended component collection ## Essential Developer Resources - [Deployment Guide](../deployment-x/index.md) - [Changelog](../changelog.md) - [End User License Agreement](../../eula.md) - [API Documentation](https://api.visioforge.com/dotnet/api/index.html) ## Technical Support and Community Our 
dedicated development team provides responsive support to ensure your success with Media Blocks SDK. Join our active developer community to exchange implementation strategies, optimization techniques, and custom solutions. ---END OF PAGE--- # Local File: .\dotnet\mediablocks\AudioEncoders\index.md --- title: Audio Encoders for .NET Media Processing description: Comprehensive guide to audio compression formats including AAC, MP3, FLAC, and more with VisioForge Media Blocks SDK for .NET. Learn implementation with code examples. sidebar_label: Audio Encoders order: 19 --- # Audio encoders blocks [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) Audio encoding is the process of converting raw audio data into a compressed format. This process is essential for reducing the size of audio files, making them easier to store and stream over the internet. VisioForge Media Blocks SDK provides a wide range of audio encoders that support various formats and codecs. ## Availability checks Before using any encoder, you should check if it's available on the current platform. Each encoder block provides a static `IsAvailable()` method for this purpose: ```csharp // For most encoders if (EncoderBlock.IsAvailable()) { // Use the encoder } // For AAC encoder which requires passing settings if (AACEncoderBlock.IsAvailable(settings)) { // Use the AAC encoder } ``` This check is important because not all encoders are available on all platforms. Always perform this check before attempting to use an encoder to avoid runtime errors. ## AAC encoder `AAC (Advanced Audio Coding)`: A lossy compression format known for its efficiency and superior sound quality compared to MP3, widely used in digital music and broadcasting. AAC encoder is used for encoding files in MP4, MKV, M4A and some other formats, as well as for network streaming using RTSP and HLS. Use the `AACEncoderSettings` class to set the parameters. 
### Block info Name: AACEncoderBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | PCM/IEEE | 1 Output | AAC | 1 ### Constructor options ```csharp // Constructor with custom settings public AACEncoderBlock(IAACEncoderSettings settings) // Constructor without parameters (uses default settings) public AACEncoderBlock() // Uses GetDefaultSettings() internally ``` ### Settings The `AACEncoderBlock` works with any implementation of the `IAACEncoderSettings` interface. Different implementations are available depending on the platform: - `AVENCAACEncoderSettings` - Available on Windows and macOS/Linux (preferred when available) - `MFAACEncoderSettings` - Windows Media Foundation implementation (Windows only) - `VOAACEncoderSettings` - Used on Android and iOS You can use the static `GetDefaultSettings()` method to get the optimal encoder settings for the current platform: ```csharp var settings = AACEncoderBlock.GetDefaultSettings(); ``` ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->AACEncoderBlock; AACEncoderBlock-->MP4SinkBlock; ``` ### Sample code ```cs var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var aacEncoderBlock = new AACEncoderBlock(new MFAACEncoderSettings() { Bitrate = 192 }); pipeline.Connect(fileSource.AudioOutput, aacEncoderBlock.Input); var m4aSinkBlock = new MP4SinkBlock(new MP4SinkSettings(@"output.m4a")); pipeline.Connect(aacEncoderBlock.Output, m4aSinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio)); await pipeline.StartAsync(); ``` ## ADPCM encoder `ADPCM (Adaptive Differential Pulse Code Modulation)`: A type of audio compression that reduces the bit rate required for audio storage and transmission while maintaining audio quality through adaptive prediction. ADPCM encoder is used for embedding audio streams in DV, WAV and AVI formats. 
Configure the encoder through the `blockAlign` constructor parameter. ### Block info Name: ADPCMEncoderBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | PCM/IEEE | 1 Output | ADPCM | 1 ### Constructor options ```csharp // Constructor with block align parameter public ADPCMEncoderBlock(int blockAlign = 1024) ``` The `blockAlign` parameter defines the block alignment in bytes. The default value is 1024. ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->ADPCMEncoderBlock; ADPCMEncoderBlock-->WAVSinkBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var adpcmEncoderBlock = new ADPCMEncoderBlock(blockAlign: 1024); pipeline.Connect(fileSource.AudioOutput, adpcmEncoderBlock.Input); var wavSinkBlock = new WAVSinkBlock(@"output.wav"); pipeline.Connect(adpcmEncoderBlock.Output, wavSinkBlock.Input); await pipeline.StartAsync(); ``` ## ALAW encoder `ALAW (A-law algorithm)`: A standard companding algorithm used in digital communications systems to optimize the dynamic range of an analog signal for digitizing. ALAW encoder is used for embedding audio streams in WAV format or transmitting over IP. The block uses a parameterless constructor and requires no additional settings. ### Block info Name: ALAWEncoderBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | PCM/IEEE | 1 Output | ALAW | 1 ### Constructor options ```csharp // Default constructor public ALAWEncoderBlock() ``` ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->ALAWEncoderBlock; ALAWEncoderBlock-->WAVSinkBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var alawEncoderBlock = new ALAWEncoderBlock(); pipeline.Connect(fileSource.AudioOutput, alawEncoderBlock.Input); var wavSinkBlock = new WAVSinkBlock(@"output.wav"); pipeline.Connect(alawEncoderBlock.Output, wavSinkBlock.Input); await pipeline.StartAsync(); ``` ## FLAC encoder `FLAC (Free Lossless Audio Codec)`: A lossless audio compression format that preserves audio quality while significantly reducing file size compared to uncompressed formats like WAV. FLAC encoder is used for encoding audio in FLAC format. Use the `FLACEncoderSettings` class to set the parameters. ### Block info Name: FLACEncoderBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | PCM/IEEE | 1 Output | FLAC | 1 ### Constructor options ```csharp // Constructor with settings public FLACEncoderBlock(FLACEncoderSettings settings) ``` ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->FLACEncoderBlock; FLACEncoderBlock-->FileSinkBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var flacEncoderBlock = new FLACEncoderBlock(new FLACEncoderSettings()); pipeline.Connect(fileSource.AudioOutput, flacEncoderBlock.Input); var fileSinkBlock = new FileSinkBlock(@"output.flac"); pipeline.Connect(flacEncoderBlock.Output, fileSinkBlock.Input); await pipeline.StartAsync(); ``` ## MP2 encoder `MP2 (MPEG-1 Audio Layer II)`: An older audio compression format that preceded MP3, still used in some broadcasting applications due to its efficiency at specific bitrates. MP2 encoder is used for transmitting over IP or embedding to AVI/MPEG-2 formats. Use the `MP2EncoderSettings` class to set the parameters. ### Block info Name: MP2EncoderBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | PCM/IEEE | 1 Output | audio/mpeg | 1 ### Constructor options ```csharp // Constructor with settings public MP2EncoderBlock(MP2EncoderSettings settings) ``` The `MP2EncoderSettings` class allows you to configure parameters such as: - Bitrate (default: 192 kbps) ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->MP2EncoderBlock; MP2EncoderBlock-->FileSinkBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var mp2EncoderBlock = new MP2EncoderBlock(new MP2EncoderSettings() { Bitrate = 192 }); pipeline.Connect(fileSource.AudioOutput, mp2EncoderBlock.Input); var fileSinkBlock = new FileSinkBlock(@"output.mp2"); pipeline.Connect(mp2EncoderBlock.Output, fileSinkBlock.Input); await pipeline.StartAsync(); ``` ## MP3 encoder `MP3 (MPEG Audio Layer III)`: A popular lossy audio format that revolutionized digital music distribution by compressing files while retaining a reasonable sound quality. An MP3 encoder can convert audio streams into MP3 files or embed MP3 audio streams in formats like AVI, MKV, and others. Use the `MP3EncoderSettings` class to set the parameters. ### Block info Name: MP3EncoderBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | PCM/IEEE | 1 Output | audio/mpeg | 1 ### Constructor options ```csharp // Constructor with settings and optional parser flag public MP3EncoderBlock(MP3EncoderSettings settings, bool addParser = false) ``` The `addParser` parameter is used to add a parser to the output stream, which is required for certain streaming applications like RTMP (YouTube/Facebook) streaming. 
### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->MP3EncoderBlock; MP3EncoderBlock-->FileSinkBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var mp3EncoderBlock = new MP3EncoderBlock(new MP3EncoderSettings() { Bitrate = 192 }); pipeline.Connect(fileSource.AudioOutput, mp3EncoderBlock.Input); var fileSinkBlock = new FileSinkBlock(@"output.mp3"); pipeline.Connect(mp3EncoderBlock.Output, fileSinkBlock.Input); await pipeline.StartAsync(); ``` ### Streaming to RTMP example ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); // Add parser is set to true for RTMP streaming var mp3EncoderBlock = new MP3EncoderBlock(new MP3EncoderSettings() { Bitrate = 192 }, addParser: true); pipeline.Connect(fileSource.AudioOutput, mp3EncoderBlock.Input); // Connect to RTMP sink var rtmpSink = new RTMPSinkBlock(new RTMPSinkSettings("rtmp://streaming-server/live/stream")); pipeline.Connect(mp3EncoderBlock.Output, rtmpSink.CreateNewInput(MediaBlockPadMediaType.Audio)); await pipeline.StartAsync(); ``` ## OPUS encoder `OPUS`: A highly efficient lossy audio compression format designed for the internet with low latency and high audio quality, making it ideal for real-time applications like WebRTC. OPUS encoder is used for embedding audio streams in WebM or OGG formats. Use the `OPUSEncoderSettings` class to set the parameters. ### Block info Name: OPUSEncoderBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | PCM/IEEE | 1 Output | OPUS | 1 ### Constructor options ```csharp // Constructor with settings public OPUSEncoderBlock(OPUSEncoderSettings settings) ``` The `OPUSEncoderSettings` class allows you to configure parameters such as: - Bitrate (default: 128 kbps) - Audio bandwidth - Frame size and other encoding parameters ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->OPUSEncoderBlock; OPUSEncoderBlock-->WebMSinkBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var opusEncoderBlock = new OPUSEncoderBlock(new OPUSEncoderSettings() { Bitrate = 192 }); pipeline.Connect(fileSource.AudioOutput, opusEncoderBlock.Input); var webmSinkBlock = new WebMSinkBlock(new WebMSinkSettings(@"output.webm")); pipeline.Connect(opusEncoderBlock.Output, webmSinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio)); await pipeline.StartAsync(); ``` ## Speex encoder `Speex`: A patent-free audio compression format designed specifically for speech, offering high compression rates while maintaining clarity for voice recordings. Speex encoder is used for embedding audio streams in OGG format. Use the `SpeexEncoderSettings` class to set the parameters. ### Block info Name: SpeexEncoderBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | PCM/IEEE | 1 Output | Speex | 1 ### Constructor options ```csharp // Constructor with settings public SpeexEncoderBlock(SpeexEncoderSettings settings) ``` The `SpeexEncoderSettings` class allows you to configure parameters such as: - Mode (SpeexMode): NarrowBand, WideBand, UltraWideBand - Quality - Complexity - VAD (Voice Activity Detection) - DTX (Discontinuous Transmission) ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->SpeexEncoderBlock; SpeexEncoderBlock-->OGGSinkBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var speexEncoderBlock = new SpeexEncoderBlock(new SpeexEncoderSettings() { Mode = SpeexMode.NarrowBand }); pipeline.Connect(fileSource.AudioOutput, speexEncoderBlock.Input); var oggSinkBlock = new OGGSinkBlock(@"output.ogg"); pipeline.Connect(speexEncoderBlock.Output, oggSinkBlock.Input); await pipeline.StartAsync(); ``` ## Vorbis encoder `Vorbis`: An open-source, lossy audio compression format designed as a free alternative to MP3, often used within the OGG container format. Vorbis encoder is used for embedding audio streams in OGG or WebM formats. Use the `VorbisEncoderSettings` class to set the parameters. ### Block info Name: VorbisEncoderBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | PCM/IEEE | 1 Output | Vorbis | 1 ### Constructor options ```csharp // Constructor with settings public VorbisEncoderBlock(VorbisEncoderSettings settings) ``` The `VorbisEncoderSettings` class allows you to configure parameters such as: - BaseQuality: A float value between 0.0 and 1.0 that determines the quality of the encoded audio - Bitrate: Alternative bitrate-based configuration ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->VorbisEncoderBlock; VorbisEncoderBlock-->OGGSinkBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var vorbisEncoderBlock = new VorbisEncoderBlock(new VorbisEncoderSettings() { BaseQuality = 0.5f }); pipeline.Connect(fileSource.AudioOutput, vorbisEncoderBlock.Input); var oggSinkBlock = new OGGSinkBlock(@"output.ogg"); pipeline.Connect(vorbisEncoderBlock.Output, oggSinkBlock.Input); await pipeline.StartAsync(); ``` ## WAV encoder `WAV (Waveform Audio File Format)`: An uncompressed audio format that preserves audio quality but results in larger file sizes compared to compressed formats. WAV encoder is used for encoding audio into WAV format. Use the `WAVEncoderSettings` class to set the parameters. ### Block info Name: WAVEncoderBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | PCM/IEEE | 1 Output | WAV | 1 ### Constructor options ```csharp // Constructor with settings public WAVEncoderBlock(WAVEncoderSettings settings) ``` The `WAVEncoderSettings` class allows you to configure various parameters for the WAV format. 
### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->WAVEncoderBlock; WAVEncoderBlock-->FileSinkBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var wavEncoderBlock = new WAVEncoderBlock(new WAVEncoderSettings()); pipeline.Connect(fileSource.AudioOutput, wavEncoderBlock.Input); var fileSinkBlock = new FileSinkBlock(@"output.wav"); pipeline.Connect(wavEncoderBlock.Output, fileSinkBlock.Input); await pipeline.StartAsync(); ``` ## WavPack encoder `WavPack`: A free and open-source lossless audio compression format that offers high compression rates while maintaining excellent audio quality, supporting hybrid lossy/lossless modes. WavPack encoder is used for encoding audio in WavPack format, which is ideal for archiving audio with perfect fidelity. Use the `WavPackEncoderSettings` class to set the parameters. ### Block info Name: WavPackEncoderBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | PCM/IEEE | 1 Output | WavPack | 1 ### Constructor options ```csharp // Constructor with settings public WavPackEncoderBlock(WavPackEncoderSettings settings) ``` ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->WavPackEncoderBlock; WavPackEncoderBlock-->FileSinkBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var wavpackEncoderBlock = new WavPackEncoderBlock(new WavPackEncoderSettings()); pipeline.Connect(fileSource.AudioOutput, wavpackEncoderBlock.Input); var fileSinkBlock = new FileSinkBlock(@"output.wv"); pipeline.Connect(wavpackEncoderBlock.Output, fileSinkBlock.Input); await pipeline.StartAsync(); ``` ## WMA encoder `WMA (Windows Media Audio)`: A proprietary audio compression format developed by Microsoft, offering various compression levels and features for different audio applications. WMA encoder is used for encoding audio in WMA format. Use the `WMAEncoderSettings` class to set the parameters. ### Block info Name: WMAEncoderBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | PCM/IEEE | 1 Output | WMA | 1 ### Constructor options ```csharp // Constructor with settings public WMAEncoderBlock(WMAEncoderSettings settings) ``` The `WMAEncoderSettings` class allows you to configure parameters such as: - Bitrate (default: 128 kbps) - Quality settings - VBR (Variable Bit Rate) options ### Default settings You can use the static method to get default settings: ```csharp var settings = WMAEncoderBlock.GetDefaultSettings(); ``` ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->WMAEncoderBlock; WMAEncoderBlock-->ASFSinkBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var wmaEncoderBlock = new WMAEncoderBlock(new WMAEncoderSettings() { Bitrate = 192 }); pipeline.Connect(fileSource.AudioOutput, wmaEncoderBlock.Input); var asfSinkBlock = new ASFSinkBlock(@"output.wma"); pipeline.Connect(wmaEncoderBlock.Output, asfSinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio)); await pipeline.StartAsync(); ``` ## Resource management All encoder blocks implement `IDisposable` and have internal cleanup mechanisms. It's recommended to properly dispose of them when they're no longer needed: ```csharp // Using block using (var encoder = new MP3EncoderBlock(settings)) { // Use encoder } // Or manual disposal var encoder = new MP3EncoderBlock(settings); try { // Use encoder } finally { encoder.Dispose(); } ``` ## Platforms Windows, macOS, Linux, iOS, Android. Note that not all encoders are available on all platforms. Always use the `IsAvailable()` method to check for availability before using an encoder. 
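As a hypothetical example of combining these availability checks, the sketch below picks the first encoder block that reports itself available on the current platform and falls back to uncompressed WAV. The `CreateAudioEncoder` helper name and `MediaBlock` return type are assumptions for illustration; the `IsAvailable()`, `GetDefaultSettings()`, and constructor calls are the ones documented above:

```csharp
// Hypothetical helper: choose an audio encoder block based on platform availability.
MediaBlock CreateAudioEncoder()
{
    // AAC availability check requires passing settings.
    var aacSettings = AACEncoderBlock.GetDefaultSettings();
    if (AACEncoderBlock.IsAvailable(aacSettings))
    {
        return new AACEncoderBlock(aacSettings);
    }

    if (MP3EncoderBlock.IsAvailable())
    {
        return new MP3EncoderBlock(new MP3EncoderSettings() { Bitrate = 192 });
    }

    // WAV is uncompressed, so it serves as a safe last resort.
    return new WAVEncoderBlock(new WAVEncoderSettings());
}
```

Connect the returned block between your source's audio output and a matching sink, as in the samples above.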
---END OF PAGE--- # Local File: .\dotnet\mediablocks\AudioProcessing\index.md --- title: .Net Audio Processing & Effect Blocks description: Explore a comprehensive set of .NET audio processing and effect blocks for building powerful audio pipelines. Includes converters, resamplers, mixers, EQs, and more. sidebar_label: Audio Processing and Effects --- # Audio processing and effect blocks [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) VisioForge Media Blocks SDK .Net includes a set of audio processing and effect blocks that allow you to create audio processing pipelines for your applications. The blocks can be connected to each other to create a processing pipeline. Most of the blocks are available for all platforms, including Windows, Linux, MacOS, Android, and iOS. ## Basic Audio Processing ### Audio Converter The audio converter block converts audio from one format to another. #### Block info Name: AudioConverterBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->AudioConverterBlock; AudioConverterBlock-->AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var audioConverter = new AudioConverterBlock(); pipeline.Connect(fileSource.AudioOutput, audioConverter.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(audioConverter.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### Audio Resampler The audio resampler block changes the sample rate of an audio stream. #### Block info Name: AudioResamplerBlock. 
Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed audio | 1
Output | Uncompressed audio | 1

#### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->AudioResamplerBlock;
AudioResamplerBlock-->AudioRendererBlock;
```

#### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp3";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

// Resample to 48000 Hz, stereo
var settings = new AudioResamplerSettings(AudioFormatX.S16LE, 48000, 2);
var audioResampler = new AudioResamplerBlock(settings);
pipeline.Connect(fileSource.AudioOutput, audioResampler.Input);

var audioRenderer = new AudioRendererBlock();
pipeline.Connect(audioResampler.Output, audioRenderer.Input);

await pipeline.StartAsync();
```

#### Platforms

Windows, macOS, Linux, iOS, Android.

### Audio Timestamp Corrector

The audio timestamp corrector block can insert or drop frames to correct the timestamps of an input stream coming from unstable sources.

#### Block info

Name: AudioTimestampCorrectorBlock.

Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed audio | 1
Output | Uncompressed audio | 1

#### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->AudioTimestampCorrectorBlock;
AudioTimestampCorrectorBlock-->AudioRendererBlock;
```

#### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp3";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var settings = new AudioTimestampCorrectorSettings();
var corrector = new AudioTimestampCorrectorBlock(settings);
pipeline.Connect(fileSource.AudioOutput, corrector.Input);

var audioRenderer = new AudioRendererBlock();
pipeline.Connect(corrector.Output, audioRenderer.Input);

await pipeline.StartAsync();
```

#### Platforms

Windows, macOS, Linux, iOS, Android.

### Volume

The volume block allows you to control the volume of the audio stream.
#### Block info Name: VolumeBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->VolumeBlock; VolumeBlock-->AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); // Volume: 0.0 (silence) to 1.0 (normal) or higher (amplification) var volume = new VolumeBlock(0.8); pipeline.Connect(fileSource.AudioOutput, volume.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(volume.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### Audio mixer The audio mixer block mixes multiple audio streams into one. Block mixes the streams regardless of their format, converting if necessary. All input streams will be synchronized. The mixer block handles the conversion of different input audio formats to a common format for mixing. By default, it will try to match the format of the first connected input, but this can be explicitly configured. Use the `AudioMixerSettings` class to set the custom output format. This is useful if you need a specific sample rate, channel layout, or audio format (like S16LE, Float32LE, etc.) for the mixed output. #### Block info Name: AudioMixerBlock. 
Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed audio | Multiple (dynamically created)
Output | Uncompressed audio | 1

#### The sample pipeline

```mermaid
graph LR;
VirtualAudioSourceBlock1-->AudioMixerBlock;
VirtualAudioSourceBlock2-->AudioMixerBlock;
AudioMixerBlock-->AudioRendererBlock;
```

#### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var audioSource1Block = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings());
var audioSource2Block = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings());

// Configure the mixer with specific output settings if needed.
// For example, to output 48 kHz, 2-channel, S16LE audio:
// var mixerSettings = new AudioMixerSettings() { Format = new AudioInfoX(AudioFormatX.S16LE, 48000, 2) };
// var audioMixerBlock = new AudioMixerBlock(mixerSettings);
var audioMixerBlock = new AudioMixerBlock(new AudioMixerSettings());

// Each call to CreateNewInput() adds a new input to the mixer
var inputPad1 = audioMixerBlock.CreateNewInput();
pipeline.Connect(audioSource1Block.Output, inputPad1);

var inputPad2 = audioMixerBlock.CreateNewInput();
pipeline.Connect(audioSource2Block.Output, inputPad2);

// Output the mixed audio to the default audio renderer
var audioRenderer = new AudioRendererBlock();
pipeline.Connect(audioMixerBlock.Output, audioRenderer.Input);

await pipeline.StartAsync();
```

#### Controlling Individual Input Streams

You can control the volume and mute state of individual input streams connected to the `AudioMixerBlock`. The `streamIndex` for these methods corresponds to the order in which the inputs were added via `CreateNewInput()` or `CreateNewInputLive()` (starting from 0).

* **Set Volume**: Use the `SetVolume(int streamIndex, double value)` method. The `value` ranges from 0.0 (silence) to 1.0 (normal volume), and can be higher for amplification (e.g., up to 10.0, though specifics might depend on the underlying implementation limits).
* **Set Mute**: Use the `SetMute(int streamIndex, bool value)` method. Set `value` to `true` to mute the stream and `false` to unmute it. ```csharp // Assuming audioMixerBlock is already created and inputs are connected // Set volume of the first input stream (index 0) to 50% audioMixerBlock.SetVolume(0, 0.5); // Mute the second input stream (index 1) audioMixerBlock.SetMute(1, true); ``` #### Dynamic Input Management (Live Pipeline) The `AudioMixerBlock` supports adding and removing inputs dynamically while the pipeline is running: * **Adding Inputs**: Use the `CreateNewInputLive()` method to get a new input pad that can be connected to a source. The underlying GStreamer elements will be set up to handle the new input. * **Removing Inputs**: Use the `RemoveInputLive(MediaBlockPad blockPad)` method. This will disconnect the specified input pad and clean up associated resources. This is particularly useful for applications where the number of audio sources can change during operation, such as a live mixing console or a conferencing application. #### Platforms Windows, macOS, Linux, iOS, Android. ### Audio sample grabber The audio sample grabber block allows you to access the raw audio samples from the audio stream. #### Block info Name: AudioSampleGrabberBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->AudioSampleGrabberBlock; AudioSampleGrabberBlock-->AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var audioSampleGrabber = new AudioSampleGrabberBlock(); audioSampleGrabber.SampleGrabbed += (sender, args) => { // Process audio samples // args.AudioData - audio samples // args.AudioFormat - audio format }; pipeline.Connect(fileSource.AudioOutput, audioSampleGrabber.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(audioSampleGrabber.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ## Audio Effects ### Amplify Block amplifies an audio stream by an amplification factor. Several clipping modes are available. Use method and level values to configure. #### Block info Name: AmplifyBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->AmplifyBlock; AmplifyBlock-->AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var amplify = new AmplifyBlock(AmplifyClippingMethod.Normal, 2.0); pipeline.Connect(fileSource.AudioOutput, amplify.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(amplify.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### Echo The echo block adds echo effect to the audio stream. #### Block info Name: EchoBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->EchoBlock; EchoBlock-->AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); // Delay in ms, strength 0.0 - 1.0 var echo = new EchoBlock(500, 0.5); pipeline.Connect(fileSource.AudioOutput, echo.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(echo.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### Karaoke The karaoke block applies a karaoke effect to the audio stream, removing center-panned vocals. #### Block info Name: KaraokeBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->KaraokeBlock; KaraokeBlock-->AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var settings = new KaraokeAudioEffect(); var karaoke = new KaraokeBlock(settings); pipeline.Connect(fileSource.AudioOutput, karaoke.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(karaoke.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### Reverberation The reverberation block adds reverb effects to the audio stream. #### Block info Name: ReverberationBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->ReverberationBlock; ReverberationBlock-->AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var settings = new ReverberationAudioEffect(); var reverb = new ReverberationBlock(settings); pipeline.Connect(fileSource.AudioOutput, reverb.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(reverb.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### Wide Stereo The wide stereo block enhances the stereo image of the audio. #### Block info Name: WideStereoBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->WideStereoBlock; WideStereoBlock-->AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var settings = new WideStereoAudioEffect(); var wideStereo = new WideStereoBlock(settings); pipeline.Connect(fileSource.AudioOutput, wideStereo.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(wideStereo.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ## Equalization and Filtering ### Balance Block allows you to control the balance between left and right channels. #### Block info Name: AudioBalanceBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->AudioBalanceBlock; AudioBalanceBlock-->AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); // Balance: -1.0 (full left) to 1.0 (full right), 0.0 - center var balance = new AudioBalanceBlock(0.5); pipeline.Connect(fileSource.AudioOutput, balance.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(balance.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### Equalizer (10 bands) The 10-band equalizer block provides a 10-band equalizer for audio processing. #### Block info Name: Equalizer10Block. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->Equalizer10Block; Equalizer10Block-->AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); // Create 10-band equalizer with all bands set to 0 dB var equalizer = new Equalizer10Block(0, 0, 0, 0, 0, 0, 0, 0, 0, 0); // Or set bands individually equalizer.SetBand(0, 3); // Band 0 (31 Hz) to +3 dB equalizer.SetBand(1, 2); // Band 1 (62 Hz) to +2 dB equalizer.SetBand(9, -3); // Band 9 (16 kHz) to -3 dB pipeline.Connect(fileSource.AudioOutput, equalizer.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(equalizer.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. 
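Because `SetBand` addresses bands by index, equalizer presets can be applied with a simple loop. A small sketch building on the sample above — the gain values are arbitrary illustration, with band frequencies following the 31 Hz to 16 kHz layout noted in the comments:

```csharp
// Apply a simple "bass boost" preset to the 10-band equalizer.
// Gains are in dB; these values are illustrative only.
var bassBoostGains = new[] { 6, 5, 4, 2, 0, 0, 0, 0, 0, 0 };

// Start from a flat response, then raise the low bands
var equalizer = new Equalizer10Block(0, 0, 0, 0, 0, 0, 0, 0, 0, 0);
for (int band = 0; band < bassBoostGains.Length; band++)
{
    equalizer.SetBand(band, bassBoostGains[band]);
}
```

The configured block is then connected between the source and the renderer exactly as in the sample above.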
### Equalizer (Parametric) The parametric equalizer block provides a parametric equalizer for audio processing. #### Block info Name: EqualizerParametricBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->EqualizerParametricBlock; EqualizerParametricBlock-->AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); // Create parametric equalizer var equalizer = new EqualizerParametricBlock(); // Set up to 4 bands equalizer.SetBand(0, 100, 1.0, 3); // Band 0: 100 Hz frequency, 1.0 Q, +3 dB gain equalizer.SetBand(1, 1000, 1.5, -2); // Band 1: 1000 Hz frequency, 1.5 Q, -2 dB gain pipeline.Connect(fileSource.AudioOutput, equalizer.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(equalizer.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### Chebyshev Band Pass/Reject The Chebyshev band pass/reject block applies a band pass or band reject filter to the audio stream using Chebyshev filters. #### Block info Name: ChebyshevBandPassRejectBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->ChebyshevBandPassRejectBlock; ChebyshevBandPassRejectBlock-->AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var settings = new ChebyshevBandPassRejectAudioEffect(); var filter = new ChebyshevBandPassRejectBlock(settings); pipeline.Connect(fileSource.AudioOutput, filter.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(filter.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### Chebyshev Limit The Chebyshev limit block applies low-pass or high-pass filtering to the audio using Chebyshev filters. #### Block info Name: ChebyshevLimitBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->ChebyshevLimitBlock; ChebyshevLimitBlock-->AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var settings = new ChebyshevLimitAudioEffect(); var filter = new ChebyshevLimitBlock(settings); pipeline.Connect(fileSource.AudioOutput, filter.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(filter.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ## Dynamic Processing ### Compressor/Expander The compressor/expander block provides dynamic range compression or expansion. #### Block info Name: CompressorExpanderBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->CompressorExpanderBlock; CompressorExpanderBlock-->AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var compressor = new CompressorExpanderBlock(0.5, 0.9, 0.1, 0.5); pipeline.Connect(fileSource.AudioOutput, compressor.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(compressor.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### Scale/Tempo The scale/tempo block allows you to change the tempo and pitch of the audio stream. #### Block info Name: ScaleTempoBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->ScaleTempoBlock; ScaleTempoBlock-->AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); // Scale tempo by factor (1.0 is normal, 0.5 is half-speed, 2.0 is double-speed) var scaleTempo = new ScaleTempoBlock(1.5); pipeline.Connect(fileSource.AudioOutput, scaleTempo.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(scaleTempo.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ## Analysis and Metering ### VU Meter The VU meter block allows you to measure the volume level of the audio stream. #### Block info Name: VUMeterBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Uncompressed audio | 1 #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->VUMeterBlock; VUMeterBlock-->AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var vuMeter = new VUMeterBlock(); vuMeter.VolumeUpdated += (sender, args) => { // Left channel volume in dB var leftVolume = args.LeftVolume; // Right channel volume in dB var rightVolume = args.RightVolume; Console.WriteLine($"Left: {leftVolume:F2} dB, Right: {rightVolume:F2} dB"); }; pipeline.Connect(fileSource.AudioOutput, vuMeter.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(vuMeter.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\dotnet\mediablocks\AudioRendering\index.md --- title: Audio Rendering Block for .NET Media Processing description: Explore the powerful Audio Rendering Block for cross-platform audio output in .NET applications. Learn implementation techniques, performance optimization, and device management for Windows, macOS, Linux, iOS, and Android development. sidebar_label: Audio Rendering --- # Audio Rendering Block: Cross-Platform Audio Output Processing [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Introduction to Audio Rendering The Audio Renderer Block serves as a critical component in media processing pipelines, enabling applications to output audio streams to sound devices across multiple platforms. 
This versatile block handles the complex task of converting digital audio data into audible sound through the appropriate hardware interfaces, making it an essential tool for developers building audio-enabled applications. Audio rendering requires careful management of hardware resources, buffer settings, and timing synchronization to ensure smooth, uninterrupted playback. This block abstracts these complexities and provides a unified interface for audio output across diverse computing environments. ## Core Functionality The Audio Renderer Block accepts uncompressed audio streams and outputs them to either the default audio device or a user-selected alternative. It provides essential audio playback controls including: - Volume adjustment with precise decibel control - Mute functionality for silent operation - Device selection from available system audio outputs - Buffering settings to optimize for latency or stability These capabilities allow developers to create applications with professional-grade audio output without needing to implement platform-specific code for each target operating system. ## Underlying Technology ### Platform-Specific Implementation The `AudioRendererBlock` supports various platform-specific audio rendering technologies. It can be configured to use a specific audio device and API (see Device Management section). When instantiated using its default constructor (e.g., `new AudioRendererBlock()`), it attempts to select a suitable default audio API based on the operating system: - **Windows**: The default constructor typically uses DirectSound. The block supports multiple audio APIs including: - DirectSound: Provides low-latency output with broad compatibility - WASAPI (Windows Audio Session API): Offers exclusive mode for highest quality - ASIO (Audio Stream Input/Output): Professional-grade audio with minimal latency for specialized hardware - **macOS**: Utilizes the CoreAudio framework. 
The default constructor will typically select a CoreAudio-based device for:
  - High-resolution audio output
  - Native integration with the macOS audio subsystem
  - Support for audio units and professional equipment

  (Note: For macOS, an `OSXAudioSinkBlock` is also available for direct interaction with the platform-specific GStreamer sink if needed for specialized scenarios.)

- **Linux**: Implements ALSA (Advanced Linux Sound Architecture). The default constructor will typically select an ALSA-based device for:
  - Direct hardware access
  - Comprehensive device support
  - Integration with the Linux audio stack
- **iOS**: Employs CoreAudio, optimized for mobile. The default constructor will typically select a CoreAudio-based device, enabling features like:
  - Power-efficient rendering
  - Background audio capabilities
  - Integration with iOS audio session management

  (Note: For developers requiring more direct control over the iOS-specific GStreamer sink or having advanced use cases, the SDK also provides `IOSAudioSinkBlock` as a distinct media block.)

- **Android**: Defaults to using OpenSL ES to provide:
  - Low-latency audio output
  - Hardware acceleration when available

## OSXAudioSinkBlock: Direct macOS Audio Output

The `OSXAudioSinkBlock` is a platform-specific media block designed for advanced scenarios where direct interaction with the macOS GStreamer audio sink is required. This block is useful for developers who need low-level control over audio output on macOS devices, such as custom device selection or integration with other native components.

### Key Features

- Direct access to the macOS audio sink
- Device selection via `DeviceID`
- Suitable for specialized or professional audio applications on macOS

### Settings: `OSXAudioSinkSettings`

The `OSXAudioSinkBlock` requires an `OSXAudioSinkSettings` object to specify the audio output device.
The `OSXAudioSinkSettings` class allows you to define: - `DeviceID`: The ID of the macOS audio output device (starting from 0) Example: ```csharp using VisioForge.Core.Types.X.Sinks; // Select the first available audio device (DeviceID = 0) var osxSettings = new OSXAudioSinkSettings { DeviceID = 0 }; // Create the macOS audio sink block var osxAudioSink = new OSXAudioSinkBlock(osxSettings); ``` ### Availability Check You can check if the `OSXAudioSinkBlock` is available on the current platform: ```csharp bool isAvailable = OSXAudioSinkBlock.IsAvailable(); ``` ### Integration Example Below is a minimal example of integrating `OSXAudioSinkBlock` into a media pipeline: ```csharp var pipeline = new MediaBlocksPipeline(); // Set up your audio source block as needed var audioSourceBlock = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); // Define settings for the sink var osxSettings = new OSXAudioSinkSettings { DeviceID = 0 }; var osxAudioSink = new OSXAudioSinkBlock(osxSettings); // Connect the source to the macOS audio sink pipeline.Connect(audioSourceBlock.Output, osxAudioSink.Input); await pipeline.StartAsync(); ``` ## IOSAudioSinkBlock: Direct iOS Audio Output The `IOSAudioSinkBlock` is a platform-specific media block designed for advanced scenarios where direct interaction with the iOS GStreamer audio sink is required. This block is useful for developers who need low-level control over audio output on iOS devices, such as custom audio routing, format handling, or integration with other native components. ### Key Features - Direct access to the iOS GStreamer audio sink - Fine-grained control over audio format, sample rate, and channel count - Suitable for specialized or professional audio applications on iOS ### Settings: `AudioInfoX` The `IOSAudioSinkBlock` requires an `AudioInfoX` object to specify the audio format. 
The `AudioInfoX` class allows you to define: - `Format`: The audio sample format (e.g., `AudioFormatX.S16LE`, `AudioFormatX.F32LE`, etc.) - `SampleRate`: The sample rate in Hz (e.g., 44100, 48000) - `Channels`: The number of audio channels (e.g., 1 for mono, 2 for stereo) Example: ```csharp using VisioForge.Core.Types.X; // Define audio format: 16-bit signed little-endian, 44100 Hz, stereo var audioInfo = new AudioInfoX(AudioFormatX.S16LE, 44100, 2); // Create the iOS audio sink block var iosAudioSink = new IOSAudioSinkBlock(audioInfo); ``` ### Availability Check You can check if the `IOSAudioSinkBlock` is available on the current platform: ```csharp bool isAvailable = IOSAudioSinkBlock.IsAvailable(); ``` ### Integration Example Below is a minimal example of integrating `IOSAudioSinkBlock` into a media pipeline: ```csharp var pipeline = new MediaBlocksPipeline(); // Set up your audio source block as needed var audioSourceBlock = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); // Define audio format for the sink var audioInfo = new AudioInfoX(AudioFormatX.S16LE, 44100, 2); var iosAudioSink = new IOSAudioSinkBlock(audioInfo); // Connect the source to the iOS audio sink pipeline.Connect(audioSourceBlock.Output, iosAudioSink.Input); await pipeline.StartAsync(); ``` ## Technical Specifications ### Block Information Name: AudioRendererBlock | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input audio | uncompressed audio | 1 | ### Audio Format Support The Audio Renderer Block accepts a wide range of uncompressed audio formats: - Sample rates: 8kHz to 192kHz - Bit depths: 8-bit, 16-bit, 24-bit, and 32-bit (floating point) - Channel configurations: Mono, stereo, and multichannel (up to 7.1 surround) This flexibility allows developers to work with everything from basic voice applications to high-fidelity music and immersive audio experiences. 
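When a source's format falls outside what a given output device accepts, the stream can be normalized before the renderer using the `AudioResamplerBlock` described earlier in this SDK. A minimal sketch — the 48 kHz, stereo, S16LE target is an arbitrary example:

```csharp
var pipeline = new MediaBlocksPipeline();

var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri("test.mp3")));

// Normalize to 48 kHz, stereo, 16-bit signed little-endian before rendering
var resampler = new AudioResamplerBlock(new AudioResamplerSettings(AudioFormatX.S16LE, 48000, 2));
pipeline.Connect(fileSource.AudioOutput, resampler.Input);

var audioRenderer = new AudioRendererBlock();
pipeline.Connect(resampler.Output, audioRenderer.Input);

await pipeline.StartAsync();
```

This keeps the renderer's input within the supported range regardless of the source material.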
## Device Management ### Enumerating Available Devices The Audio Renderer Block provides straightforward methods to discover and select from available audio output devices on the system using the `GetDevicesAsync` static method: ```csharp // Get a list of all audio output devices on the current system var availableDevices = await AudioRendererBlock.GetDevicesAsync(); // Optionally specify the API to use var directSoundDevices = await AudioRendererBlock.GetDevicesAsync(AudioOutputDeviceAPI.DirectSound); // Display device information foreach (var device in availableDevices) { Console.WriteLine($"Device: {device.Name}"); } // Create a renderer with a specific device var audioRenderer = new AudioRendererBlock(availableDevices[0]); ``` ### Default Device Handling When no specific device is selected, the block automatically routes audio to the system's default output device. The no-parameter constructor attempts to select an appropriate default device based on the platform: ```csharp // Create with default device var audioRenderer = new AudioRendererBlock(); ``` The block also monitors device status, handling scenarios such as: - Device disconnection during playback - Default device changes by the user - Audio endpoint format changes ## Performance Considerations ### Latency Management Audio rendering latency is critical for many applications. 
The block provides configuration options through the `Settings` property and synchronization control via the `IsSync` property: ```csharp // Control synchronization behavior audioRenderer.IsSync = true; // Enable synchronization (default) // Check if a specific API is available on this platform bool isDirectSoundAvailable = AudioRendererBlock.IsAvailable(AudioOutputDeviceAPI.DirectSound); ``` ### Volume and Mute Control The AudioRendererBlock provides precise volume control and mute functionality: ```csharp // Set volume (0.0 to 1.0 range) audioRenderer.Volume = 0.8; // 80% volume // Get current volume double currentVolume = audioRenderer.Volume; // Mute/unmute audioRenderer.Mute = true; // Mute audio audioRenderer.Mute = false; // Unmute audio // Check mute state bool isMuted = audioRenderer.Mute; ``` ### Resource Utilization The Audio Renderer Block is designed for efficiency, with optimizations for: - CPU usage during playback - Memory footprint for buffer management - Power consumption on mobile devices ## Integration Examples ### Basic Pipeline Setup The following example demonstrates how to set up a simple audio rendering pipeline using a virtual audio source: ```csharp var pipeline = new MediaBlocksPipeline(); var audioSourceBlock = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); // Create audio renderer with default settings var audioRenderer = new AudioRendererBlock(); pipeline.Connect(audioSourceBlock.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` ### Real-World Audio Pipeline For a more practical application, here's how to capture system audio and render it: ```mermaid graph LR; SystemAudioSourceBlock-->AudioRendererBlock; ``` ```csharp var pipeline = new MediaBlocksPipeline(); // Capture system audio var systemAudioSource = new SystemAudioSourceBlock(); // Configure the audio renderer var audioRenderer = new AudioRendererBlock(); audioRenderer.Volume = 0.8f; // 80% volume // Connect blocks 
pipeline.Connect(systemAudioSource.Output, audioRenderer.Input); // Start processing await pipeline.StartAsync(); // Allow audio to play for 10 seconds await Task.Delay(TimeSpan.FromSeconds(10)); // Stop the pipeline await pipeline.StopAsync(); ``` ## Compatibility and Platform Support The Audio Renderer Block is designed for cross-platform operation, supporting: - Windows 10 and later - macOS 10.13 and later - Linux (Ubuntu, Debian, Fedora) - iOS 12.0 and later - Android 8.0 and later This wide platform support enables developers to create consistent audio experiences across different operating systems and devices. ## Conclusion The Audio Renderer Block provides developers with a powerful, flexible solution for audio output across multiple platforms. By abstracting the complexities of platform-specific audio APIs, it allows developers to focus on creating exceptional audio experiences without worrying about the underlying implementation details. Whether building a simple media player, a professional audio editing application, or a real-time communications platform, the Audio Renderer Block provides the tools needed for high-quality, reliable audio output. ---END OF PAGE--- # Local File: .\dotnet\mediablocks\AudioVisualizers\index.md --- title: .Net Audio Visualizer Blocks description: Explore a comprehensive set of .NET audio visualizer blocks for building powerful audio-reactive applications. Includes Spacescope, Spectrascope, Synaescope, and Wavescope. sidebar_label: Audio Visualizers --- # Audio visualizer blocks [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) VisioForge Media Blocks SDK .Net includes a set of audio visualizer blocks that allow you to create audio-reactive visualizations for your applications. These blocks take audio input and produce video output representing the audio characteristics. 
The blocks can be connected to other audio and video processing blocks to create complex media pipelines. Most of the blocks are available for all platforms, including Windows, Linux, macOS, Android, and iOS. ## Spacescope The Spacescope block is a simple audio visualization element that maps the left and right audio channels to X and Y coordinates, respectively, creating a Lissajous-like pattern. This visualizes the phase relationship between the channels. The appearance, such as using dots or lines and colors, can be customized via `SpacescopeSettings`. #### Block info Name: SpacescopeBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Video | 1 #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->SpacescopeBlock; SpacescopeBlock-->VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; // Or any audio source var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); // Settings can be customized, e.g., for shader, line thickness, etc. // The style (dots, lines, color-dots, color-lines) can be set in SpacescopeSettings. var spacescopeSettings = new SpacescopeSettings(); var spacescope = new SpacescopeBlock(spacescopeSettings); pipeline.Connect(fileSource.AudioOutput, spacescope.Input); // Assuming you have a VideoRendererBlock or a way to display video output var videoRenderer = new VideoRendererBlock(IntPtr.Zero); // Example for Windows pipeline.Connect(spacescope.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ## Spectrascope The Spectrascope block is a simple spectrum visualization element. It renders the frequency spectrum of the audio input as a series of bars. #### Block info Name: SpectrascopeBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Video | 1 #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->SpectrascopeBlock; SpectrascopeBlock-->VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; // Or any audio source var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var spectrascope = new SpectrascopeBlock(); pipeline.Connect(fileSource.AudioOutput, spectrascope.Input); // Assuming you have a VideoRendererBlock or a way to display video output var videoRenderer = new VideoRendererBlock(IntPtr.Zero); // Example for Windows pipeline.Connect(spectrascope.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ## Synaescope The Synaescope block is an audio visualization element that analyzes frequencies and out-of-phase properties of the audio. It draws this analysis as dynamic clouds of stars, creating colorful and abstract patterns. #### Block info Name: SynaescopeBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Video | 1 #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->SynaescopeBlock; SynaescopeBlock-->VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; // Or any audio source var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); // Settings can be customized for Synaescope. 
// For example, to set a specific shader effect (if available in SynaescopeSettings): // var synaescopeSettings = new SynaescopeSettings() { Shader = SynaescopeShader.LibVisualCurrent }; // var synaescope = new SynaescopeBlock(synaescopeSettings); var synaescope = new SynaescopeBlock(); // Default settings pipeline.Connect(fileSource.AudioOutput, synaescope.Input); // Assuming you have a VideoRendererBlock or a way to display video output var videoRenderer = new VideoRendererBlock(IntPtr.Zero); // Example for Windows pipeline.Connect(synaescope.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ## Wavescope The Wavescope block is a simple audio visualization element that renders the audio waveforms, similar to an oscilloscope display. The drawing style (dots, lines, colors) can be configured using `WavescopeSettings`. #### Block info Name: WavescopeBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed audio | 1 Output | Video | 1 #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->WavescopeBlock; WavescopeBlock-->VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp3"; // Or any audio source var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); // Settings can be customized, e.g., for style, mono/stereo mode, etc. // The style (dots, lines, color-dots, color-lines) can be set in WavescopeSettings. 
var wavescopeSettings = new WavescopeSettings(); var wavescope = new WavescopeBlock(wavescopeSettings); pipeline.Connect(fileSource.AudioOutput, wavescope.Input); // Assuming you have a VideoRendererBlock or a way to display video output var videoRenderer = new VideoRendererBlock(IntPtr.Zero); // Example for Windows pipeline.Connect(wavescope.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\dotnet\mediablocks\AWS\index.md --- title: .Net Media AWS S3 Blocks Guide description: Explore a complete guide to .Net Media SDK AWS S3 source and sink blocks. Learn how to read from and write to AWS S3 for your media processing pipelines. sidebar_label: Amazon Web Services --- # AWS S3 Blocks - VisioForge Media Blocks SDK .Net [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) AWS S3 blocks enable interaction with Amazon Simple Storage Service (S3) to read media files as sources or write media files as sinks within your pipelines. ## AWSS3SinkBlock The `AWSS3SinkBlock` allows you to write media data from your pipeline to a file in an AWS S3 bucket. This is useful for storing recorded media, transcoded files, or other outputs directly to cloud storage. #### Block info Name: `AWSS3SinkBlock`. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input | Auto (depends on connected block) | 1 | #### Settings The `AWSS3SinkBlock` is configured using `AWSS3SinkSettings`. Key properties: - `Uri` (string): The S3 URI where the media file will be written (e.g., "s3://your-bucket-name/path/to/output/file.mp4"). - `AccessKeyId` (string): Your AWS Access Key ID. - `SecretAccessKey` (string): Your AWS Secret Access Key. - `Region` (string): The AWS region where the bucket is located (e.g., "us-east-1"). - `SessionToken` (string, optional): AWS session token, if using temporary credentials. 
- `EndpointUrl` (string, optional): Custom S3-compatible endpoint URL. - `ContentType` (string, optional): The MIME type of the content being uploaded (e.g., "video/mp4"). - `StorageClass` (string, optional): S3 storage class (e.g., "STANDARD", "INTELLIGENT_TIERING"). - `ServerSideEncryption` (string, optional): Server-side encryption method (e.g., "AES256", "aws:kms"). - `ACL` (string, optional): Access Control List for the uploaded object (e.g., "private", "public-read"). #### The sample pipeline ```mermaid graph LR; SystemVideoSourceBlock-->VideoEncoderBlock; VideoEncoderBlock-->MuxerBlock; SystemAudioSourceBlock-->AudioEncoderBlock; AudioEncoderBlock-->MuxerBlock; MuxerBlock-->AWSS3SinkBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Create video source (e.g., webcam) var videoDevice = (await DeviceEnumerator.Shared.VideoSourcesAsync())[0]; var videoSourceSettings = new VideoCaptureDeviceSourceSettings(videoDevice); var videoSource = new SystemVideoSourceBlock(videoSourceSettings); // Create audio source (e.g., microphone) var audioDevice = (await DeviceEnumerator.Shared.AudioSourcesAsync())[0]; var audioSourceSettings = audioDevice.CreateSourceSettings(audioDevice.Formats[0].ToFormat()); var audioSource = new SystemAudioSourceBlock(audioSourceSettings); // Create video encoder var h264Settings = new OpenH264EncoderSettings(); // Example encoder settings var videoEncoder = new H264EncoderBlock(h264Settings); // Create audio encoder var opusSettings = new OpusEncoderSettings(); // Example encoder settings var audioEncoder = new OpusEncoderBlock(opusSettings); // Create a muxer (e.g., MP4MuxBlock) var mp4MuxSettings = new MP4MuxSettings(); var mp4Muxer = new MP4MuxBlock(mp4MuxSettings); // Configure AWSS3SinkSettings var s3SinkSettings = new AWSS3SinkSettings { Uri = "s3://your-bucket-name/output/recorded-video.mp4", AccessKeyId = "YOUR_AWS_ACCESS_KEY_ID", SecretAccessKey = "YOUR_AWS_SECRET_ACCESS_KEY", Region = 
"your-aws-region", // e.g., "us-east-1" ContentType = "video/mp4" }; var s3Sink = new AWSS3SinkBlock(s3SinkSettings); // Connect video path pipeline.Connect(videoSource.Output, videoEncoder.Input); pipeline.Connect(videoEncoder.Output, mp4Muxer.CreateNewInput(MediaBlockPadMediaType.Video)); // Connect audio path pipeline.Connect(audioSource.Output, audioEncoder.Input); pipeline.Connect(audioEncoder.Output, mp4Muxer.CreateNewInput(MediaBlockPadMediaType.Audio)); // Connect muxer to S3 sink pipeline.Connect(mp4Muxer.Output, s3Sink.Input); // Check if AWSS3Sink is available if (!AWSS3SinkBlock.IsAvailable()) { Console.WriteLine("AWS S3 Sink Block is not available. Check SDK redistributables."); return; } // Start pipeline await pipeline.StartAsync(); // ... wait for recording to finish ... // Stop pipeline await pipeline.StopAsync(); ``` #### Remarks You can check if the `AWSS3SinkBlock` is available at runtime using the static method `AWSS3SinkBlock.IsAvailable()`. This ensures that the necessary underlying GStreamer plugins and AWS SDK components are present. #### Platforms Windows, macOS, Linux. (Availability depends on GStreamer AWS plugin and AWS SDK support on these platforms). ---END OF PAGE--- # Local File: .\dotnet\mediablocks\Bridge\index.md --- title: Link Media Pipelines - Bridge Blocks Guide description: Learn to use Bridge blocks for linking and dynamically switching media pipelines for audio, video, and subtitles in .Net applications. sidebar_label: Video and Audio Bridges --- # Bridge blocks [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) Bridges can be used to link two pipelines and dynamically switch between them. For example, you can switch between different files or cameras in the first Pipeline without interrupting streaming in the second Pipeline. To link source and sink, give them the same name. Each bridge pair has a unique channel name. 
## Bridge audio sink and source Bridges can be used to connect different media pipelines and use them independently. `BridgeAudioSourceBlock` is used to connect to `BridgeAudioSinkBlock` and supports uncompressed audio. ### Block info #### BridgeAudioSourceBlock information | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Output audio | uncompressed audio | 1 | #### BridgeAudioSinkBlock information | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input audio | uncompressed audio | 1 | ### Sample pipelines #### First pipeline with an audio source and a bridge audio sink ```mermaid graph LR; VirtualAudioSourceBlock-->BridgeAudioSinkBlock; ``` #### Second pipeline with a bridge audio source and an audio renderer ```mermaid graph LR; BridgeAudioSourceBlock-->AudioRendererBlock; ``` ### Sample code The source pipeline with virtual audio source and bridge audio sink. ```csharp // create source pipeline var sourcePipeline = new MediaBlocksPipeline(); // create virtual audio source and bridge audio sink var audioSourceBlock = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); var bridgeAudioSink = new BridgeAudioSinkBlock(new BridgeAudioSinkSettings()); // connect source and sink sourcePipeline.Connect(audioSourceBlock.Output, bridgeAudioSink.Input); // start pipeline await sourcePipeline.StartAsync(); ``` The sink pipeline with bridge audio source and audio renderer. ```csharp // create sink pipeline var sinkPipeline = new MediaBlocksPipeline(); // create bridge audio source and audio renderer var bridgeAudioSource = new BridgeAudioSourceBlock(new BridgeAudioSourceSettings()); var audioRenderer = new AudioRendererBlock(); // connect source and sink sinkPipeline.Connect(bridgeAudioSource.Output, audioRenderer.Input); // start pipeline await sinkPipeline.StartAsync(); ``` ## Bridge video sink and source Bridges can be used to connect different media pipelines and use them independently. 
`BridgeVideoSinkBlock` is used to connect to the `BridgeVideoSourceBlock` and supports uncompressed video. ### Blocks info #### BridgeVideoSinkBlock information | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | uncompressed video | 1 | #### BridgeVideoSourceBlock information | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Output video | uncompressed video | 1 | ### Sample pipelines #### First pipeline with a video source and a bridge video sink ```mermaid graph LR; VirtualVideoSourceBlock-->BridgeVideoSinkBlock; ``` #### Second pipeline with a bridge video source and a video renderer ```mermaid graph LR; BridgeVideoSourceBlock-->VideoRendererBlock; ``` ### Sample code Source pipeline with a virtual video source and bridge video sink. ```csharp // create source pipeline var sourcePipeline = new MediaBlocksPipeline(); // create virtual video source and bridge video sink var videoSourceBlock = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings()); var bridgeVideoSink = new BridgeVideoSinkBlock(new BridgeVideoSinkSettings()); // connect source and sink sourcePipeline.Connect(videoSourceBlock.Output, bridgeVideoSink.Input); // start pipeline await sourcePipeline.StartAsync(); ``` Sink pipeline with a bridge video source and video renderer. ```csharp // create sink pipeline var sinkPipeline = new MediaBlocksPipeline(); // create bridge video source and video renderer var bridgeVideoSource = new BridgeVideoSourceBlock(new BridgeVideoSourceSettings()); var videoRenderer = new VideoRendererBlock(sinkPipeline, VideoView1); // connect source and sink sinkPipeline.Connect(bridgeVideoSource.Output, videoRenderer.Input); // start pipeline await sinkPipeline.StartAsync(); ``` ## Bridge subtitle sink and source Bridges can be used to connect different media pipelines and use them independently. `BridgeSubtitleSourceBlock` is used to connect to the `BridgeSubtitleSinkBlock` and supports the text media type. 
### Block info #### BridgeSubtitleSourceBlock information | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Output subtitle | text | 1 | #### BridgeSubtitleSinkBlock information | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input subtitle | text | 1 | ## Proxy source The proxy source/proxy sink block pair can be used to connect different media pipelines and use them independently. ### Block info Name: ProxySourceBlock. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Output | Any uncompressed | 1 | ### Sample pipelines #### First pipeline with a video source and a proxy video sink ```mermaid graph LR; VirtualVideoSourceBlock-->ProxySinkBlock; ``` #### Second pipeline with a proxy video source and a video renderer ```mermaid graph LR; ProxySourceBlock-->VideoRendererBlock; ``` ### Sample code ```csharp // source pipeline with virtual video source and proxy sink var sourcePipeline = new MediaBlocksPipeline(); var videoSourceBlock = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings()); var proxyVideoSink = new ProxySinkBlock(); sourcePipeline.Connect(videoSourceBlock.Output, proxyVideoSink.Input); // sink pipeline with proxy video source and video renderer var sinkPipeline = new MediaBlocksPipeline(); var proxyVideoSource = new ProxySourceBlock(proxyVideoSink); var videoRenderer = new VideoRendererBlock(sinkPipeline, VideoView1); sinkPipeline.Connect(proxyVideoSource.Output, videoRenderer.Input); // start pipelines await sourcePipeline.StartAsync(); await sinkPipeline.StartAsync(); ``` ## Platforms All bridge blocks are supported on Windows, macOS, Linux, iOS, and Android. ---END OF PAGE--- # Local File: .\dotnet\mediablocks\Decklink\index.md --- title: Blackmagic Decklink Integration for .NET Developers description: Integrate professional Blackmagic Decklink devices for high-quality audio/video capture and rendering in your .NET applications. 
Learn to implement SDI, HDMI inputs/outputs, configure multiple devices, and build advanced media workflows with our comprehensive code examples and API. sidebar_label: Blackmagic Decklink --- # Blackmagic Decklink Integration with Media Blocks SDK [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Introduction to Decklink Integration The VisioForge Media Blocks SDK for .NET provides robust support for Blackmagic Decklink devices, enabling developers to implement professional-grade audio and video functionality in their applications. This integration allows for seamless capture and rendering operations using Decklink's high-quality hardware. Our SDK includes specialized blocks designed specifically for Decklink devices, giving you full access to their capabilities including SDI, HDMI, and other inputs/outputs. These blocks are optimized for performance and offer a straightforward API for implementing complex media workflows. ### Key Capabilities - **Audio Capture and Rendering**: Capture and output audio through Decklink devices - **Video Capture and Rendering**: Capture and output video in various formats and resolutions - **Multiple Device Support**: Work with multiple Decklink devices simultaneously - **Professional I/O Options**: Utilize SDI, HDMI, and other professional interfaces - **High-Quality Processing**: Maintain professional video/audio quality throughout the pipeline - **Combined Audio/Video Blocks**: Simplified handling of synchronized audio and video streams with dedicated source and sink blocks. ## System Requirements Before using the Decklink blocks, ensure your system meets these requirements: - **Hardware**: Compatible Blackmagic Decklink device - **Software**: Blackmagic Decklink SDK or drivers installed ## Decklink Block Types The SDK provides several block types for working with Decklink devices: 1. **Audio Sink Block**: For audio output to Decklink devices. 2. 
**Audio Source Block**: For audio capture from Decklink devices. 3. **Video Sink Block**: For video output to Decklink devices. 4. **Video Source Block**: For video capture from Decklink devices. 5. **Video + Audio Sink Block**: For synchronized video and audio output to Decklink devices using a single block. 6. **Video + Audio Source Block**: For synchronized video and audio capture from Decklink devices using a single block. Each block type is designed to handle specific media operations while maintaining synchronization and quality. ## Working with Decklink Audio Sink Block The Decklink Audio Sink block enables audio output to Blackmagic Decklink devices. This block handles the complexities of audio timing and device interfacing. ### Device Enumeration Before creating an audio sink block, you'll need to enumerate available devices: ```csharp var devices = await DecklinkAudioSinkBlock.GetDevicesAsync(); foreach (var item in devices) { Console.WriteLine($"Found device: {item.Name}, Device Number: {item.DeviceNumber}"); } ``` This code retrieves all available Decklink devices that support audio output functionality. 
### Block Creation and Configuration Once you've identified the target device, you can create and configure the audio sink block: ```csharp // Get the first available device var deviceInfo = (await DecklinkAudioSinkBlock.GetDevicesAsync()).FirstOrDefault(); // Create settings for the selected device DecklinkAudioSinkSettings audioSinkSettings = null; if (deviceInfo != null) { audioSinkSettings = new DecklinkAudioSinkSettings(deviceInfo); // Example: audioSinkSettings.DeviceNumber = deviceInfo.DeviceNumber; (already set by constructor) // Further configuration: // audioSinkSettings.BufferTime = TimeSpan.FromMilliseconds(100); // audioSinkSettings.IsSync = true; } // Create the block with configured settings var decklinkAudioSink = new DecklinkAudioSinkBlock(audioSinkSettings); ``` ### Key Audio Sink Settings The `DecklinkAudioSinkSettings` class includes properties like: - `DeviceNumber`: The output device instance to use. - `BufferTime`: Minimum latency reported by the sink (default: 50ms). - `AlignmentThreshold`: Timestamp alignment threshold (default: 40ms). - `DiscontWait`: Time to wait before creating a discontinuity (default: 1s). - `IsSync`: Enables synchronization (default: true). ### Connecting to the Pipeline The audio sink block includes an `Input` pad that accepts audio data from other blocks in your pipeline: ```csharp // Example: Connect an audio source/encoder to the Decklink audio sink audioEncoder.Output.Connect(decklinkAudioSink.Input); ``` ## Working with Decklink Audio Source Block The Decklink Audio Source block enables capturing audio from Blackmagic Decklink devices. It supports various audio formats and configurations. 
### Device Enumeration Enumerate available audio source devices: ```csharp var devices = await DecklinkAudioSourceBlock.GetDevicesAsync(); foreach (var item in devices) { Console.WriteLine($"Available audio source: {item.Name}, Device Number: {item.DeviceNumber}"); } ``` ### Block Creation and Configuration Create and configure the audio source block: ```csharp // Get the first available device var deviceInfo = (await DecklinkAudioSourceBlock.GetDevicesAsync()).FirstOrDefault(); // Create settings for the selected device DecklinkAudioSourceSettings audioSourceSettings = null; if (deviceInfo != null) { // create settings object audioSourceSettings = new DecklinkAudioSourceSettings(deviceInfo); // Further configuration: // audioSourceSettings.Channels = DecklinkAudioChannels.Ch2; // audioSourceSettings.Connection = DecklinkAudioConnection.Embedded; // audioSourceSettings.Format = DecklinkAudioFormat.S16LE; // SampleRate is fixed at 48000 } // Create the block with the configured settings var audioSource = new DecklinkAudioSourceBlock(audioSourceSettings); ``` ### Key Audio Source Settings The `DecklinkAudioSourceSettings` class includes properties like: - `DeviceNumber`: The input device instance to use. - `Channels`: Audio channels to capture (e.g., `DecklinkAudioChannels.Ch2`, `Ch8`, `Ch16`). Default `Ch2`. - `Format`: Audio sample format (e.g., `DecklinkAudioFormat.S16LE`). Default `S16LE`. Sample rate is fixed at 48000 Hz. - `Connection`: Audio connection type (e.g., `DecklinkAudioConnection.Embedded`, `AES`, `Analog`). Default `Auto`. - `BufferSize`: Internal buffer size in frames (default: 5). - `DisableAudioConversion`: Set to `true` to disable internal audio conversion. Default `false`. 
### Connecting to the Pipeline The audio source block provides an `Output` pad that can connect to other blocks: ```csharp // Example: Connect the audio source to an audio encoder or processor audioSource.Output.Connect(audioProcessor.Input); ``` ## Working with Decklink Video Sink Block The Decklink Video Sink block enables video output to Blackmagic Decklink devices, supporting various video formats and resolutions. ### Device Enumeration Find available video sink devices: ```csharp var devices = await DecklinkVideoSinkBlock.GetDevicesAsync(); foreach (var item in devices) { Console.WriteLine($"Available video output device: {item.Name}, Device Number: {item.DeviceNumber}"); } ``` ### Block Creation and Configuration Create and configure the video sink block: ```csharp // Get the first available device var deviceInfo = (await DecklinkVideoSinkBlock.GetDevicesAsync()).FirstOrDefault(); // Create settings for the selected device DecklinkVideoSinkSettings videoSinkSettings = null; if (deviceInfo != null) { videoSinkSettings = new DecklinkVideoSinkSettings(deviceInfo); // Configure video output format and mode videoSinkSettings.Mode = DecklinkMode.HD1080i60; videoSinkSettings.VideoFormat = DecklinkVideoFormat.YUV_10bit; // Use VideoFormat // Optional: Additional configuration // videoSinkSettings.KeyerMode = DecklinkKeyerMode.Internal; // videoSinkSettings.KeyerLevel = 128; // videoSinkSettings.Profile = DecklinkProfileID.Default; // videoSinkSettings.TimecodeFormat = DecklinkTimecodeFormat.RP188Any; } // Create the block with the configured settings var decklinkVideoSink = new DecklinkVideoSinkBlock(videoSinkSettings); ``` ### Key Video Sink Settings The `DecklinkVideoSinkSettings` class includes properties like: - `DeviceNumber`: The output device instance to use. - `Mode`: Specifies the video resolution and frame rate (e.g., `DecklinkMode.HD1080i60`, `HD720p60`). Default `Unknown`. 
- `VideoFormat`: Defines the pixel format using `DecklinkVideoFormat` enum (e.g., `DecklinkVideoFormat.YUV_8bit`, `YUV_10bit`). Default `YUV_8bit`. - `KeyerMode`: Controls keying/compositing options using `DecklinkKeyerMode` (if supported by the device). Default `Off`. - `KeyerLevel`: Sets the keyer level (0-255). Default `255`. - `Profile`: Specifies the Decklink profile to use with `DecklinkProfileID`. - `TimecodeFormat`: Specifies the timecode format for playback using `DecklinkTimecodeFormat`. Default `RP188Any`. - `IsSync`: Enables synchronization (default: true). ## Working with Decklink Video Source Block The Decklink Video Source block allows capturing video from Blackmagic Decklink devices, supporting various input formats and resolutions. ### Device Enumeration Enumerate video capture devices: ```csharp var devices = await DecklinkVideoSourceBlock.GetDevicesAsync(); foreach (var item in devices) { Console.WriteLine($"Available video capture device: {item.Name}, Device Number: {item.DeviceNumber}"); } ``` ### Block Creation and Configuration Create and configure the video source block: ```csharp // Get the first available device var deviceInfo = (await DecklinkVideoSourceBlock.GetDevicesAsync()).FirstOrDefault(); // Create settings for the selected device DecklinkVideoSourceSettings videoSourceSettings = null; if (deviceInfo != null) { videoSourceSettings = new DecklinkVideoSourceSettings(deviceInfo); // Configure video input format and mode videoSourceSettings.Mode = DecklinkMode.HD1080i60; videoSourceSettings.Connection = DecklinkConnection.SDI; // videoSourceSettings.VideoFormat = DecklinkVideoFormat.Auto; // Often used with Mode=Auto } // Create the block with configured settings var videoSourceBlock = new DecklinkVideoSourceBlock(videoSourceSettings); ``` ### Key Video Source Settings The `DecklinkVideoSourceSettings` class includes properties like: - `DeviceNumber`: The input device instance to use. 
- `Mode`: Specifies the expected input resolution and frame rate (e.g., `DecklinkMode.HD1080i60`). Default `Unknown`. - `Connection`: Defines which physical input to use, using `DecklinkConnection` enum (e.g., `DecklinkConnection.HDMI`, `DecklinkConnection.SDI`). Default `Auto`. - `VideoFormat`: Specifies the video format type for input, using `DecklinkVideoFormat` enum. Default `Auto` (especially when `Mode` is `Auto`). - `Profile`: Specifies the Decklink profile using `DecklinkProfileID`. Default `Default`. - `DropNoSignalFrames`: If `true`, drops frames marked as having no input signal. Default `false`. - `OutputAFDBar`: If `true`, extracts and outputs AFD/Bar data as Meta. Default `false`. - `OutputCC`: If `true`, extracts and outputs Closed Captions as Meta. Default `false`. - `TimecodeFormat`: Specifies the timecode format using `DecklinkTimecodeFormat`. Default `RP188Any`. - `DisableVideoConversion`: Set to `true` to disable internal video conversion. Default `false`. ## Working with Decklink Video + Audio Source Block The `DecklinkVideoAudioSourceBlock` simplifies capturing synchronized video and audio streams from a single Decklink device. ### Device Enumeration and Configuration Device selection is managed through `DecklinkVideoSourceSettings` and `DecklinkAudioSourceSettings`. You would typically enumerate video devices using `DecklinkVideoSourceBlock.GetDevicesAsync()` and audio devices using `DecklinkAudioSourceBlock.GetDevicesAsync()`, then configure the respective settings objects for the chosen device. The `DecklinkVideoAudioSourceBlock` itself also provides `GetDevicesAsync()` which enumerates video sources. 
```csharp
// Enumerate video devices (for the video part of the combined source)
var videoDeviceInfo = (await DecklinkVideoAudioSourceBlock.GetDevicesAsync()).FirstOrDefault(); // or DecklinkVideoSourceBlock.GetDevicesAsync()

DecklinkVideoSourceSettings videoSettings = null;
DecklinkAudioSourceSettings audioSettings = null;

if (videoDeviceInfo != null)
{
    videoSettings = new DecklinkVideoSourceSettings(videoDeviceInfo);
    videoSettings.Mode = DecklinkMode.HD1080i60;
    videoSettings.Connection = DecklinkConnection.SDI;

    // Example: match the audio device to the video device by device number.
    // Enumerating inside the null check avoids dereferencing a missing video device.
    var audioDeviceInfo = (await DecklinkAudioSourceBlock.GetDevicesAsync())
        .FirstOrDefault(d => d.DeviceNumber == videoDeviceInfo.DeviceNumber);
    if (audioDeviceInfo != null)
    {
        audioSettings = new DecklinkAudioSourceSettings(audioDeviceInfo);
        audioSettings.Channels = DecklinkAudioChannels.Ch2;
    }
}

// Create the block with configured settings
if (videoSettings != null && audioSettings != null)
{
    var decklinkVideoAudioSource = new DecklinkVideoAudioSourceBlock(videoSettings, audioSettings);

    // Connect outputs
    // decklinkVideoAudioSource.VideoOutput.Connect(videoProcessor.Input);
    // decklinkVideoAudioSource.AudioOutput.Connect(audioProcessor.Input);
}
```

### Block Creation and Configuration

You instantiate `DecklinkVideoAudioSourceBlock` by providing pre-configured `DecklinkVideoSourceSettings` and `DecklinkAudioSourceSettings` objects.
```csharp
// Assuming videoSourceSettings and audioSourceSettings are configured as above
var videoAudioSource = new DecklinkVideoAudioSourceBlock(videoSourceSettings, audioSourceSettings);
```

### Connecting to the Pipeline

The block provides separate `VideoOutput` and `AudioOutput` pads:

```csharp
// Example: Connect to video and audio processors/encoders
videoAudioSource.VideoOutput.Connect(videoEncoder.Input);
videoAudioSource.AudioOutput.Connect(audioEncoder.Input);
```

## Working with Decklink Video + Audio Sink Block

The `DecklinkVideoAudioSinkBlock` simplifies sending synchronized video and audio streams to a single Decklink device.

### Device Enumeration and Configuration

Similar to the combined source, device selection is managed via `DecklinkVideoSinkSettings` and `DecklinkAudioSinkSettings`. Enumerate devices using `DecklinkVideoSinkBlock.GetDevicesAsync()` and `DecklinkAudioSinkBlock.GetDevicesAsync()`.

```csharp
var videoSinkDeviceInfo = (await DecklinkVideoSinkBlock.GetDevicesAsync()).FirstOrDefault();

DecklinkVideoSinkSettings videoSinkSettings = null;
DecklinkAudioSinkSettings audioSinkSettings = null;

if (videoSinkDeviceInfo != null)
{
    videoSinkSettings = new DecklinkVideoSinkSettings(videoSinkDeviceInfo);
    videoSinkSettings.Mode = DecklinkMode.HD1080i60;
    videoSinkSettings.VideoFormat = DecklinkVideoFormat.YUV_8bit;

    // Example: match the audio sink to the video sink by device number.
    // Enumerating inside the null check avoids dereferencing a missing video sink device.
    var audioSinkDeviceInfo = (await DecklinkAudioSinkBlock.GetDevicesAsync())
        .FirstOrDefault(d => d.DeviceNumber == videoSinkDeviceInfo.DeviceNumber);
    if (audioSinkDeviceInfo != null)
    {
        audioSinkSettings = new DecklinkAudioSinkSettings(audioSinkDeviceInfo);
    }
}

// Create the block
if (videoSinkSettings != null && audioSinkSettings != null)
{
    var decklinkVideoAudioSink = new DecklinkVideoAudioSinkBlock(videoSinkSettings, audioSinkSettings);

    // Connect inputs
    // videoEncoder.Output.Connect(decklinkVideoAudioSink.VideoInput);
    // audioEncoder.Output.Connect(decklinkVideoAudioSink.AudioInput);
}
```

### Block Creation and
Configuration Instantiate `DecklinkVideoAudioSinkBlock` with configured `DecklinkVideoSinkSettings` and `DecklinkAudioSinkSettings`. ```csharp // Assuming videoSinkSettings and audioSinkSettings are configured var videoAudioSink = new DecklinkVideoAudioSinkBlock(videoSinkSettings, audioSinkSettings); ``` ### Connecting to the Pipeline The block provides separate `VideoInput` and `AudioInput` pads: ```csharp // Example: Connect from video and audio encoders videoEncoder.Output.Connect(videoAudioSink.VideoInput); audioEncoder.Output.Connect(videoAudioSink.AudioInput); ``` ## Advanced Usage Examples ### Synchronized Audio/Video Capture **Using separate source blocks:** ```csharp // Assume videoSourceSettings and audioSourceSettings are configured for the same device/timing var videoSource = new DecklinkVideoSourceBlock(videoSourceSettings); var audioSource = new DecklinkAudioSourceBlock(audioSourceSettings); // Create an MP4 encoder var mp4Settings = new MP4SinkSettings("output.mp4"); var sink = new MP4SinkBlock(mp4Settings); // Create video encoder var videoEncoder = new H264EncoderBlock(); // Create audio encoder var audioEncoder = new AACEncoderBlock(); // Connect video and audio sources pipeline.Connect(videoSource.Output, videoEncoder.Input); pipeline.Connect(audioSource.Output, audioEncoder.Input); // Connect video encoder to sink pipeline.Connect(videoEncoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Video)); // Connect audio encoder to sink pipeline.Connect(audioEncoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Audio)); // Start the pipeline await pipeline.StartAsync(); ``` **Using `DecklinkVideoAudioSourceBlock` for simplified synchronized capture:** If you use `DecklinkVideoAudioSourceBlock` (as configured in its dedicated section), the source setup becomes: ```csharp // Assuming videoSourceSettings and audioSourceSettings are configured for the same device var videoAudioSource = new DecklinkVideoAudioSourceBlock(videoSourceSettings, 
audioSourceSettings);

// ... (encoders and sink setup as above) ...

// Connect video and audio from the combined source
pipeline.Connect(videoAudioSource.VideoOutput, videoEncoder.Input);
pipeline.Connect(videoAudioSource.AudioOutput, audioEncoder.Input);

// ... (connect encoders to sink and start pipeline as above) ...
```

This ensures that audio and video are sourced from the Decklink device in a synchronized manner by the SDK.

## Troubleshooting Tips

- **No Devices Found**: Ensure the Blackmagic drivers/SDK are installed and up to date. Check whether the device is recognized by Blackmagic Desktop Video Setup.
- **Format Mismatch**: Verify the device supports your selected video/audio mode, format, and connection type. For sources with `Mode = DecklinkMode.Unknown` (auto-detect), ensure a stable signal is present.
- **Performance Issues**: Check system resources (CPU, RAM, disk I/O). Consider lowering the resolution/framerate if issues persist.
- **Signal Detection**: For input devices, check cable connections and ensure the source device is outputting a valid signal.
- **"Unable to build ...Block" errors**: Double-check that all settings are valid for the selected device and mode. Ensure the correct `DeviceNumber` is used if multiple Decklink cards are present.

## Sample Applications

For complete working examples, refer to these sample applications:

- [Decklink Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Decklink%20Demo)

## Conclusion

The Blackmagic Decklink blocks in the VisioForge Media Blocks SDK provide a powerful and flexible way to integrate professional video and audio hardware into your .NET applications. By leveraging the specific source and sink blocks, including the combined audio/video blocks, you can efficiently implement complex capture and playback workflows. Always refer to the specific settings classes for detailed configuration options.
For additional support or questions, please refer to our [documentation](https://www.visioforge.com/documentation) or contact our support team. ---END OF PAGE--- # Local File: .\dotnet\mediablocks\Demuxers\index.md --- title: .Net Media Demuxer Blocks Guide description: Explore a complete guide to .Net Media SDK demuxer blocks. Learn about MPEG-TS, QT (MP4/MOV), and Universal demuxers for your media processing pipelines. sidebar_label: Demuxers --- # Demuxer Blocks - VisioForge Media Blocks SDK .Net [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) Demuxer blocks are essential components in media processing pipelines. They take a multimedia stream, typically from a file or network source, and separate it into its constituent elementary streams, such as video, audio, and subtitles. This allows for individual processing or rendering of each stream. VisioForge Media Blocks SDK .Net provides several demuxer blocks to handle various container formats. ## MPEG-TS Demux Block The `MPEGTSDemuxBlock` is used to demultiplex MPEG Transport Streams (MPEG-TS). MPEG-TS is a standard format for transmission and storage of audio, video, and Program and System Information Protocol (PSIP) data. It is commonly used in digital television broadcasting and streaming. ### Block info Name: `MPEGTSDemuxBlock`. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input | MPEG-TS Data | 1 | | Output video | Depends on stream content | 0 or 1+ | | Output audio | Depends on stream content | 0 or 1+ | | Output subtitle | Depends on stream content | 0 or 1+ | | Output metadata | Depends on stream content | 0 or 1+ | ### Settings The `MPEGTSDemuxBlock` is configured using `MPEGTSDemuxSettings`. Key properties of `MPEGTSDemuxSettings`: - `Latency` (`TimeSpan`): Gets or sets the latency. Default is 700 milliseconds. - `ProgramNumber` (int): Gets or sets the program number. 
Use -1 for default/automatic selection. ### The sample pipeline This example shows how to connect a source (like `HTTPSourceBlock` for a network stream or `UniversalSourceBlock` for a local file that outputs raw MPEG-TS data) to `MPEGTSDemuxBlock`, and then connect its outputs to respective renderer blocks. ```mermaid graph LR; DataSourceBlock -- MPEG-TS Data --> MPEGTSDemuxBlock; MPEGTSDemuxBlock -- Video Stream --> VideoRendererBlock; MPEGTSDemuxBlock -- Audio Stream --> AudioRendererBlock; MPEGTSDemuxBlock -- Subtitle Stream --> SubtitleOverlayOrRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Assume 'dataSourceBlock' is a source block providing MPEG-TS data // For example, a UniversalSourceBlock reading a .ts file or an HTTP source. // var dataSourceBlock = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync("input.ts")); // For this example, let's assume dataSourceBlock.Output provides the MPEG-TS stream. var mpegTSDemuxSettings = new MPEGTSDemuxSettings(); // mpegTSDemuxSettings.ProgramNumber = 1; // Optionally select a specific program // Create MPEG-TS Demuxer Block // Constructor parameters control which streams to attempt to render var mpegTSDemuxBlock = new MPEGTSDemuxBlock( renderVideo: true, renderAudio: true, renderSubtitle: true, renderMetadata: false); // Connect the data source to the demuxer's input // pipeline.Connect(dataSourceBlock.Output, mpegTSDemuxBlock.Input); // Assuming dataSourceBlock is defined // Create renderers var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 is your display control var audioRenderer = new AudioRendererBlock(); // var subtitleRenderer = ... 
; // A block to handle subtitle rendering or overlay

// Connect demuxer outputs
if (mpegTSDemuxBlock.VideoOutput != null)
{
    pipeline.Connect(mpegTSDemuxBlock.VideoOutput, videoRenderer.Input);
}

if (mpegTSDemuxBlock.AudioOutput != null)
{
    pipeline.Connect(mpegTSDemuxBlock.AudioOutput, audioRenderer.Input);
}

if (mpegTSDemuxBlock.SubtitleOutput != null)
{
    // pipeline.Connect(mpegTSDemuxBlock.SubtitleOutput, subtitleRenderer.Input); // Connect to a subtitle handler
}

// Start pipeline
// await pipeline.StartAsync(); // Start once dataSourceBlock is connected
```

### Remarks

- Ensure that the input to `MPEGTSDemuxBlock` is raw MPEG-TS data. If you are using a `UniversalSourceBlock` with a `.ts` file, it might already demultiplex the stream. In such cases, `MPEGTSDemuxBlock` might be used if `UniversalSourceBlock` is configured to output the raw container stream, or if the stream comes from a source like `SRTRAWSourceBlock`.
- The availability of video, audio, or subtitle outputs depends on the content of the MPEG-TS stream.

### Platforms

Windows, macOS, Linux, iOS, Android.

## QT Demux Block (MP4/MOV)

The `QTDemuxBlock` is designed to demultiplex QuickTime (QT) container formats, which include MP4 and MOV files. These formats are widely used for storing video, audio, and other multimedia content.

### Block info

Name: `QTDemuxBlock`.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | MP4/MOV Data | 1 |
| Output video | Depends on stream content | 0 or 1+ |
| Output audio | Depends on stream content | 0 or 1+ |
| Output subtitle | Depends on stream content | 0 or 1+ |
| Output metadata | Depends on stream content | 0 or 1+ |

### Settings

The `QTDemuxBlock` does not have a dedicated settings class beyond the implicit configuration through its constructor parameters (`renderVideo`, `renderAudio`, etc.). The underlying GStreamer element `qtdemux` handles the demultiplexing automatically.
### The sample pipeline This example shows how to connect a source block that outputs raw MP4/MOV data to `QTDemuxBlock`, and then connect its outputs to respective renderer blocks. ```mermaid graph LR; DataSourceBlock -- MP4/MOV Data --> QTDemuxBlock; QTDemuxBlock -- Video Stream --> VideoRendererBlock; QTDemuxBlock -- Audio Stream --> AudioRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Assume 'dataSourceBlock' is a source block providing MP4/MOV data. // This could be a StreamSourceBlock feeding raw MP4 data, or a custom source. // For typical file playback, UniversalSourceBlock directly provides decoded streams. // QTDemuxBlock is used when you have the container data and need to demux it within the pipeline. // Example: var fileStream = File.OpenRead("myvideo.mp4"); // var streamSource = new StreamSourceBlock(fileStream); // StreamSourceBlock provides raw data // Create QT Demuxer Block // Constructor parameters control which streams to attempt to render var qtDemuxBlock = new QTDemuxBlock( renderVideo: true, renderAudio: true, renderSubtitle: false, renderMetadata: false); // Connect the data source to the demuxer's input // pipeline.Connect(streamSource.Output, qtDemuxBlock.Input); // Assuming streamSource is defined // Create renderers var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 var audioRenderer = new AudioRendererBlock(); // Connect demuxer outputs if (qtDemuxBlock.VideoOutput != null) { pipeline.Connect(qtDemuxBlock.VideoOutput, videoRenderer.Input); } if (qtDemuxBlock.AudioOutput != null) { pipeline.Connect(qtDemuxBlock.AudioOutput, audioRenderer.Input); } // Start pipeline // await pipeline.StartAsync(); // Start once dataSourceBlock is connected and pipeline is built ``` ### Remarks - `QTDemuxBlock` is typically used when you have a stream of MP4/MOV container data that needs to be demultiplexed within the pipeline (e.g., from a `StreamSourceBlock` or a custom data 
source). - For playing local MP4/MOV files, `UniversalSourceBlock` is often more convenient as it handles both demuxing and decoding. - The availability of outputs depends on the actual streams present in the MP4/MOV file. ### Platforms Windows, macOS, Linux, iOS, Android. ## Universal Demux Block The `UniversalDemuxBlock` provides a flexible way to demultiplex various media container formats based on provided settings or inferred from the input stream. It can handle formats like AVI, MKV, MP4, MPEG-TS, FLV, OGG, and WebM. This block requires `MediaFileInfo` to be provided for proper initialization of its output pads, as the number and type of streams can vary greatly between files. ### Block info Name: `UniversalDemuxBlock`. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input | Various Container Data | 1 | | Output video | Depends on stream content and `renderVideo` flag | 0 to N | | Output audio | Depends on stream content and `renderAudio` flag | 0 to N | | Output subtitle | Depends on stream content and `renderSubtitle` flag | 0 to N | | Output metadata | Depends on stream content and `renderMetadata` flag | 0 or 1 | (N is the number of respective streams in the media file) ### Settings The `UniversalDemuxBlock` is configured using an implementation of `IUniversalDemuxSettings`. The specific settings class depends on the container format you intend to demultiplex. - `UniversalDemuxerType` (enum): Specifies the type of demuxer to use. Can be `Auto`, `MKV`, `MP4`, `AVI`, `MPEGTS`, `MPEGPS`, `FLV`, `OGG`, `WebM`. 
- Based on the `UniversalDemuxerType`, you would create a corresponding settings object: - `AVIDemuxSettings` - `FLVDemuxSettings` - `MKVDemuxSettings` - `MP4DemuxSettings` - `MPEGPSDemuxSettings` - `MPEGTSDemuxSettings` (includes `Latency` and `ProgramNumber` properties) - `OGGDemuxSettings` - `WebMDemuxSettings` - `UniversalDemuxSettings` (for `Auto` type) The `UniversalDemuxerTypeHelper.CreateSettings(UniversalDemuxerType type)` method can be used to create the appropriate settings object. ### Constructor `UniversalDemuxBlock(IUniversalDemuxSettings settings, MediaFileInfo info, bool renderVideo = true, bool renderAudio = true, bool renderSubtitle = false, bool renderMetadata = false)` `UniversalDemuxBlock(MediaFileInfo info, bool renderVideo = true, bool renderAudio = true, bool renderSubtitle = false, bool renderMetadata = false)` (uses `UniversalDemuxSettings` for auto type detection) **Crucially, `MediaFileInfo` must be provided to the constructor.** This object, typically obtained by analyzing the media file beforehand (e.g., using `MediaInfoReader`), informs the block about the number and types of streams, allowing it to create the correct number of output pads. ### The sample pipeline This example demonstrates using `UniversalDemuxBlock` to demultiplex a file. Note that a data source block providing the raw file data to the `UniversalDemuxBlock` is implied. ```mermaid graph LR; DataSourceBlock -- Container Data --> UniversalDemuxBlock; UniversalDemuxBlock -- Video Stream 1 --> VideoRendererBlock1; UniversalDemuxBlock -- Audio Stream 1 --> AudioRendererBlock1; UniversalDemuxBlock -- Subtitle Stream 1 --> SubtitleHandler1; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // 1. 
Obtain MediaFileInfo for your media file var mediaInfoReader = new MediaInfoReader(Context); // Assuming Context is your logging context MediaFileInfo mediaInfo = await mediaInfoReader.GetInfoAsync("path/to/your/video.mkv"); if (mediaInfo == null) { Console.WriteLine("Failed to get media info."); return; } // 2. Choose or create Demuxer Settings // Example: Auto-detect demuxer type IUniversalDemuxSettings demuxSettings = new UniversalDemuxSettings(); // Or, specify a type, e.g., for an MKV file: // IUniversalDemuxSettings demuxSettings = new MKVDemuxSettings(); // Or, for MPEG-TS with specific program: // var mpegTsSettings = new MPEGTSDemuxSettings { ProgramNumber = 1 }; // IUniversalDemuxSettings demuxSettings = mpegTsSettings; // 3. Create UniversalDemuxBlock var universalDemuxBlock = new UniversalDemuxBlock( demuxSettings, mediaInfo, renderVideo: true, // Process video streams renderAudio: true, // Process audio streams renderSubtitle: true // Process subtitle streams ); // 4. Connect a data source that provides the raw file stream to UniversalDemuxBlock's input. // This step is crucial and depends on how you get the file data. // For instance, using a FileSource configured to output raw data, or a StreamSourceBlock. // Example with a hypothetical RawFileSourceBlock (not a standard block, for illustration): // var rawFileSource = new RawFileSourceBlock("path/to/your/video.mkv"); // pipeline.Connect(rawFileSource.Output, universalDemuxBlock.Input); // 5. 
Connect outputs // Video outputs (MediaBlockPad[]) var videoOutputs = universalDemuxBlock.VideoOutputs; if (videoOutputs.Length > 0) { // Example: connect the first video stream var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 pipeline.Connect(videoOutputs[0], videoRenderer.Input); } // Audio outputs (MediaBlockPad[]) var audioOutputs = universalDemuxBlock.AudioOutputs; if (audioOutputs.Length > 0) { // Example: connect the first audio stream var audioRenderer = new AudioRendererBlock(); pipeline.Connect(audioOutputs[0], audioRenderer.Input); } // Subtitle outputs (MediaBlockPad[]) var subtitleOutputs = universalDemuxBlock.SubtitleOutputs; if (subtitleOutputs.Length > 0) { // Example: connect the first subtitle stream to a conceptual handler // var subtitleHandler = new MySubtitleHandlerBlock(); // pipeline.Connect(subtitleOutputs[0], subtitleHandler.Input); } // Metadata output (if renderMetadata was true and metadata stream exists) var metadataOutputs = universalDemuxBlock.MetadataOutputs; if (metadataOutputs.Length > 0 && metadataOutputs[0] != null) { // Handle metadata stream } // Start pipeline after all connections are made // await pipeline.StartAsync(); ``` ### Remarks - **`MediaFileInfo` is mandatory** for `UniversalDemuxBlock` to correctly initialize its output pads based on the streams present in the file. - The `renderVideo`, `renderAudio`, and `renderSubtitle` flags in the constructor determine if outputs for these stream types will be created and processed. If set to `false`, respective streams will be ignored (or sent to internal null renderers if present in the file but not rendered). - The `UniversalDemuxBlock` is powerful for scenarios where you need to explicitly manage the demuxing process for various formats or select specific streams from files with multiple tracks. 
- For simple playback of common file formats, `UniversalSourceBlock` often provides a more straightforward solution as it integrates demuxing and decoding. `UniversalDemuxBlock` offers more granular control. ### Platforms Windows, macOS, Linux, iOS, Android. (Platform support for specific formats may depend on underlying GStreamer plugins.) ---END OF PAGE--- # Local File: .\dotnet\mediablocks\GettingStarted\camera.md --- title: Creating Camera Applications with Media Blocks SDK description: Learn how to build powerful camera viewing applications with Media Blocks SDK .Net. This step-by-step tutorial covers device enumeration, format selection, camera configuration, pipeline creation, and video rendering for desktop and mobile platforms. sidebar_label: Camera Applications --- # Building Camera Applications with Media Blocks SDK [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Introduction This comprehensive guide demonstrates how to create a fully functional camera viewing application using the Media Blocks SDK .Net. The SDK provides a robust framework for capturing, processing, and displaying video streams across multiple platforms including Windows, macOS, iOS, and Android. ## Architecture Overview To create a camera viewer application, you'll need to understand two fundamental components: 1. **System Video Source** - Captures the video stream from connected camera devices 2. **Video Renderer** - Displays the captured video on screen with configurable settings These components work together within a pipeline architecture that manages media processing. 
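Before diving into the details, the overall shape of such a pipeline can be sketched in a few lines. This is only a sketch: it assumes a `videoSourceSettings` object and a `VideoView1` UI control are already prepared, which the following sections cover step by step.

```csharp
// Minimal camera-preview pipeline sketch (assumes videoSourceSettings and
// VideoView1 are already set up; see the sections below for the full setup).
var pipeline = new MediaBlocksPipeline();

var videoSource = new SystemVideoSourceBlock(videoSourceSettings);
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);

// Source output feeds directly into the renderer input
pipeline.Connect(videoSource.Output, videoRenderer.Input);

await pipeline.StartAsync();
```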
## Essential Media Blocks To build a camera application, you need to add the following blocks to your pipeline: - **[System Video Source Block](../Sources/index.md)** - Connects to and reads from camera devices - **[Video Renderer Block](../VideoRendering/index.md)** - Displays the video with configurable rendering options ## Setting Up the Pipeline ### Creating the Base Pipeline First, create a pipeline object that will manage the media flow: ```csharp using VisioForge.Core.MediaBlocks; // Initialize the pipeline var pipeline = new MediaBlocksPipeline(); // Add error handling pipeline.OnError += (sender, args) => { Console.WriteLine($"Pipeline error: {args.Message}"); }; ``` ### Camera Device Enumeration Before adding a camera source, you need to enumerate the available devices and select one: ```csharp // Get all available video devices asynchronously var videoDevices = await DeviceEnumerator.Shared.VideoSourcesAsync(); // Display available devices (useful for user selection) foreach (var device in videoDevices) { Console.WriteLine($"Device: {device.Name} [{device.API}]"); } // Select the first available device var selectedDevice = videoDevices[0]; ``` ### Camera Format Selection Each camera supports different resolutions and frame rates. You can enumerate and select the optimal format: ```csharp // Display available formats for the selected device foreach (var format in selectedDevice.VideoFormats) { Console.WriteLine($"Format: {format.Width}x{format.Height} {format.Format}"); // Display available frame rates for this format foreach (var frameRate in format.FrameRateList) { Console.WriteLine($" Frame Rate: {frameRate}"); } } // Select the optimal format (in this example, we look for HD resolution) var hdFormat = selectedDevice.GetHDVideoFormatAndFrameRate(out var frameRate); var formatToUse = hdFormat ?? 
selectedDevice.VideoFormats[0]; ``` ## Configuring Camera Settings ### Creating Source Settings Configure the camera source settings with your selected device and format: ```csharp // Create camera settings with the selected device and format var videoSourceSettings = new VideoCaptureDeviceSourceSettings(selectedDevice) { Format = formatToUse.ToFormat() }; // Set the desired frame rate (selecting the highest available) if (formatToUse.FrameRateList.Count > 0) { videoSourceSettings.Format.FrameRate = formatToUse.FrameRateList.Max(); } // Optional: Enable force frame rate to maintain consistent timing videoSourceSettings.Format.ForceFrameRate = true; // Platform-specific settings #if __ANDROID__ // Android-specific settings videoSourceSettings.VideoStabilization = true; #elif __IOS__ && !__MACCATALYST__ // iOS-specific settings videoSourceSettings.Position = IOSVideoSourcePosition.Back; videoSourceSettings.Orientation = IOSVideoSourceOrientation.Portrait; #endif ``` ### Creating the Video Source Block Now create the system video source block with your configured settings: ```csharp // Create the video source block var videoSource = new SystemVideoSourceBlock(videoSourceSettings); ``` ## Setting Up Video Display ### Creating the Video Renderer Add a video renderer to display the captured video: ```csharp // Create the video renderer and connect it to your UI component var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Optional: Configure renderer settings videoRenderer.Settings.IsSync = true; ``` ### Advanced Renderer Configuration For more control over video rendering, you can customize renderer settings: ```csharp // Enable snapshot capabilities videoRenderer.Settings.EnableSnapshot = true; // Configure subtitle overlay if needed videoRenderer.SubtitleEnabled = false; ``` ## Connecting the Pipeline Connect the video source to the renderer to establish the media flow: ```csharp // Connect the output of the video source to the input of the renderer 
pipeline.Connect(videoSource.Output, videoRenderer.Input); ``` ## Managing the Pipeline Lifecycle ### Starting the Pipeline Start the pipeline to begin capturing and displaying video: ```csharp // Start the pipeline asynchronously await pipeline.StartAsync(); ``` ### Taking Snapshots Capture still images from the video stream: ```csharp // Take a snapshot and save it as a JPEG file await videoRenderer.Snapshot_SaveAsync("camera_snapshot.jpg", SkiaSharp.SKEncodedImageFormat.Jpeg, 90); // Or get the snapshot as a bitmap for further processing var bitmap = await videoRenderer.Snapshot_GetAsync(); ``` ### Stopping the Pipeline When finished, properly stop the pipeline: ```csharp // Stop the pipeline asynchronously await pipeline.StopAsync(); ``` ## Platform-Specific Considerations The Media Blocks SDK supports cross-platform development with specific optimizations: - **Windows**: Supports both Media Foundation and Kernel Streaming APIs - **macOS/iOS**: Utilizes AVFoundation for optimal performance - **Android**: Provides access to camera features like stabilization and orientation ## Error Handling and Troubleshooting Implement proper error handling to ensure a stable application: ```csharp try { // Pipeline operations await pipeline.StartAsync(); } catch (Exception ex) { Console.WriteLine($"Error starting pipeline: {ex.Message}"); // Handle the exception appropriately } ``` ## Complete Implementation Example This example demonstrates a complete camera viewer implementation: ```csharp using System; using System.Linq; using System.Threading.Tasks; using VisioForge.Core.MediaBlocks; using VisioForge.Core.MediaBlocks.Sources; using VisioForge.Core.MediaBlocks.VideoRendering; using VisioForge.Core.Types.X.Sources; public class CameraViewerExample { private MediaBlocksPipeline _pipeline; private SystemVideoSourceBlock _videoSource; private VideoRendererBlock _videoRenderer; public async Task InitializeAsync(IVideoView videoView) { // Create pipeline _pipeline = new 
MediaBlocksPipeline();
        _pipeline.OnError += (s, e) => Console.WriteLine(e.Message);

        // Enumerate devices
        var devices = await DeviceEnumerator.Shared.VideoSourcesAsync();
        if (devices.Length == 0)
        {
            throw new Exception("No camera devices found");
        }

        // Select device and format
        var device = devices[0];
        var format = device.GetHDOrAnyVideoFormatAndFrameRate(out var frameRate);

        // Create settings
        var settings = new VideoCaptureDeviceSourceSettings(device);
        if (format != null)
        {
            settings.Format = format.ToFormat();
            if (frameRate != null && !frameRate.IsEmpty)
            {
                settings.Format.FrameRate = frameRate;
            }
        }

        // Create blocks
        _videoSource = new SystemVideoSourceBlock(settings);
        _videoRenderer = new VideoRendererBlock(_pipeline, videoView);

        // Build pipeline
        _pipeline.AddBlock(_videoSource);
        _pipeline.AddBlock(_videoRenderer);
        _pipeline.Connect(_videoSource.Output, _videoRenderer.Input);

        // Start pipeline
        await _pipeline.StartAsync();
    }

    public async Task StopAsync()
    {
        if (_pipeline != null)
        {
            await _pipeline.StopAsync();
            _pipeline.Dispose();
        }
    }

    public async Task TakeSnapshotAsync(string filename)
    {
        await _videoRenderer.Snapshot_SaveAsync(filename, SkiaSharp.SKEncodedImageFormat.Jpeg, 90);
    }
}
```

## Conclusion

With Media Blocks SDK .Net, building powerful camera applications becomes straightforward. The component-based architecture provides flexibility and performance across platforms while abstracting the complexities of camera device integration.

For complete source code examples, please visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Simple%20Capture%20Demo).
---END OF PAGE--- # Local File: .\dotnet\mediablocks\GettingStarted\device-enum.md --- title: Complete Guide to Media Device Enumeration in .NET description: Learn how to efficiently enumerate video cameras, audio inputs/outputs, Blackmagic Decklink devices, NDI sources, and GenICam/GigE Vision cameras in your .NET applications using the Media Blocks SDK. This tutorial provides practical code examples for device discovery and integration. sidebar_label: Device Enumeration order: 0 --- # Complete Guide to Media Device Enumeration in .NET [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) The Media Blocks SDK provides a powerful and efficient way to discover and work with various media devices in your .NET applications. This guide will walk you through the process of enumerating different types of media devices using the SDK's `DeviceEnumerator` class. ## Introduction to Device Enumeration Device enumeration is a critical first step when developing applications that interact with media hardware. The `DeviceEnumerator` class provides a centralized way to detect and list all available media devices connected to your system. The SDK uses a singleton pattern for device enumeration, making it easy to access the functionality from anywhere in your code: ```csharp // Access the shared DeviceEnumerator instance var enumerator = DeviceEnumerator.Shared; ``` ## Discovering Video Input Devices ### Standard Video Sources To list all available video input devices (webcams, capture cards, virtual cameras): ```csharp var videoSources = await DeviceEnumerator.Shared.VideoSourcesAsync(); foreach (var device in videoSources) { Debug.WriteLine($"Video device found: {device.Name}"); // You can access additional properties here if needed } ``` The `VideoCaptureDeviceInfo` objects returned provide detailed information about each device, including device name, internal identifiers, and API type. 
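Because the enumeration returns plain `VideoCaptureDeviceInfo` objects, you can use standard LINQ to select a device by name rather than always taking the first one. A small sketch — the `"USB"` substring is only a hypothetical selection criterion:

```csharp
using System.Linq;

var videoSources = await DeviceEnumerator.Shared.VideoSourcesAsync();

// Prefer a device whose name contains "USB" (hypothetical criterion);
// fall back to the first enumerated device if nothing matches.
var selected = videoSources.FirstOrDefault(d => d.Name.Contains("USB"))
               ?? videoSources.FirstOrDefault();

if (selected == null)
{
    throw new InvalidOperationException("No video capture devices found.");
}

Debug.WriteLine($"Using video device: {selected.Name} [{selected.API}]");
```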
## Working with Audio Devices ### Enumerating Audio Input Sources To discover microphones and other audio input devices: ```csharp var audioSources = await DeviceEnumerator.Shared.AudioSourcesAsync(); foreach (var device in audioSources) { Debug.WriteLine($"Audio input device found: {device.Name}"); // Additional device information can be accessed here } ``` You can also filter audio devices by their API type: ```csharp // Get only audio sources for a specific API var audioSources = await DeviceEnumerator.Shared.AudioSourcesAsync(AudioCaptureDeviceAPI.DirectSound); ``` ### Finding Audio Output Devices For speakers, headphones, and other audio output destinations: ```csharp var audioOutputs = await DeviceEnumerator.Shared.AudioOutputsAsync(); foreach (var device in audioOutputs) { Debug.WriteLine($"Audio output device found: {device.Name}"); // Process device information as needed } ``` Similar to audio sources, you can filter outputs by API: ```csharp // Get only audio outputs for a specific API var audioOutputs = await DeviceEnumerator.Shared.AudioOutputsAsync(AudioOutputDeviceAPI.DirectSound); ``` ## Professional Blackmagic Decklink Integration ### Decklink Video Input Sources For professional video workflows using Blackmagic hardware: ```csharp var decklinkVideoSources = await DeviceEnumerator.Shared.DecklinkVideoSourcesAsync(); foreach (var device in decklinkVideoSources) { Debug.WriteLine($"Decklink video input: {device.Name}"); // You can work with specific Decklink properties here } ``` ### Decklink Audio Input Sources To access audio channels from Decklink devices: ```csharp var decklinkAudioSources = await DeviceEnumerator.Shared.DecklinkAudioSourcesAsync(); foreach (var device in decklinkAudioSources) { Debug.WriteLine($"Decklink audio input: {device.Name}"); // Process Decklink audio device information } ``` ### Decklink Video Output Destinations For sending video to Decklink output devices: ```csharp var decklinkVideoOutputs = await 
DeviceEnumerator.Shared.DecklinkVideoSinksAsync(); foreach (var device in decklinkVideoOutputs) { Debug.WriteLine($"Decklink video output: {device.Name}"); // Access output device properties as needed } ``` ### Decklink Audio Output Destinations For routing audio to Decklink hardware outputs: ```csharp var decklinkAudioOutputs = await DeviceEnumerator.Shared.DecklinkAudioSinksAsync(); foreach (var device in decklinkAudioOutputs) { Debug.WriteLine($"Decklink audio output: {device.Name}"); // Work with audio output configuration here } ``` ## Network Device Integration ### NDI Sources Discovery To find NDI sources available on your network: ```csharp var ndiSources = await DeviceEnumerator.Shared.NDISourcesAsync(); foreach (var device in ndiSources) { Debug.WriteLine($"NDI source discovered: {device.Name}"); // Process NDI-specific properties and information } ``` ### ONVIF Network Camera Discovery To find IP cameras supporting the ONVIF protocol: ```csharp // Set a timeout for discovery (2 seconds in this example) var timeout = TimeSpan.FromSeconds(2); var onvifDevices = await DeviceEnumerator.Shared.ONVIF_ListSourcesAsync(timeout, null); foreach (var deviceUri in onvifDevices) { Debug.WriteLine($"ONVIF camera found at: {deviceUri}"); // Connect to the camera using the discovered URI } ``` ## Industrial Camera Support ### Basler Industrial Cameras For applications requiring Basler industrial cameras: ```csharp var baslerCameras = await DeviceEnumerator.Shared.BaslerSourcesAsync(); foreach (var device in baslerCameras) { Debug.WriteLine($"Basler camera detected: {device.Name}"); // Access Basler-specific camera features } ``` ### Allied Vision Industrial Cameras To work with Allied Vision cameras in your application: ```csharp var alliedCameras = await DeviceEnumerator.Shared.AlliedVisionSourcesAsync(); foreach (var device in alliedCameras) { Debug.WriteLine($"Allied Vision camera found: {device.Name}"); // Configure Allied Vision specific parameters } ``` ### 
Spinnaker SDK Compatible Cameras For cameras supporting the Spinnaker SDK (Windows only): ```csharp #if NET_WINDOWS var spinnakerCameras = await DeviceEnumerator.Shared.SpinnakerSourcesAsync(); foreach (var device in spinnakerCameras) { Debug.WriteLine($"Spinnaker SDK camera: {device.Name}"); Debug.WriteLine($"Model: {device.Model}, Vendor: {device.Vendor}"); Debug.WriteLine($"Resolution: {device.WidthMax}x{device.HeightMax}"); // Work with camera-specific properties } #endif ``` ### Generic GenICam Standard Cameras For other industrial cameras supporting the GenICam standard: ```csharp var genicamCameras = await DeviceEnumerator.Shared.GenICamSourcesAsync(); foreach (var device in genicamCameras) { Debug.WriteLine($"GenICam compatible device: {device.Name}"); Debug.WriteLine($"Model: {device.Model}, Vendor: {device.Vendor}"); Debug.WriteLine($"Protocol: {device.Protocol}, Serial: {device.SerialNumber}"); // Work with standard GenICam features } ``` ## Device Monitoring The SDK also supports monitoring device connections and disconnections: ```csharp // Start monitoring for video device changes await DeviceEnumerator.Shared.StartVideoSourceMonitorAsync(); // Start monitoring for audio device changes await DeviceEnumerator.Shared.StartAudioSourceMonitorAsync(); await DeviceEnumerator.Shared.StartAudioSinkMonitorAsync(); // Subscribe to device change events DeviceEnumerator.Shared.OnVideoSourceAdded += (sender, device) => { Debug.WriteLine($"New video device connected: {device.Name}"); }; DeviceEnumerator.Shared.OnVideoSourceRemoved += (sender, device) => { Debug.WriteLine($"Video device disconnected: {device.Name}"); }; ``` ## Platform-Specific Considerations ### Windows On Windows, the SDK can detect USB device connection and removal events at the system level: ```csharp #if NET_WINDOWS // Subscribe to system-wide device events DeviceEnumerator.Shared.OnDeviceAdded += (sender, args) => { // Refresh device lists when new hardware is connected RefreshDeviceLists(); 
}; DeviceEnumerator.Shared.OnDeviceRemoved += (sender, args) => { // Update UI when hardware is disconnected RefreshDeviceLists(); }; #endif ``` By default, Media Foundation device enumeration is disabled to avoid duplication with DirectShow devices. You can enable it if needed: ```csharp #if NET_WINDOWS // Enable Media Foundation device enumeration if required DeviceEnumerator.Shared.IsEnumerateMediaFoundationDevices = true; #endif ``` ### iOS and Android On mobile platforms, the SDK handles the required permission requests when enumerating devices: ```csharp #if __IOS__ || __ANDROID__ // This will automatically request camera permissions if needed var videoSources = await DeviceEnumerator.Shared.VideoSourcesAsync(); // This will automatically request microphone permissions if needed var audioSources = await DeviceEnumerator.Shared.AudioSourcesAsync(); #endif ``` ## Best Practices for Device Enumeration When working with device enumeration in production applications: 1. Always handle cases where no devices are found 2. Consider caching device lists when appropriate to improve performance 3. Implement proper exception handling for device access failures 4. Provide clear user feedback when required devices are missing 5. Use the async methods to avoid blocking the UI thread during enumeration 6. Clean up resources by calling `Dispose()` when you're done with the DeviceEnumerator ```csharp // Proper cleanup when done DeviceEnumerator.Shared.Dispose(); ``` ---END OF PAGE--- # Local File: .\dotnet\mediablocks\GettingStarted\index.md --- title: Media Blocks SDK .Net - Developer Quick Start Guide description: Learn to integrate Media Blocks SDK .Net into your applications with our detailed tutorial. From installation to implementation, discover how to create powerful multimedia pipelines, process video streams, and build robust media applications. 
sidebar_label: Getting Started order: 20 --- # Media Blocks SDK .Net - Developer Quick Start Guide [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Introduction This guide provides a comprehensive walkthrough for integrating the Media Blocks SDK .Net into your applications. The SDK is built around a modular pipeline architecture, enabling you to create, connect, and manage multimedia processing blocks for video, audio, and more. Whether you're building video processing tools, streaming solutions, or multimedia applications, this guide will help you get started quickly and correctly. ## SDK Installation Process The SDK is distributed as a NuGet package for easy integration into your .Net projects. Install it using: ```bash dotnet add package VisioForge.DotNet.MediaBlocks ``` For platform-specific requirements and additional installation details, refer to the [detailed installation guide](../../install/index.md). ## Core Concepts and Architecture ### MediaBlocksPipeline - The central class for managing the flow of media data between processing blocks. - Handles block addition, connection, state management, and event handling. - Implements `IMediaBlocksPipeline` and exposes events such as `OnError`, `OnStart`, `OnPause`, `OnResume`, `OnStop`, and `OnLoop`. ### MediaBlock and Interfaces - Each processing unit is a `MediaBlock` (or a derived class), implementing the `IMediaBlock` interface. - Key interfaces: - `IMediaBlock`: Base interface for all blocks. Defines properties for `Name`, `Type`, `Input`, `Inputs`, `Output`, `Outputs`, and methods for pipeline context and YAML export. - `IMediaBlockDynamicInputs`: For blocks that support dynamic input creation (e.g., mixers). - `IMediaBlockInternals`/`IMediaBlockInternals2`: For internal pipeline management, building, and post-connection logic. 
- `IMediaBlockRenderer`: For blocks that render media (e.g., video/audio renderers), with a property to control stream synchronization. - `IMediaBlockSink`/`IMediaBlockSource`: For blocks that act as sinks (outputs) or sources (inputs). - `IMediaBlockSettings`: For settings objects that can create blocks. ### Pads and Media Types - Blocks are connected via `MediaBlockPad` objects, which have a direction (`In`/`Out`) and a media type (`Video`, `Audio`, `Subtitle`, `Metadata`, `Auto`). - Pads can be connected/disconnected, and their state can be queried. ### Block Types - The SDK provides a wide range of built-in block types (see `MediaBlockType` enum in the source code) for sources, sinks, renderers, effects, and more. ## Creating and Managing a Pipeline ### 1. Initialize the SDK (if required) ```csharp using VisioForge.Core; // Initialize the SDK at application startup VisioForgeX.InitSDK(); ``` ### 2. Create a Pipeline and Blocks ```csharp using VisioForge.Core.MediaBlocks; // Create a new pipeline instance var pipeline = new MediaBlocksPipeline(); // Example: Create a virtual video source and a video renderer var virtualSource = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings()); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // VideoView1 is your UI control // Add blocks to the pipeline pipeline.AddBlock(virtualSource); pipeline.AddBlock(videoRenderer); ``` ### 3. Connect Blocks ```csharp // Connect the output of the source to the input of the renderer pipeline.Connect(virtualSource.Output, videoRenderer.Input); ``` - You can also use `pipeline.Connect(sourceBlock, targetBlock)` to connect default pads, or connect multiple pads for complex graphs. - For blocks supporting dynamic inputs, use the `IMediaBlockDynamicInputs` interface. ### 4. Start and Stop the Pipeline ```csharp // Start the pipeline asynchronously await pipeline.StartAsync(); // ... later, stop processing await pipeline.StopAsync(); ``` ### 5. 
Resource Cleanup ```csharp // Dispose of the pipeline when done pipeline.Dispose(); ``` ### 6. SDK Cleanup (if required) ```csharp // Release all SDK resources at application shutdown VisioForgeX.DestroySDK(); ``` ## Error Handling and Events - Subscribe to pipeline events for robust error and state management: ```csharp pipeline.OnError += (sender, args) => { Console.WriteLine($"Pipeline error: {args.Message}"); // Implement your error handling logic here }; pipeline.OnStart += (sender, args) => { Console.WriteLine("Pipeline started"); }; pipeline.OnStop += (sender, args) => { Console.WriteLine("Pipeline stopped"); }; ``` ## Advanced Features - **Dynamic Block Addition/Removal:** You can add or remove blocks at runtime as needed. - **Pad Management:** Use `MediaBlockPad` methods to query and manage pad connections. - **Hardware/Software Decoder Selection:** Use helper methods in `MediaBlocksPipeline` for hardware acceleration. - **Segment Playback:** Set `StartPosition` and `StopPosition` properties for partial playback. - **Debugging:** Export pipeline graphs for debugging using provided methods. ## Example: Minimal Pipeline Setup ```csharp using VisioForge.Core.MediaBlocks; var pipeline = new MediaBlocksPipeline(); var source = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings()); var renderer = new VideoRendererBlock(pipeline, videoViewControl); pipeline.AddBlock(source); pipeline.AddBlock(renderer); pipeline.Connect(source.Output, renderer.Input); await pipeline.StartAsync(); // ... await pipeline.StopAsync(); pipeline.Dispose(); ``` ## Reference: Key Interfaces - `IMediaBlock`: Base interface for all blocks. - `IMediaBlockDynamicInputs`: For blocks with dynamic input support. - `IMediaBlockInternals`, `IMediaBlockInternals2`: For internal pipeline logic. - `IMediaBlockRenderer`: For renderer blocks. - `IMediaBlockSink`, `IMediaBlockSource`: For sink/source blocks. - `IMediaBlockSettings`: For block settings objects. 
- `IMediaBlocksPipeline`: Main pipeline interface. - `MediaBlockPad`, `MediaBlockPadDirection`, `MediaBlockPadMediaType`: For pad management. ## Further Reading and Samples - [Complete Pipeline Implementation](pipeline.md) - [Media Player Development Guide](player.md) - [Camera Viewer Application Tutorial](camera.md) - [GitHub repository with code samples](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK) For a full list of block types and advanced usage, consult the SDK API reference and source code. ---END OF PAGE--- # Local File: .\dotnet\mediablocks\GettingStarted\pipeline.md --- title: Media Blocks Pipeline Core for Media Processing description: Discover how to efficiently utilize the Media Blocks Pipeline to create powerful media applications for video playback, recording, and streaming. Learn essential pipeline operations including creation, block connections, error handling, and proper resource management. sidebar_label: Pipeline Core Usage order: 0 --- # Media Blocks Pipeline: Core Functionality [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Overview of Pipeline and Block Structure The Media Blocks SDK is built around the `MediaBlocksPipeline` class, which manages a collection of modular processing blocks. Each block implements the `IMediaBlock` interface or one of its specialized variants. Blocks are connected via input and output pads, allowing for flexible media processing chains. ### Main Block Interfaces - **IMediaBlock**: Base interface for all blocks. Exposes properties for name, type, input/output pads, and methods for YAML conversion and pipeline context retrieval. - **IMediaBlockDynamicInputs**: For blocks (like muxers) that can create new inputs dynamically. Methods: `CreateNewInput(mediaType)` and `GetInput(mediaType)`. 
- **IMediaBlockInternals**: Internal methods for pipeline integration (e.g., `SetContext`, `Build`, `CleanUp`, `GetElement`, `GetCore`). - **IMediaBlockInternals2**: For post-connection logic (`PostConnect()`). - **IMediaBlockRenderer**: For renderer blocks, exposes `IsSync` property. - **IMediaBlockSettings**: For settings/configuration objects that can create a block (`CreateBlock()`). - **IMediaBlockSink**: For sink blocks, exposes filename/URL getter/setter. - **IMediaBlockSource**: For source blocks (currently only commented-out pad accessors). ### Pads and Media Types - **MediaBlockPad**: Represents a connection point (input/output) on a block. Has direction (`In`/`Out`), media type (`Video`, `Audio`, `Subtitle`, `Metadata`, `Auto`), and connection logic. - **Pad connection**: Use `pipeline.Connect(outputPad, inputPad)` or `pipeline.Connect(block1.Output, block2.Input)`. For dynamic inputs, use `CreateNewInput()` on the sink block. ## Setting Up Your Pipeline Environment ### Creating a New Pipeline Instance The first step in working with Media Blocks is instantiating a pipeline object: ```csharp using VisioForge.Core.MediaBlocks; // Create a standard pipeline instance var pipeline = new MediaBlocksPipeline(); // Optionally, you can assign a name to your pipeline for easier identification pipeline.Name = "MainVideoPlayer"; ``` ### Implementing Robust Error Handling Media applications must handle various error scenarios that may occur during operation. 
Implementing proper error handling ensures your application remains stable: ```csharp // Subscribe to error events to capture and handle exceptions pipeline.OnError += (sender, args) => { // Log the error message Debug.WriteLine($"Pipeline error occurred: {args.Message}"); // Implement appropriate error recovery based on the message if (args.Message.Contains("Access denied")) { // Handle permission issues } else if (args.Message.Contains("File not found")) { // Handle missing file errors } }; ``` ## Managing Media Timing and Navigation ### Retrieving Duration and Position Information Accurate timing control is essential for media applications: ```csharp // Get the total duration of the media (returns TimeSpan.Zero for live streams) var duration = await pipeline.DurationAsync(); Console.WriteLine($"Media duration: {duration.TotalSeconds} seconds"); // Get the current playback position var position = await pipeline.Position_GetAsync(); Console.WriteLine($"Current position: {position.TotalSeconds} seconds"); ``` ### Implementing Seeking Functionality Enable your users to navigate through media content with seeking operations: ```csharp // Basic seeking to a specific time position await pipeline.Position_SetAsync(TimeSpan.FromSeconds(10)); // Seeking with keyframe alignment for more efficient navigation await pipeline.Position_SetAsync(TimeSpan.FromMinutes(2), seekToKeyframe: true); // Advanced seeking with start and stop positions for partial playback await pipeline.Position_SetRangeAsync( TimeSpan.FromSeconds(30), // Start position TimeSpan.FromSeconds(60) // Stop position ); ``` ## Controlling Pipeline Execution Flow ### Starting Media Playback Control the playback of media with these essential methods: ```csharp // Start playback immediately await pipeline.StartAsync(); // Preload media without starting playback (useful for reducing startup delay) await pipeline.StartAsync(onlyPreload: true); await pipeline.ResumeAsync(); // Start the preloaded pipeline when ready 
``` ### Managing Playback States Monitor and control the pipeline's current execution state: ```csharp // Check the current state of the pipeline var state = pipeline.State; if (state == PlaybackState.Play) { Console.WriteLine("Pipeline is currently playing"); } // Subscribe to important state change events pipeline.OnStart += (sender, args) => { Console.WriteLine("Pipeline playback has started"); UpdateUIForPlaybackState(); }; pipeline.OnStop += (sender, args) => { Console.WriteLine("Pipeline playback has stopped"); Console.WriteLine($"Stopped at position: {args.Position.TotalSeconds} seconds"); ResetPlaybackControls(); }; pipeline.OnPause += (sender, args) => { Console.WriteLine("Pipeline playback is paused"); UpdatePauseButtonState(); }; pipeline.OnResume += (sender, args) => { Console.WriteLine("Pipeline playback has resumed"); UpdatePlayButtonState(); }; ``` ### Pausing and Resuming Operations Implement pause and resume functionality for better user experience: ```csharp // Pause the current playback await pipeline.PauseAsync(); // Resume playback from paused state await pipeline.ResumeAsync(); ``` ### Stopping Pipeline Execution Properly terminate pipeline operations: ```csharp // Standard stop operation await pipeline.StopAsync(); // Force stop in time-sensitive scenarios (may affect output file integrity) await pipeline.StopAsync(force: true); ``` ## Building Media Processing Chains ### Connecting Media Processing Blocks The true power of the Media Blocks SDK comes from connecting specialized blocks to create processing chains: ```csharp // Basic connection between two blocks pipeline.Connect(block1.Output, block2.Input); // Connect blocks with specific media types pipeline.Connect(videoSource.GetOutputPadByType(MediaBlockPadMediaType.Video), videoEncoder.GetInputPadByType(MediaBlockPadMediaType.Video)); ``` Different blocks may have multiple specialized inputs and outputs: - Standard I/O: `Input` and `Output` properties - Media-specific I/O: `VideoOutput`, 
`AudioOutput`, `VideoInput`, `AudioInput` - Arrays of I/O: `Inputs[]` and `Outputs[]` for complex blocks ### Working with Dynamic Input Blocks Some advanced sink blocks dynamically create inputs on demand: ```csharp // Create a specialized MP4 muxer for recording var mp4Muxer = new MP4SinkBlock(); mp4Muxer.FilePath = "output_recording.mp4"; // Request a new video input from the muxer var videoInput = mp4Muxer.CreateNewInput(MediaBlockPadMediaType.Video); // Connect a video source to the newly created input pipeline.Connect(videoSource.Output, videoInput); // Similarly for audio var audioInput = mp4Muxer.CreateNewInput(MediaBlockPadMediaType.Audio); pipeline.Connect(audioSource.Output, audioInput); ``` This flexibility enables complex media processing scenarios with multiple input streams. ## Proper Resource Management ### Disposing Pipeline Resources Media applications can consume significant system resources. Always properly dispose of pipeline objects: ```csharp // Synchronous disposal pattern try { // Use pipeline } finally { pipeline.Dispose(); } ``` For modern applications, use the asynchronous pattern to prevent UI freezing: ```csharp // Asynchronous disposal (preferred for UI applications) try { // Use pipeline } finally { await pipeline.DisposeAsync(); } ``` ### Using 'using' Statements for Automatic Cleanup Leverage C# language features for automatic resource management: ```csharp // Automatic disposal with 'using' statement using (var pipeline = new MediaBlocksPipeline()) { // Configure and use pipeline await pipeline.StartAsync(); // Pipeline will be automatically disposed when exiting this block } // C# 8.0+ using declaration using var pipeline = new MediaBlocksPipeline(); // Pipeline will be disposed when the containing method exits ``` ## Advanced Pipeline Features ### Playback Rate Control Adjust playback speed for slow-motion or fast-forward effects: ```csharp // Get current playback rate double currentRate = await pipeline.Rate_GetAsync(); // Set 
playback rate (1.0 is normal speed) await pipeline.Rate_SetAsync(0.5); // Slow motion (half speed) await pipeline.Rate_SetAsync(2.0); // Double speed ``` ### Loop Playback Configuration Implement continuous playback functionality: ```csharp // Enable looping for continuous playback pipeline.Loop = true; // Listen for loop events pipeline.OnLoop += (sender, args) => { Console.WriteLine("Media has looped back to start"); UpdateLoopCounter(); }; ``` ### Debug Mode for Development Enable debugging features during development: ```csharp // Enable debug mode for more detailed logging pipeline.Debug_Mode = true; pipeline.Debug_Dir = Path.Combine(Environment.GetFolderPath( Environment.SpecialFolder.MyDocuments), "PipelineDebugLogs"); ``` ## Block Types Reference The SDK provides a wide range of block types for sources, processing, and sinks. See the `MediaBlockType` enum in the source code for a full list of available block types. ## Notes - The pipeline supports both synchronous and asynchronous methods for starting, stopping, and disposing. Prefer asynchronous methods in UI or long-running applications. - Events are available for error handling, state changes, and stream information. - Use the correct interface for each block type to access specialized features (e.g., dynamic inputs, rendering, settings). ---END OF PAGE--- # Local File: .\dotnet\mediablocks\GettingStarted\player.md --- title: Media Blocks SDK .Net Player Implementation Guide description: Learn how to build a robust video player application with Media Blocks SDK .Net. This step-by-step tutorial covers essential components including source blocks, video rendering, audio output configuration, pipeline creation, and advanced playback controls for .NET developers. 
sidebar_label: Player Sample --- # Building a Feature-Rich Video Player with Media Blocks SDK [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) This detailed tutorial walks you through the process of creating a professional-grade video player application using Media Blocks SDK .Net. By following these instructions, you'll understand how to implement key functionalities including media loading, playback control, and audio-video rendering. ## Essential Components for Your Player Application To construct a fully functional video player, your application pipeline requires these critical building blocks: - [Universal source](../Sources/index.md) - This versatile component handles media input from various sources, allowing your player to read and process video files from local storage or network streams. - [Video renderer](../VideoRendering/index.md) - The visual component responsible for displaying video frames on screen with proper timing and formatting. - [Audio renderer](../AudioRendering/index.md) - Manages sound output, ensuring synchronized audio playback alongside your video content. ## Setting Up the Media Pipeline ### Creating the Foundation The first step in developing your player involves establishing the media pipeline—the core framework that manages data flow between components. ```csharp using VisioForge.Core.MediaBlocks; var pipeline = new MediaBlocksPipeline(); ``` ### Implementing Error Handling Robust error management is essential for a reliable player application. Subscribe to the pipeline's error events to capture and respond to exceptions. 
```csharp pipeline.OnError += (sender, args) => { Console.WriteLine(args.Message); // Additional error handling logic can be implemented here }; ``` ### Setting Up Event Listeners For complete control over your player's lifecycle, implement event handlers for critical state changes: ```csharp pipeline.OnStart += (sender, args) => { // Execute code when pipeline starts Console.WriteLine("Playback started"); }; pipeline.OnStop += (sender, args) => { // Execute code when pipeline stops Console.WriteLine("Playback stopped"); }; ``` ## Configuring Media Blocks ### Initializing the Source Block The Universal Source Block serves as the entry point for media content. Configure it with the path to your media file: ```csharp var sourceSettings = await UniversalSourceSettings.CreateAsync(new Uri(filePath)); var fileSource = new UniversalSourceBlock(sourceSettings); ``` During initialization, the SDK automatically analyzes the file to extract crucial metadata about video and audio streams, enabling proper configuration of downstream components. ### Setting Up Video Display To render video content on screen, create and configure a Video Renderer Block: ```csharp var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); ``` The renderer requires two parameters: a reference to your pipeline and the UI control where video frames will be displayed. ### Configuring Audio Output For audio playback, you'll need to select and initialize an appropriate audio output device: ```csharp var audioRenderers = await DeviceEnumerator.Shared.AudioOutputsAsync(); var audioRenderer = new AudioRendererBlock(audioRenderers[0]); ``` This code retrieves available audio output devices and configures the first available option for playback. 
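Indexing `audioRenderers[0]` assumes at least one output device is present. A slightly more defensive variant (the "Speakers" substring match is purely illustrative) guards against an empty list and prefers a named device when one is available:

```csharp
var audioRenderers = await DeviceEnumerator.Shared.AudioOutputsAsync();
if (audioRenderers.Length == 0)
{
    throw new InvalidOperationException("No audio output devices found");
}

// Prefer a device whose name contains "Speakers"; otherwise use the first one.
var outputDevice = audioRenderers[0];
foreach (var device in audioRenderers)
{
    if (device.Name.IndexOf("Speakers", StringComparison.OrdinalIgnoreCase) >= 0)
    {
        outputDevice = device;
        break;
    }
}

var audioRenderer = new AudioRendererBlock(outputDevice);
```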
## Establishing Component Connections Once all blocks are configured, you must establish connections between them to create a cohesive media flow: ```csharp pipeline.Connect(fileSource.VideoOutput, videoRenderer.Input); pipeline.Connect(fileSource.AudioOutput, audioRenderer.Input); ``` These connections define the path data takes through your application: - Video data flows from the source to the video renderer - Audio data flows from the source to the audio renderer For files containing only video or audio, you can selectively connect only the relevant outputs. ### Validating Media Content Before playback, you can inspect available streams using the Universal Source Settings: ```csharp var mediaInfo = await sourceSettings.ReadInfoAsync(); bool hasVideo = mediaInfo.VideoStreams.Count > 0; bool hasAudio = mediaInfo.AudioStreams.Count > 0; ``` ## Controlling Media Playback ### Starting Playback To begin media playback, call the pipeline's asynchronous start method: ```csharp await pipeline.StartAsync(); ``` Once executed, your application will begin rendering video frames and playing audio through the configured outputs. ### Managing Playback State To halt playback, invoke the pipeline's stop method: ```csharp await pipeline.StopAsync(); ``` This gracefully terminates all media processing and releases associated resources. ## Advanced Implementation For a complete implementation example with additional features like seeking, volume control, and full-screen support, refer to our comprehensive source code on [GitHub](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Simple%20Player%20Demo%20WPF). The repository contains working demonstrations for various platforms including WPF, Windows Forms, and cross-platform .NET applications. 
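The seeking mentioned above builds on the position APIs covered in the Pipeline Core guide (`DurationAsync`, `Position_GetAsync`, `Position_SetAsync`). The sketch below (the helper name is illustrative) implements a relative jump clamped to the media duration, skipping live streams:

```csharp
using System;
using System.Threading.Tasks;
using VisioForge.Core.MediaBlocks;

public static class PlaybackHelpers
{
    // Illustrative helper: jump forward or backward by 'offset',
    // clamped to the range [0, duration].
    public static async Task SeekRelativeAsync(MediaBlocksPipeline pipeline, TimeSpan offset)
    {
        var duration = await pipeline.DurationAsync();
        if (duration == TimeSpan.Zero)
        {
            return; // live streams report zero duration and are not seekable
        }

        var current = await pipeline.Position_GetAsync();
        var target = current + offset;
        if (target < TimeSpan.Zero)
        {
            target = TimeSpan.Zero;
        }
        else if (target > duration)
        {
            target = duration;
        }

        await pipeline.Position_SetAsync(target);
    }
}
```

For example, `await PlaybackHelpers.SeekRelativeAsync(pipeline, TimeSpan.FromSeconds(10));` skips ahead ten seconds, and a negative offset rewinds.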
---END OF PAGE--- # Local File: .\dotnet\mediablocks\Guides\rtsp-save-original-stream.md --- title: Save Original RTSP Stream (No Video Re-encoding) description: Learn how to save an RTSP stream to file (MP4) from your IP camera without re-encoding video. This guide covers how to record RTSP video streams, a common task when users want to save camera footage. Alternatives like ffmpeg save rtsp stream or VLC save rtsp stream to file exist, but this method uses .NET with VisioForge Media Blocks for programmatic control. sidebar_label: Save RTSP Video without Re-encoding order: 20 --- # How to Save RTSP Stream to File: Record IP Camera Video without Re-encoding [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Table of Contents - [How to Save RTSP Stream to File: Record IP Camera Video without Re-encoding](#how-to-save-rtsp-stream-to-file-record-ip-camera-video-without-re-encoding) - [Table of Contents](#table-of-contents) - [Overview](#overview) - [Core Features](#core-features) - [Core Concept](#core-concept) - [Prerequisites](#prerequisites) - [Code Sample: RTSPRecorder Class](#code-sample-rtsprecorder-class) - [Explanation of the Code](#explanation-of-the-code) - [How to Use the `RTSPRecorder`](#how-to-use-the-rtsprecorder) - [Key Considerations](#key-considerations) - [Full GitHub Sample](#full-github-sample) - [Best Practices](#best-practices) - [Troubleshooting](#troubleshooting) ## Overview This guide demonstrates how to save an RTSP stream to an MP4 file by capturing the original video stream from an RTSP IP camera without re-encoding the video. This approach is highly beneficial for preserving the original video quality from cameras and minimizing CPU usage when you need to record footage. The audio stream can be passed through or, optionally, re-encoded for better compatibility, allowing you to save the complete streaming data. 
Tools like FFmpeg and VLC offer command-line or UI-based methods to record an RTSP stream; however, this guide focuses on a programmatic approach using the VisioForge Media Blocks SDK for .NET developers who need to create applications that connect to and record video from RTSP cameras. ## Core Features - **Direct Stream Recording**: Save RTSP camera feeds without quality loss - **CPU-Efficient Processing**: No video re-encoding required - **Flexible Audio Handling**: Pass-through or re-encode audio as needed - **Professional Integration**: Programmatic control for enterprise applications - **High Performance**: Optimized for continuous recording We will be using the VisioForge Media Blocks SDK, a powerful .NET library for building custom media processing applications, to effectively save RTSP to file. ## Core Concept The main idea is to take the raw video stream from the RTSP source and directly send it to a file sink (e.g., MP4 muxer) without any decoding or encoding steps for the video. This is a common requirement for recording RTSP streams with maximum fidelity. - **Video Stream**: Passed through directly from the RTSP source to the MP4 sink. This ensures the original video data is saved, crucial for applications that need to record high-quality footage from cameras. - **Audio Stream**: Can either be passed through directly (if the original audio codec is compatible with the MP4 container) or re-encoded (e.g., to AAC) to ensure compatibility and potentially reduce file size when you save the RTSP stream. ## Prerequisites You'll need the VisioForge Media Blocks SDK. You can add it to your .NET project via NuGet: ```xml ``` Depending on your target platform (Windows, macOS, Linux, including ARM-based systems like Jetson Nano for embedded camera applications), you will also need the corresponding native runtime packages. 
For example, on Windows, a typical runtime package reference looks like this (package IDs vary by platform — check the sample project's `.csproj` for the authoritative list):

```xml
<PackageReference Include="VisioForge.CrossPlatform.Core.Windows.x64" Version="*" />
```

For detailed information about deployment requirements and platform-specific dependencies, please refer to our [Deployment Guide](../../deployment-x/index.md). Checking these details up front ensures your video capture application works correctly. Refer to the `RTSP Capture Original.csproj` file in the sample project for a complete list of dependencies for different platforms.

## Code Sample: RTSPRecorder Class

The following C# code defines an `RTSPRecorder` class that encapsulates the logic for capturing and saving the RTSP stream.

```csharp
using System;
using System.Threading.Tasks;
using VisioForge.Core.MediaBlocks;
using VisioForge.Core.MediaBlocks.AudioEncoders;
using VisioForge.Core.MediaBlocks.Sinks;
using VisioForge.Core.MediaBlocks.Sources;
using VisioForge.Core.MediaBlocks.Special;
using VisioForge.Core.Types.Events;
using VisioForge.Core.Types.X.AudioEncoders;
using VisioForge.Core.Types.X.Sinks;
using VisioForge.Core.Types.X.Sources;

namespace RTSPCaptureOriginalStream
{
    /// <summary>
    /// RTSPRecorder encapsulates the RTSP recording functionality to save an RTSP stream to file.
    /// It uses the Media Blocks SDK to create a pipeline that connects an
    /// RTSP source (like an IP camera) to an MP4 sink (file).
    /// </summary>
    public class RTSPRecorder : IAsyncDisposable
    {
        /// <summary>
        /// The MediaBlocks pipeline that manages the flow of media data.
        /// </summary>
        public MediaBlocksPipeline Pipeline { get; private set; }

        // Private fields for the MediaBlock components
        private MediaBlock _muxer;                 // MP4 container muxer (sink)
        private RTSPRAWSourceBlock _rtspRawSource; // RTSP stream source (provides raw streams)
        private DecodeBinBlock _decodeBin;         // Optional: audio decoder (if re-encoding audio)
        private AACEncoderBlock _audioEncoder;     // Optional: AAC audio encoder (if re-encoding audio)
        private bool disposedValue;                // Flag to prevent multiple disposals

        /// <summary>
        /// Event fired when an error occurs in the pipeline.
        /// </summary>
        public event EventHandler<ErrorsEventArgs> OnError;

        /// <summary>
        /// Event fired when a status message is available.
        /// </summary>
        public event EventHandler<string> OnStatusMessage;

        /// <summary>
        /// Output filename for the MP4 recording.
        /// </summary>
        public string Filename { get; set; } = "output.mp4";

        /// <summary>
        /// Whether to re-encode audio to AAC format (recommended for compatibility).
        /// If false, audio is passed through.
        /// </summary>
        public bool ReencodeAudio { get; set; } = true;

        /// <summary>
        /// Starts the recording session by creating and configuring the MediaBlocks pipeline.
        /// </summary>
        /// <param name="rtspSettings">RTSP source configuration settings.</param>
        /// <returns>True if the pipeline started successfully, false otherwise.</returns>
        public async Task<bool> StartAsync(RTSPRAWSourceSettings rtspSettings)
        {
            // Create a new MediaBlocks pipeline
            Pipeline = new MediaBlocksPipeline();
            Pipeline.OnError += (sender, e) => OnError?.Invoke(this, e); // Bubble up errors

            OnStatusMessage?.Invoke(this, "Creating pipeline to record RTSP stream...");

            // 1. Create the RTSP source block.
            // RTSPRAWSourceBlock provides raw, un-decoded elementary streams (video and audio)
            // from your IP camera or other RTSP cameras.
            _rtspRawSource = new RTSPRAWSourceBlock(rtspSettings);

            // 2. Create the MP4 sink (muxer) block.
            // This block will write the media streams into an MP4 file.
            _muxer = new MP4SinkBlock(new MP4SinkSettings(Filename));

            // 3. Connect the video stream (passthrough).
            // Create a dynamic input pad on the muxer for the video stream and connect
            // the raw video output from the RTSP source directly to the MP4 sink.
            // This ensures the video is not re-encoded when recording the camera feed.
            var inputVideoPad = (_muxer as IMediaBlockDynamicInputs).CreateNewInput(MediaBlockPadMediaType.Video);
            Pipeline.Connect(_rtspRawSource.VideoOutput, inputVideoPad);

            OnStatusMessage?.Invoke(this, "Video stream connected (passthrough for original quality video).");

            // 4. Connect the audio stream (optional re-encoding).
            // This section handles how the audio from the RTSP stream is processed and saved to the file.
            if (rtspSettings.AudioEnabled)
            {
                // Create a dynamic input pad on the muxer for the audio stream.
                var inputAudioPad = (_muxer as IMediaBlockDynamicInputs).CreateNewInput(MediaBlockPadMediaType.Audio);

                if (ReencodeAudio)
                {
                    // If audio re-encoding is enabled (e.g., to AAC for compatibility):
                    OnStatusMessage?.Invoke(this, "Setting up audio re-encoding to AAC for the recording...");

                    // Create a decoder block that only handles audio.
                    // We need to decode the original audio before re-encoding it so the MP4
                    // file is saved with compatible audio.
                    _decodeBin = new DecodeBinBlock(videoDisabled: true, audioDisabled: false, subtitlesDisabled: true)
                    {
                        // We can disable the internal audio converter if we're sure about the format
                        // or if the encoder handles conversion. For AAC, it's generally fine.
                        DisableAudioConverter = true
                    };

                    // Create an AAC encoder with default settings.
                    _audioEncoder = new AACEncoderBlock(new AVENCAACEncoderSettings());

                    // Connect the audio processing chain:
                    // RTSP audio output -> Decoder -> AAC Encoder -> MP4 sink audio input
                    Pipeline.Connect(_rtspRawSource.AudioOutput, _decodeBin.Input);
                    Pipeline.Connect(_decodeBin.AudioOutput, _audioEncoder.Input);
                    Pipeline.Connect(_audioEncoder.Output, inputAudioPad);

                    OnStatusMessage?.Invoke(this, "Audio stream connected (re-encoding to AAC for MP4 file).");
                }
                else
                {
                    // If audio re-encoding is disabled, connect RTSP audio directly to the muxer.
                    // Note: this may cause issues if the original audio format is not
                    // compatible with the MP4 container (e.g., G.711 PCMU/PCMA).
                    // Common compatible formats include AAC. Check your camera's audio format.
                    Pipeline.Connect(_rtspRawSource.AudioOutput, inputAudioPad);

                    OnStatusMessage?.Invoke(this, "Audio stream connected (passthrough). Warning: compatibility depends on the original camera audio format.");
                }
            }

            // 5. Start the pipeline.
            OnStatusMessage?.Invoke(this, "Starting recording pipeline to save RTSP stream to file...");

            bool success = await Pipeline.StartAsync();

            if (success)
            {
                OnStatusMessage?.Invoke(this, "Recording pipeline started successfully.");
            }
            else
            {
                OnStatusMessage?.Invoke(this, "Failed to start recording pipeline.");
            }

            return success;
        }

        /// <summary>
        /// Stops the recording by stopping the MediaBlocks pipeline.
        /// </summary>
        /// <returns>True if the pipeline stopped successfully, false otherwise.</returns>
        public async Task<bool> StopAsync()
        {
            if (Pipeline == null)
            {
                return false;
            }

            OnStatusMessage?.Invoke(this, "Stopping recording pipeline...");

            bool success = await Pipeline.StopAsync();

            if (success)
            {
                OnStatusMessage?.Invoke(this, "Recording pipeline stopped successfully.");
            }
            else
            {
                OnStatusMessage?.Invoke(this, "Failed to stop recording pipeline.");
            }

            return success;
        }

        /// <summary>
        /// Asynchronously disposes of the RTSPRecorder and all its resources.
        /// Implements the IAsyncDisposable pattern for proper resource cleanup.
        /// </summary>
        public async ValueTask DisposeAsync()
        {
            if (!disposedValue)
            {
                if (Pipeline != null)
                {
                    // Disposing the pipeline also releases its event subscriptions.
                    await Pipeline.DisposeAsync();
                    Pipeline = null;
                }

                // Dispose of all MediaBlock components.
                // Using 'as IDisposable' for safe casting and disposal.
                (_muxer as IDisposable)?.Dispose();
                _muxer = null;

                _rtspRawSource?.Dispose();
                _rtspRawSource = null;

                _decodeBin?.Dispose();
                _decodeBin = null;

                _audioEncoder?.Dispose();
                _audioEncoder = null;

                disposedValue = true;
            }
        }
    }
}
```

## Explanation of the Code

1. **`RTSPRecorder` Class**: This class is central to saving an RTSP stream to file.
   - Implements `IAsyncDisposable` for proper resource management.
   - `Pipeline`: The `MediaBlocksPipeline` object that orchestrates the media flow.
   - `_rtspRawSource`: An `RTSPRAWSourceBlock` is used. The "RAW" is key here: it provides the elementary streams (video and audio) from the camera without attempting to decode them.
   - `_muxer`: An `MP4SinkBlock` writes the incoming video and audio streams into an MP4 file.
   - `_decodeBin` and `_audioEncoder`: Optional blocks used only if `ReencodeAudio` is true. `_decodeBin` decodes the original audio from the IP camera, and `_audioEncoder` (an `AACEncoderBlock`) re-encodes it to a more compatible format such as AAC.
   - `Filename`: Specifies the output MP4 file path where the video will be saved.
   - `ReencodeAudio`: A boolean property to control audio processing. If `true`, audio is re-encoded to AAC. If `false`, audio is passed through directly; check your camera's audio format for compatibility in that case.
2. **`StartAsync(RTSPRAWSourceSettings rtspSettings)` Method**: This method initiates the process to **record the RTSP stream**.
   - Initializes the `MediaBlocksPipeline`.
   - **RTSP Source**: Creates `_rtspRawSource` with `RTSPRAWSourceSettings`. These settings include the URL (the path to your camera's stream), credentials, and audio capture settings.
   - **MP4 Sink**: Creates `_muxer` (MP4 sink) with the target filename.
   - **Video Path (Passthrough)**:
     - A new dynamic input pad for video is created on the `_muxer`.
     - `Pipeline.Connect(_rtspRawSource.VideoOutput, inputVideoPad);` directly connects the raw video output from the RTSP source to the MP4 muxer's video input. No re-encoding occurs for the video stream.
   - **Audio Path (Conditional)**: Determines how audio from the **camera** is handled when you **save to file**.
     - If `rtspSettings.AudioEnabled` is true:
       - A new dynamic input pad for audio is created on the `_muxer`.
       - If `ReencodeAudio` is `true` (recommended for wider file compatibility):
         - `_decodeBin` is created to decode the incoming audio from the camera. It's configured to only process audio (`audioDisabled: false`).
         - `_audioEncoder` (an `AACEncoderBlock`) is created.
         - The pipeline is connected: `_rtspRawSource.AudioOutput` -> `_decodeBin.Input`; `_decodeBin.AudioOutput` -> `_audioEncoder.Input`; `_audioEncoder.Output` -> `inputAudioPad` (the muxer's audio input).
       - If `ReencodeAudio` is `false`:
         - `Pipeline.Connect(_rtspRawSource.AudioOutput, inputAudioPad);` connects the raw audio output from the camera directly to the MP4 muxer. *Caution*: this relies on the original audio codec being compatible with the MP4 container (e.g., AAC). Formats like G.711 (PCMU/PCMA) are common in RTSP cameras but are not standard in MP4 and might lead to playback issues or require specialized players. Check your camera's documentation.
   - Starts the pipeline using `Pipeline.StartAsync()` to begin recording.
3. **`StopAsync()` Method**: Stops the `Pipeline`.
4. **`DisposeAsync()` Method**: Cleans up all resources, including the pipeline and individual media blocks.
## How to Use the `RTSPRecorder`

Here's a basic example of how you might use the `RTSPRecorder` class:

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
using VisioForge.Core;                 // For VisioForgeX.DestroySDK()
using VisioForge.Core.Types.X.Sources; // For RTSPRAWSourceSettings
using RTSPCaptureOriginalStream;       // Namespace of your RTSPRecorder class

class Demo
{
    static async Task Main(string[] args)
    {
        Console.WriteLine("RTSP Camera to MP4 Capture (Original Video Stream)");
        Console.WriteLine("-------------------------------------------------");

        string rtspUrl = "rtsp://your_camera_ip:554/stream_path"; // Replace with your RTSP URL
        string username = "admin";    // Replace with your username, or empty if none
        string password = "password"; // Replace with your password, or empty if none
        string outputFilePath = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.MyVideos), "rtsp_original_capture.mp4");

        Directory.CreateDirectory(Path.GetDirectoryName(outputFilePath));

        Console.WriteLine($"Capturing from: {rtspUrl}");
        Console.WriteLine($"Saving to: {outputFilePath}");
        Console.WriteLine("Press any key to stop recording...");

        RTSPRecorder recorder = null;

        try
        {
            recorder = new RTSPRecorder
            {
                Filename = outputFilePath,
                ReencodeAudio = true // Set to false to pass through audio (check compatibility)
            };

            recorder.OnError += (s, e) => Console.WriteLine($"ERROR: {e.Message}");
            recorder.OnStatusMessage += (s, msg) => Console.WriteLine($"STATUS: {msg}");

            // Configure RTSP source settings
            var rtspSettings = new RTSPRAWSourceSettings(new Uri(rtspUrl), audioEnabled: true)
            {
                Login = username,
                Password = password,
                // Adjust other settings as needed, e.g., transport protocol
                // RTSPTransport = VisioForge.Core.Types.RTSPTransport.TCP,
            };

            if (await recorder.StartAsync(rtspSettings))
            {
                Console.ReadKey(true); // Wait for a key press to stop
            }
            else
            {
                Console.WriteLine("Failed to start recording. Check status messages and RTSP URL/credentials.");
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine($"An unexpected error occurred: {ex.Message}");
        }
        finally
        {
            if (recorder != null)
            {
                Console.WriteLine("Stopping recording...");
                await recorder.StopAsync();
                await recorder.DisposeAsync();
                Console.WriteLine("Recording stopped and resources disposed.");
            }

            // Important: clean up VisioForge SDK resources on application exit
            VisioForgeX.DestroySDK();
        }

        Console.WriteLine("Press any key to exit.");
        Console.ReadKey(true);
    }
}
```

## Key Considerations

- **Audio Compatibility (Passthrough)**: If you choose `ReencodeAudio = false`, ensure the camera's audio codec (e.g., AAC, MP3) is compatible with the MP4 container. Common RTSP audio codecs like G.711 (PCMU/PCMA) are generally not directly supported in MP4 files and will likely result in silent audio or playback errors. Re-encoding to AAC is generally safer for wider compatibility.
- **Network Conditions**: RTSP streaming is sensitive to network stability, so ensure a reliable network connection to the camera.
- **Error Handling**: Robust applications should implement thorough error handling by subscribing to the `OnError` event of the `RTSPRecorder` (or directly to the `MediaBlocksPipeline`).
- **Resource Management**: Always call `DisposeAsync` on the `RTSPRecorder` instance (and thus the `MediaBlocksPipeline`) when done to free up resources. `VisioForgeX.DestroySDK()` should be called once when your application exits.

## Full GitHub Sample

For a complete, runnable console application demonstrating these concepts, including user input for RTSP details and dynamic duration display, please refer to the official VisioForge samples repository:

- **[RTSP Capture Original Stream Sample on GitHub](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/Console/RTSP%20Capture%20Original)**

This sample provides a more comprehensive example and showcases additional features.
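The audio-compatibility rule from the Key Considerations above can be captured in a small helper. This is illustrative only — `AudioPassthroughPolicy` is not part of the VisioForge API — and the codec list simply mirrors the guidance above (AAC and MP3 mux into MP4 cleanly; G.711 variants do not):

```csharp
using System;

// Illustrative helper (hypothetical, not an SDK type): decide whether camera
// audio can be passed through into an MP4 container or should be re-encoded.
public static class AudioPassthroughPolicy
{
    // Returns true when the codec is safe to pass through into MP4.
    public static bool IsMp4Compatible(string codecName)
    {
        switch (codecName?.Trim().ToUpperInvariant())
        {
            case "AAC":
            case "MP3":
                return true;   // standard MP4 audio codecs
            default:
                return false;  // e.g. PCMU, PCMA, G722 -> re-encode to AAC
        }
    }

    // Suggested value for RTSPRecorder.ReencodeAudio given the camera codec.
    public static bool ShouldReencode(string codecName) => !IsMp4Compatible(codecName);
}
```

For example, `recorder.ReencodeAudio = AudioPassthroughPolicy.ShouldReencode(cameraCodec);`, where `cameraCodec` is whatever codec name your camera reports (an assumed variable).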
## Best Practices

- Always implement proper error handling
- Monitor network stability for reliable streaming
- Use appropriate audio encoding settings
- Manage system resources effectively
- Implement proper cleanup procedures

## Troubleshooting

Common areas to check when saving RTSP streams fails or misbehaves:

- Network connectivity problems
- Audio codec compatibility
- Resource management
- Stream initialization errors
- Recording storage considerations

---

This guide provides a foundational understanding of how to save an RTSP stream's original video while flexibly handling the audio stream using the VisioForge Media Blocks SDK. By leveraging the `RTSPRAWSourceBlock` and direct pipeline connections, you can achieve efficient, high-quality recordings.

---END OF PAGE---

# Local File: .\dotnet\mediablocks\LiveVideoCompositor\index.md

---
title: .Net Live Video Compositor
description: Master real-time video compositing in .Net. Add/remove multiple live video/audio sources and outputs on the fly. Build dynamic streaming & recording apps.
sidebar_label: Live Video Compositor
---

# Live Video Compositor

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

Live Video Compositor is a part of the [VisioForge Media Blocks SDK .Net](https://www.visioforge.com/media-blocks-sdk-net) that allows you to add and remove sources and outputs in real time, letting a single pipeline handle multiple video and audio sources simultaneously. For example, the LVC lets you start streaming to YouTube at just the right moment while simultaneously recording video to disk. Using the LVC, you can build an application similar to OBS Studio.

Each source and output has a unique identifier that can be used to add and remove it in real time, and each has its own independent pipeline that can be started and stopped.
## Features

- Supports multiple video and audio sources
- Supports multiple video and audio outputs
- Setting the position and size of video sources
- Setting the transparency of video sources
- Setting the volume of audio sources

## LiveVideoCompositor class

The `LiveVideoCompositor` is the main class that allows the addition and removal of live sources and outputs in the pipeline. When creating it, you must specify the resolution and frame rate to use. All sources with a different frame rate will be automatically converted to the frame rate specified when creating the LVC.

`LiveVideoCompositorSettings` allows you to set the video and audio parameters. Key properties include:

- `MixerType`: Specifies the video mixer type (e.g., `LVCMixerType.OpenGL`, `LVCMixerType.D3D11` (Windows only), or `LVCMixerType.CPU`).
- `AudioEnabled`: A boolean indicating whether the audio stream is enabled.
- `VideoWidth`, `VideoHeight`, `VideoFrameRate`: Define the output video resolution and frame rate.
- `AudioFormat`, `AudioSampleRate`, `AudioChannels`: Define the output audio parameters.
- `VideoView`: An optional `IVideoView` for rendering video output directly.
- `AudioOutput`: An optional `AudioRendererBlock` for rendering audio output directly.

You should also plan the maximum number of sources and outputs when designing your application, though this is not a direct parameter of `LiveVideoCompositorSettings`.

### Sample code

1. Create a new instance of the `LiveVideoCompositor` class.

```csharp
var settings = new LiveVideoCompositorSettings(1920, 1080, VideoFrameRate.FPS_25);

// Optionally, configure other settings like MixerType, AudioEnabled, etc.
// settings.MixerType = LVCMixerType.OpenGL;
// settings.AudioEnabled = true;

var compositor = new LiveVideoCompositor(settings);
```

2. Add video and audio sources and outputs (see below).

3. Start the pipeline.
```csharp
await compositor.StartAsync();
```

## LVC Video Input

The `LVCVideoInput` class is used to add video sources to the LVC pipeline. The class allows you to set the video parameters and the rectangle of the video source.

You can use any block that has a video output pad. For example, you can use `VirtualVideoSourceBlock` to create a virtual video source or `SystemVideoSourceBlock` to capture video from a webcam.

Key properties of `LVCVideoInput` include:

- `Rectangle`: Defines the position and size of the video source within the compositor's output.
- `ZOrder`: Determines the stacking order of overlapping video sources.
- `ResizePolicy`: Specifies how the video source should be resized if its aspect ratio differs from the target rectangle (`LVCResizePolicy.Stretch`, `LVCResizePolicy.Letterbox`, `LVCResizePolicy.LetterboxToFill`).
- `VideoView`: An optional `IVideoView` to preview this specific input source.

### Usage

When creating an `LVCVideoInput` object, you must specify the `MediaBlock` to be used as the video data source, along with a `VideoFrameInfoX` describing the video, a `Rect` for its placement, and whether it should `autostart`.

### Sample code

#### Virtual video source

The sample code below shows how to create an `LVCVideoInput` object with a `VirtualVideoSourceBlock` as the video source.

```csharp
var rect = new Rect(0, 0, 640, 480);
var name = "Video source [Virtual]";
var settings = new VirtualVideoSourceSettings();
var info = new VideoFrameInfoX(settings.Width, settings.Height, settings.FrameRate);

var src = new LVCVideoInput(name, _compositor, new VirtualVideoSourceBlock(settings), info, rect, true);

// Optionally, set ZOrder or ResizePolicy
// src.ZOrder = 1;
// src.ResizePolicy = LVCResizePolicy.Letterbox;

if (await _compositor.Input_AddAsync(src))
{
    // added successfully
}
else
{
    src.Dispose();
}
```

#### Screen source

For desktop platforms, we can capture the screen.
The sample code below shows how to create an `LVCVideoInput` object with a `ScreenSourceBlock` as the video source.

```csharp
var settings = new ScreenCaptureDX9SourceSettings();
settings.CaptureCursor = true;
settings.Monitor = 0;
settings.FrameRate = new VideoFrameRate(30);
settings.Rectangle = new Rectangle(0, 0, 1920, 1080);

var rect = new Rect(0, 0, 640, 480);
var name = "Screen source";
var info = new VideoFrameInfoX(settings.Rectangle.Width, settings.Rectangle.Height, settings.FrameRate);

var src = new LVCVideoInput(name, _compositor, new ScreenSourceBlock(settings), info, rect, true);

// Optionally, set ZOrder or ResizePolicy
// src.ZOrder = 0;
// src.ResizePolicy = LVCResizePolicy.Stretch;

if (await _compositor.Input_AddAsync(src))
{
    // added successfully
}
else
{
    src.Dispose();
}
```

#### System video source (webcam)

The sample code below shows how to create an `LVCVideoInput` object with a `SystemVideoSourceBlock` as the video source. We use the `DeviceEnumerator` class to get the video source devices. The first video device is used as the video source, and its first video format is used as the video format.
```csharp
VideoCaptureDeviceSourceSettings settings = null;

var device = (await DeviceEnumerator.Shared.VideoSourcesAsync())[0];
if (device != null)
{
    var formatItem = device.VideoFormats[0];
    if (formatItem != null)
    {
        settings = new VideoCaptureDeviceSourceSettings(device)
        {
            Format = formatItem.ToFormat()
        };

        // Set the desired frame rate (e.g., 30 fps)
        settings.Format.FrameRate = new VideoFrameRate(30);
    }
}

if (settings == null)
{
    MessageBox.Show(this, "Unable to configure video capture device.");
    return;
}

var name = $"Camera source [{device.Name}]";
var rect = new Rect(0, 0, 1280, 720);
var videoInfo = new VideoFrameInfoX(settings.Format.Width, settings.Format.Height, settings.Format.FrameRate);

var src = new LVCVideoInput(name, _compositor, new SystemVideoSourceBlock(settings), videoInfo, rect, true);

// Optionally, set ZOrder or ResizePolicy
// src.ZOrder = 2;
// src.ResizePolicy = LVCResizePolicy.LetterboxToFill;

if (await _compositor.Input_AddAsync(src))
{
    // added successfully
}
else
{
    src.Dispose();
}
```

## LVC Audio Input

The `LVCAudioInput` class is used to add audio sources to the LVC pipeline. The class allows you to set the audio parameters and the volume of the audio source.

You can use any block that has an audio output pad. For example, you can use `VirtualAudioSourceBlock` to create a virtual audio source or `SystemAudioSourceBlock` to capture audio from a microphone.

### Usage

When creating an `LVCAudioInput` object, you must specify the `MediaBlock` to be used as the audio data source, along with an `AudioInfoX` (which requires format, channels, and sample rate) and whether it should `autostart`.

### Sample code

#### Virtual audio source

The sample code below shows how to create an `LVCAudioInput` object with a `VirtualAudioSourceBlock` as the audio source.
```csharp
var name = "Audio source [Virtual]";
var settings = new VirtualAudioSourceSettings();
var info = new AudioInfoX(settings.Format, settings.SampleRate, settings.Channels);

var src = new LVCAudioInput(name, _compositor, new VirtualAudioSourceBlock(settings), info, true);

if (await _compositor.Input_AddAsync(src))
{
    // added successfully
}
else
{
    src.Dispose();
}
```

#### System audio source (DirectSound on Windows)

The sample code below shows how to create an `LVCAudioInput` object with a `SystemAudioSourceBlock` as the audio source. We use the `DeviceEnumerator` class to get the audio devices. The first audio device is used as the audio source, and its first audio format is used as the audio format.

```csharp
DSAudioCaptureDeviceSourceSettings settings = null;
AudioCaptureDeviceFormat deviceFormat = null;

var device = (await DeviceEnumerator.Shared.AudioSourcesAsync(AudioCaptureDeviceAPI.DirectSound))[0];
if (device != null)
{
    var formatItem = device.Formats[0];
    if (formatItem != null)
    {
        deviceFormat = formatItem.ToFormat();
        settings = new DSAudioCaptureDeviceSourceSettings(device, deviceFormat);
    }
}

if (settings == null)
{
    MessageBox.Show(this, "Unable to configure audio capture device.");
    return;
}

var name = $"Audio source [{device.Name}]";
var info = new AudioInfoX(deviceFormat.Format, deviceFormat.SampleRate, deviceFormat.Channels);

var src = new LVCAudioInput(name, _compositor, new SystemAudioSourceBlock(settings), info, true);

if (await _compositor.Input_AddAsync(src))
{
    // added successfully
}
else
{
    src.Dispose();
}
```

## LVC Video Output

The `LVCVideoOutput` class is used to add video outputs to the LVC pipeline. You can start and stop the output pipeline independently from the main pipeline.

### Usage

When creating an `LVCVideoOutput` object, you must specify the `MediaBlock` to be used as the video data output, its `name`, a reference to the `LiveVideoCompositor`, and whether it should `autostart` with the main pipeline.
An optional processing `MediaBlock` can also be provided.

This output is usually used to save the video to a file or stream it (without audio). For video+audio outputs, use the `LVCVideoAudioOutput` class.

You can use the `SuperMediaBlock` to build a custom block pipeline for video output. For example, you can add a video encoder, a muxer, and a file writer to save the video to a file.

## LVC Audio Output

The `LVCAudioOutput` class is used to add audio outputs to the LVC pipeline. You can start and stop the output pipeline independently from the main pipeline.

### Usage

When creating an `LVCAudioOutput` object, you must specify the `MediaBlock` to be used as the audio data output, its `name`, a reference to the `LiveVideoCompositor`, and whether it should `autostart`.

### Sample code

#### Add an audio renderer

Add an audio renderer to the LVC pipeline. Create an `AudioRendererBlock` object, then create an `LVCAudioOutput` object, and finally add the output to the compositor. The first device is used as the audio output.

```csharp
var audioRenderer = new AudioRendererBlock((await DeviceEnumerator.Shared.AudioOutputsAsync())[0]);
var audioRendererOutput = new LVCAudioOutput("Audio renderer", _compositor, audioRenderer, true);
await _compositor.Output_AddAsync(audioRendererOutput, true);
```

#### Add an MP3 output

Add an MP3 output to the LVC pipeline. Create an `MP3OutputBlock` object, then create an `LVCAudioOutput` object, and finally add the output to the compositor.

```csharp
var outputFile = "output.mp3";
var mp3Output = new MP3OutputBlock(outputFile, new MP3EncoderSettings());
var output = new LVCAudioOutput(outputFile, _compositor, mp3Output, false);

if (await _compositor.Output_AddAsync(output))
{
    // added successfully
}
else
{
    output.Dispose();
}
```

## LVC Video/Audio Output

The `LVCVideoAudioOutput` class is used to add video+audio outputs to the LVC pipeline. You can start and stop the output pipeline independently from the main pipeline.
### Usage

When creating an `LVCVideoAudioOutput` object, you must specify the `MediaBlock` to be used as the video+audio data output, its `name`, a reference to the `LiveVideoCompositor`, and whether it should `autostart`. Optional processing `MediaBlock`s for video and audio can also be provided.

### Sample code

#### Add an MP4 output

```csharp
var outputFile = "output.mp4";
var mp4Output = new MP4OutputBlock(new MP4SinkSettings(outputFile), new OpenH264EncoderSettings(), new MFAACEncoderSettings());
var output = new LVCVideoAudioOutput(outputFile, _compositor, mp4Output, false);

if (await _compositor.Output_AddAsync(output))
{
    // added successfully
}
else
{
    output.Dispose();
}
```

#### Add a WebM output

```csharp
var outputFile = "output.webm";
var webmOutput = new WebMOutputBlock(new WebMSinkSettings(outputFile), new VP8EncoderSettings(), new VorbisEncoderSettings());
var output = new LVCVideoAudioOutput(outputFile, _compositor, webmOutput, false);

if (await _compositor.Output_AddAsync(output))
{
    // added successfully
}
else
{
    output.Dispose();
}
```

## LVC Video View Output

The `LVCVideoViewOutput` class is used to add a video view to the LVC pipeline. You can use it to display the video on the screen.

### Usage

When creating an `LVCVideoViewOutput` object, you must specify the `IVideoView` control to be used, its `name`, a reference to the `LiveVideoCompositor`, and whether it should `autostart`. An optional processing `MediaBlock` can also be provided.

### Sample code

```csharp
var name = "[VideoView] Preview";
var videoRendererOutput = new LVCVideoViewOutput(name, _compositor, VideoView1, true);
await _compositor.Output_AddAsync(videoRendererOutput);
```

`VideoView1` is a `VideoView` object that displays the video. Each platform / UI framework has its own `VideoView` implementation.

You can add several `LVCVideoViewOutput` objects to the LVC pipeline to display the video on different displays.
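The `Letterbox` and `LetterboxToFill` policies mentioned for `LVCVideoInput.ResizePolicy` reduce to simple aspect-ratio geometry. The sketch below is plain math, not SDK code; the type names (`RectD`, `ResizeMath`) are invented for illustration:

```csharp
using System;

// Illustrative geometry for the resize policies described above:
// Letterbox fits the source inside the target (bars on two sides);
// LetterboxToFill covers the target (overflow is cropped).
public readonly struct RectD
{
    public readonly double X, Y, Width, Height;
    public RectD(double x, double y, double w, double h) { X = x; Y = y; Width = w; Height = h; }
}

public static class ResizeMath
{
    public static RectD Letterbox(double srcW, double srcH, double dstW, double dstH)
    {
        double scale = Math.Min(dstW / srcW, dstH / srcH);      // fit inside the target
        double w = srcW * scale, h = srcH * scale;
        return new RectD((dstW - w) / 2, (dstH - h) / 2, w, h); // centered, bars fill the rest
    }

    public static RectD LetterboxToFill(double srcW, double srcH, double dstW, double dstH)
    {
        double scale = Math.Max(dstW / srcW, dstH / srcH);      // cover the target
        double w = srcW * scale, h = srcH * scale;
        return new RectD((dstW - w) / 2, (dstH - h) / 2, w, h); // overflow is cropped
    }
}
```

For example, a 1920×1080 camera placed in a 640×480 rectangle: `Letterbox` yields a centered 640×360 area with 60-pixel bars top and bottom, while `LetterboxToFill` scales to about 853×480 and crops the sides.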
---

[Sample application on GitHub](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Live%20Video%20Compositor%20Demo)

---END OF PAGE---

# Local File: .\dotnet\mediablocks\Nvidia\index.md

---
title: .Net Media Nvidia Blocks Guide
description: Explore a complete guide to .Net Media SDK Nvidia blocks. Learn about Nvidia-specific blocks for your media processing pipelines.
sidebar_label: Nvidia
---

# Nvidia Blocks - VisioForge Media Blocks SDK .Net

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

Nvidia blocks leverage Nvidia GPU capabilities for accelerated media processing tasks such as data transfer, video conversion, and resizing.

## NVDataDownloadBlock

Nvidia data download block. Downloads data from the Nvidia GPU to system memory.

#### Block info

Name: NVDataDownloadBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input video | Video (GPU memory) | 1 |
| Output video | Video (system memory) | 1 |

#### The sample pipeline

```mermaid
graph LR;
NVCUDAConverterBlock-->NVDataDownloadBlock-->VideoRendererBlock;
```

#### Sample code

```csharp
// create pipeline
var pipeline = new MediaBlocksPipeline();

// create a source that outputs to GPU memory (e.g., a decoder or another Nvidia block)
// For example, NVDataUploadBlock or an NV-accelerated decoder
var upstreamNvidiaBlock = new NVDataUploadBlock(); // Conceptual: assume this block is properly configured

// create Nvidia data download block
var nvDataDownload = new NVDataDownloadBlock();

// create video renderer block
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 is your display control

// connect blocks
// pipeline.Connect(upstreamNvidiaBlock.Output, nvDataDownload.Input); // Connect GPU source to download block
// pipeline.Connect(nvDataDownload.Output, videoRenderer.Input);      // Connect download block (system memory) to renderer

// start pipeline
// await pipeline.StartAsync();
```

#### Remarks

This block transfers video data from the Nvidia GPU's memory to main system memory. This is typically needed when a GPU-processed video stream must be accessed by a component that operates on system memory, such as a CPU-based encoder or a standard video renderer. Ensure that the correct Nvidia drivers and CUDA toolkit are installed for this block to function. Use `NVDataDownloadBlock.IsAvailable()` to check if the block can be used.

#### Platforms

Windows, Linux (requires an Nvidia GPU and appropriate drivers/SDK).

## NVDataUploadBlock

Nvidia data upload block. Uploads data from system memory to the Nvidia GPU.

#### Block info

Name: NVDataUploadBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input video | Video (system memory) | 1 |
| Output video | Video (GPU memory) | 1 |

#### The sample pipeline

```mermaid
graph LR;
SystemVideoSourceBlock-->NVDataUploadBlock-->NVH264EncoderBlock;
```

#### Sample code

```csharp
// create pipeline
var pipeline = new MediaBlocksPipeline();

// create a video source (e.g., SystemVideoSourceBlock or UniversalSourceBlock)
var videoSource = new UniversalSourceBlock(); // Conceptual: assume this block is properly configured
// videoSource.Filename = "input.mp4";

// create Nvidia data upload block
var nvDataUpload = new NVDataUploadBlock();

// create an Nvidia accelerated encoder (e.g., NVH264EncoderBlock)
// var nvEncoder = new NVH264EncoderBlock(new NVH264EncoderSettings()); // Conceptual

// connect blocks
// pipeline.Connect(videoSource.VideoOutput, nvDataUpload.Input); // Connect system memory source to upload block
// pipeline.Connect(nvDataUpload.Output, nvEncoder.Input);        // Connect upload block (GPU memory) to NV encoder

// start pipeline
// await pipeline.StartAsync();
```

#### Remarks

This block transfers video data from main system memory to the Nvidia GPU's memory.
This is typically a prerequisite for using Nvidia-accelerated processing blocks like encoders, decoders, or filters that operate on GPU memory. Ensure that the correct Nvidia drivers and CUDA toolkit are installed for this block to function. Use `NVDataUploadBlock.IsAvailable()` to check if the block can be used. #### Platforms Windows, Linux (Requires Nvidia GPU and appropriate drivers/SDK). ## NVVideoConverterBlock Nvidia video converter block. Performs color space conversions and other video format conversions using the Nvidia GPU. #### Block info Name: NVVideoConverterBlock. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | Video (GPU memory) | 1 | | Output video | Video (GPU memory, possibly different format) | 1 | #### The sample pipeline ```mermaid graph LR; NVDataUploadBlock-->NVVideoConverterBlock-->NVDataDownloadBlock; ``` #### Sample code ```csharp // create pipeline var pipeline = new MediaBlocksPipeline(); // Assume video data is already in GPU memory via NVDataUploadBlock or an NV-decoder // var nvUploadedSource = new NVDataUploadBlock(); // Conceptual // pipeline.Connect(systemMemorySource.Output, nvUploadedSource.Input); // create Nvidia video converter block var nvVideoConverter = new NVVideoConverterBlock(); // Specific conversion settings might be applied here if the block has properties for them. // Assume we want to download the converted video back to system memory // var nvDataDownload = new NVDataDownloadBlock(); // Conceptual // connect blocks // pipeline.Connect(nvUploadedSource.Output, nvVideoConverter.Input); // pipeline.Connect(nvVideoConverter.Output, nvDataDownload.Input); // pipeline.Connect(nvDataDownload.Output, videoRenderer.Input); // Or to another system memory component // start pipeline // await pipeline.StartAsync(); ``` #### Remarks The `NVVideoConverterBlock` is used for efficient video format conversions (e.g., color space, pixel format) leveraging the Nvidia GPU. 
This is often faster than CPU-based conversions, especially for high-resolution video. It typically operates on video data already present in GPU memory. Ensure that the correct Nvidia drivers and CUDA toolkit are installed. Use `NVVideoConverterBlock.IsAvailable()` to check if the block can be used. #### Platforms Windows, Linux (Requires Nvidia GPU and appropriate drivers/SDK). ## NVVideoResizeBlock Nvidia video resize block. Resizes video frames using the Nvidia GPU. #### Block info Name: NVVideoResizeBlock. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | Video (GPU memory) | 1 | | Output video | Video (GPU memory, resized) | 1 | #### Settings The `NVVideoResizeBlock` is configured using a `VisioForge.Core.Types.Size` object passed to its constructor. - `Resolution` (`VisioForge.Core.Types.Size`): Specifies the target output resolution (Width, Height) for the video. #### The sample pipeline ```mermaid graph LR; NVDataUploadBlock-->NVVideoResizeBlock-->NVH264EncoderBlock; ``` #### Sample code ```csharp // create pipeline var pipeline = new MediaBlocksPipeline(); // Target resolution for resizing var targetResolution = new VisioForge.Core.Types.Size(1280, 720); // Assume video data is already in GPU memory via NVDataUploadBlock or an NV-decoder // var nvUploadedSource = new NVDataUploadBlock(); // Conceptual // pipeline.Connect(systemMemorySource.Output, nvUploadedSource.Input); // create Nvidia video resize block var nvVideoResize = new NVVideoResizeBlock(targetResolution); // Assume the resized video will be encoded by an NV-encoder // var nvEncoder = new NVH264EncoderBlock(new NVH264EncoderSettings()); // Conceptual // connect blocks // pipeline.Connect(nvUploadedSource.Output, nvVideoResize.Input); // pipeline.Connect(nvVideoResize.Output, nvEncoder.Input); // start pipeline // await pipeline.StartAsync(); ``` #### Remarks The `NVVideoResizeBlock` performs video scaling operations efficiently using the Nvidia GPU. 
This is useful for adapting video streams to different display resolutions or encoding requirements. It typically operates on video data already present in GPU memory. Ensure that the correct Nvidia drivers and CUDA toolkit are installed. Use `NVVideoResizeBlock.IsAvailable()` to check if the block can be used. #### Platforms Windows, Linux (Requires Nvidia GPU and appropriate drivers/SDK). ---END OF PAGE--- # Local File: .\dotnet\mediablocks\OpenCV\index.md --- title: .Net Media OpenCV Blocks Guide description: Explore a complete guide to .Net Media SDK OpenCV blocks. Learn about various OpenCV video processing capabilities. sidebar_label: OpenCV --- # OpenCV Blocks - VisioForge Media Blocks SDK .Net [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) OpenCV (Open Source Computer Vision Library) blocks provide powerful video processing capabilities within the VisioForge Media Blocks SDK .Net. These blocks enable a wide range of computer vision tasks, from basic image manipulation to complex object detection and tracking. To use OpenCV blocks, ensure that the VisioForge.CrossPlatform.OpenCV.Windows.x64 (or corresponding package for your platform) NuGet package is included in your project. Most OpenCV blocks typically require a `videoconvert` element before them to ensure the input video stream is in a compatible format. The SDK handles this internally when you initialize the block. ## CV Dewarp Block The CV Dewarp block applies dewarping effects to a video stream, which can correct distortions from wide-angle lenses, for example. ### Block info Name: `CVDewarpBlock` (GStreamer element: `dewarp`). | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Input video | Uncompressed video | 1 | | Output video | Uncompressed video | 1 | ### Settings The `CVDewarpBlock` is configured using `CVDewarpSettings`. 
Key properties: - `DisplayMode` (`CVDewarpDisplayMode` enum): Specifies the display mode for dewarping (e.g., `SinglePanorama`, `DoublePanorama`). Default is `CVDewarpDisplayMode.SinglePanorama`. - `InnerRadius` (double): Inner radius for dewarping. - `InterpolationMethod` (`CVDewarpInterpolationMode` enum): Interpolation method used (e.g., `Bilinear`, `Bicubic`). Default is `CVDewarpInterpolationMode.Bilinear`. - `OuterRadius` (double): Outer radius for dewarping. - `XCenter` (double): X-coordinate of the center for dewarping. - `XRemapCorrection` (double): X-coordinate remap correction factor. - `YCenter` (double): Y-coordinate of the center for dewarping. - `YRemapCorrection` (double): Y-coordinate remap correction factor. ### Sample pipeline ```mermaid graph LR; SystemVideoSourceBlock-->CVDewarpBlock; CVDewarpBlock-->VideoRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Assuming SystemVideoSourceBlock is already created and configured as 'videoSource' // Create Dewarp settings var dewarpSettings = new CVDewarpSettings { DisplayMode = CVDewarpDisplayMode.SinglePanorama, // Example mode, default is SinglePanorama InnerRadius = 0.2, // Example value OuterRadius = 0.8, // Example value XCenter = 0.5, // Example value, default is 0.5 YCenter = 0.5, // Example value, default is 0.5 // InterpolationMethod = CVDewarpInterpolationMode.Bilinear, // This is the default }; var dewarpBlock = new CVDewarpBlock(dewarpSettings); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 // Connect blocks pipeline.Connect(videoSource.Output, dewarpBlock.Input0); pipeline.Connect(dewarpBlock.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux. ### Remarks Ensure the VisioForge OpenCV NuGet package is referenced in your project. ## CV Dilate Block The CV Dilate block performs a dilation operation on the video stream. 
Dilation is a morphological operation that typically expands bright regions and shrinks dark regions. ### Block info Name: `CVDilateBlock` (GStreamer element: `cvdilate`). | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Input video | Uncompressed video | 1 | | Output video | Uncompressed video | 1 | ### Settings This block does not have specific settings beyond the default behavior. The dilation is performed with a default structuring element. ### Sample pipeline ```mermaid graph LR; SystemVideoSourceBlock-->CVDilateBlock; CVDilateBlock-->VideoRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Assuming SystemVideoSourceBlock is already created and configured as 'videoSource' var dilateBlock = new CVDilateBlock(); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 // Connect blocks pipeline.Connect(videoSource.Output, dilateBlock.Input0); pipeline.Connect(dilateBlock.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux. ### Remarks Ensure the VisioForge OpenCV NuGet package is referenced in your project. ## CV Edge Detect Block The CV Edge Detect block uses the Canny edge detector algorithm to find edges in the video stream. ### Block info Name: `CVEdgeDetectBlock` (GStreamer element: `edgedetect`). | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Input video | Uncompressed video | 1 | | Output video | Uncompressed video | 1 | ### Settings The `CVEdgeDetectBlock` is configured using `CVEdgeDetectSettings`. Key properties: - `ApertureSize` (int): Aperture size for the Sobel operator (e.g., 3, 5, or 7). Default is 3. - `Threshold1` (int): First threshold for the hysteresis procedure. Default is 50. - `Threshold2` (int): Second threshold for the hysteresis procedure. Default is 150. 
- `Mask` (bool): If true, the output is a mask; otherwise, it's the original image with edges highlighted. Default is `false`. ### Sample pipeline ```mermaid graph LR; SystemVideoSourceBlock-->CVEdgeDetectBlock; CVEdgeDetectBlock-->VideoRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Assuming SystemVideoSourceBlock is already created and configured as 'videoSource' var edgeDetectSettings = new CVEdgeDetectSettings { ApertureSize = 3, // Example value, default is 3 Threshold1 = 2000, // Example value; default is 50 Threshold2 = 4000, // Example value; default is 150 Mask = true // Example value, default is false }; var edgeDetectBlock = new CVEdgeDetectBlock(edgeDetectSettings); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 // Connect blocks pipeline.Connect(videoSource.Output, edgeDetectBlock.Input0); pipeline.Connect(edgeDetectBlock.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux. ### Remarks Ensure the VisioForge OpenCV NuGet package is referenced in your project. ## CV Equalize Histogram Block The CV Equalize Histogram block equalizes the histogram of a video frame using the `cvEqualizeHist` function. This typically improves the contrast of the image. ### Block info Name: `CVEqualizeHistogramBlock` (GStreamer element: `cvequalizehist`). | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Input video | Uncompressed video | 1 | | Output video | Uncompressed video | 1 | ### Settings This block does not have specific settings beyond the default behavior. 
### Sample pipeline ```mermaid graph LR; SystemVideoSourceBlock-->CVEqualizeHistogramBlock; CVEqualizeHistogramBlock-->VideoRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Assuming SystemVideoSourceBlock is already created and configured as 'videoSource' var equalizeHistBlock = new CVEqualizeHistogramBlock(); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 // Connect blocks pipeline.Connect(videoSource.Output, equalizeHistBlock.Input0); pipeline.Connect(equalizeHistBlock.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux. ### Remarks Ensure the VisioForge OpenCV NuGet package is referenced in your project. ## CV Erode Block The CV Erode block performs an erosion operation on the video stream. Erosion is a morphological operation that typically shrinks bright regions and expands dark regions. ### Block info Name: `CVErodeBlock` (GStreamer element: `cverode`). | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Input video | Uncompressed video | 1 | | Output video | Uncompressed video | 1 | ### Settings This block does not have specific settings beyond the default behavior. The erosion is performed with a default structuring element. ### Sample pipeline ```mermaid graph LR; SystemVideoSourceBlock-->CVErodeBlock; CVErodeBlock-->VideoRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Assuming SystemVideoSourceBlock is already created and configured as 'videoSource' var erodeBlock = new CVErodeBlock(); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 // Connect blocks pipeline.Connect(videoSource.Output, erodeBlock.Input0); pipeline.Connect(erodeBlock.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux. 
### Remarks Ensure the VisioForge OpenCV NuGet package is referenced in your project. ## CV Face Blur Block The CV Face Blur block detects faces in the video stream and applies a blur effect to them. ### Block info Name: `CVFaceBlurBlock` (GStreamer element: `faceblur`). | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Input video | Uncompressed video | 1 | | Output video | Uncompressed video | 1 | ### Settings The `CVFaceBlurBlock` is configured using `CVFaceBlurSettings`. Key properties: - `MainCascadeFile` (string): Path to the XML file for the primary Haar cascade classifier used for face detection (e.g., `haarcascade_frontalface_default.xml`). Default is `"haarcascade_frontalface_default.xml"`. - `MinNeighbors` (int): Minimum number of neighbors each candidate rectangle should have to retain it. Default is 3. - `MinSize` (`Size`): Minimum possible object size. Objects smaller than this are ignored. Default `new Size(30, 30)`. - `ScaleFactor` (double): How much the image size is reduced at each image scale. Default is 1.25. Note: `ProcessPaths(Context)` should be called on the settings object to ensure correct path resolution for cascade files. ### Sample pipeline ```mermaid graph LR; SystemVideoSourceBlock-->CVFaceBlurBlock; CVFaceBlurBlock-->VideoRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Assuming SystemVideoSourceBlock is already created and configured as 'videoSource' var faceBlurSettings = new CVFaceBlurSettings { MainCascadeFile = "haarcascade_frontalface_default.xml", // Adjust path as needed, this is the default MinNeighbors = 5, // Example value, default is 3 ScaleFactor = 1.2, // Example value, default is 1.25 // MinSize = new VisioForge.Core.Types.Size(30, 30) // This is the default }; // It's important to call ProcessPaths if you are not providing an absolute path // and relying on SDK's internal mechanisms to locate the file, especially when deployed. 
// faceBlurSettings.ProcessPaths(pipeline.Context); // or pass appropriate context var faceBlurBlock = new CVFaceBlurBlock(faceBlurSettings); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 // Connect blocks pipeline.Connect(videoSource.Output, faceBlurBlock.Input0); pipeline.Connect(faceBlurBlock.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux. ### Remarks This block requires Haar cascade XML files for face detection. These files are typically bundled with OpenCV distributions. Ensure the path to `MainCascadeFile` is correctly specified. The `ProcessPaths` method on the settings object can help resolve paths if files are placed in standard locations known to the SDK. ## CV Face Detect Block The CV Face Detect block detects faces, and optionally eyes, noses, and mouths, in the video stream using Haar cascade classifiers. ### Block info Name: `CVFaceDetectBlock` (GStreamer element: `facedetect`). | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Input video | Uncompressed video | 1 | | Output video | Uncompressed video | 1 | ### Settings The `CVFaceDetectBlock` is configured using `CVFaceDetectSettings`. Key properties: - `Display` (bool): If `true`, draws rectangles around detected features on the output video. Default is `true`. - `MainCascadeFile` (string): Path to the XML for the primary Haar cascade. Default is `"haarcascade_frontalface_default.xml"`. - `EyesCascadeFile` (string): Path to the XML for eyes detection. Default is `"haarcascade_mcs_eyepair_small.xml"`. Optional. - `NoseCascadeFile` (string): Path to the XML for nose detection. Default is `"haarcascade_mcs_nose.xml"`. Optional. - `MouthCascadeFile` (string): Path to the XML for mouth detection. Default is `"haarcascade_mcs_mouth.xml"`. Optional. - `MinNeighbors` (int): Minimum neighbors for candidate retention. Default 3. 
- `MinSize` (`Size`): Minimum object size. Default `new Size(30, 30)`. - `MinDeviation` (int): Minimum standard deviation. Default 0. - `ScaleFactor` (double): Image size reduction factor at each scale. Default 1.25. - `UpdatesMode` (`CVFaceDetectUpdates` enum): Controls how updates/events are posted (`EveryFrame`, `OnChange`, `OnFace`, `None`). Default `CVFaceDetectUpdates.EveryFrame`. Note: `ProcessPaths(Context)` should be called on the settings object for cascade files. ### Events - `FaceDetected`: Occurs when faces (and other enabled features) are detected. Provides `CVFaceDetectedEventArgs` with an array of `CVFace` objects and a timestamp. - `CVFace` contains `Rect` for `Position`, `Nose`, `Mouth`, and a list of `Rect` for `Eyes`. ### Sample pipeline ```mermaid graph LR; SystemVideoSourceBlock-->CVFaceDetectBlock; CVFaceDetectBlock-->VideoRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Assuming SystemVideoSourceBlock is already created and configured as 'videoSource' var faceDetectSettings = new CVFaceDetectSettings { MainCascadeFile = "haarcascade_frontalface_default.xml", // Adjust path, default EyesCascadeFile = "haarcascade_mcs_eyepair_small.xml", // Adjust path, default, optional // NoseCascadeFile = "haarcascade_mcs_nose.xml", // Optional, default // MouthCascadeFile = "haarcascade_mcs_mouth.xml", // Optional, default Display = true, // Default UpdatesMode = CVFaceDetectUpdates.EveryFrame, // Default, possible values: EveryFrame, OnChange, OnFace, None MinNeighbors = 5, // Example value, default is 3 ScaleFactor = 1.2, // Example value, default is 1.25 // MinSize = new VisioForge.Core.Types.Size(30,30) // Default }; // faceDetectSettings.ProcessPaths(pipeline.Context); // or appropriate context var faceDetectBlock = new CVFaceDetectBlock(faceDetectSettings); faceDetectBlock.FaceDetected += (s, e) => { Console.WriteLine($"Timestamp: {e.Timestamp}, Faces found: {e.Faces.Length}"); foreach (var face in e.Faces) { 
Console.WriteLine($" Face at [{face.Position.Left},{face.Position.Top},{face.Position.Width},{face.Position.Height}]"); if (face.Eyes.Any()) { Console.WriteLine($" Eyes at [{face.Eyes[0].Left},{face.Eyes[0].Top},{face.Eyes[0].Width},{face.Eyes[0].Height}]"); } } }; var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 // Connect blocks pipeline.Connect(videoSource.Output, faceDetectBlock.Input0); pipeline.Connect(faceDetectBlock.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux. ### Remarks Requires Haar cascade XML files. The `ProcessBusMessage` method in the C# class handles parsing messages from the GStreamer element to fire the `FaceDetected` event. ## CV Hand Detect Block The CV Hand Detect block detects hand gestures (fist or palm) in the video stream using Haar cascade classifiers. It internally resizes the input video to 320x240 for processing. ### Block info Name: `CVHandDetectBlock` (GStreamer element: `handdetect`). | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Input video | Uncompressed video | 1 | | Output video | Uncompressed video | 1 | ### Settings The `CVHandDetectBlock` is configured using `CVHandDetectSettings`. Key properties: - `Display` (bool): If `true`, draws rectangles around detected hands on the output video. Default is `true`. - `FistCascadeFile` (string): Path to the XML for fist detection. Default is `"fist.xml"`. - `PalmCascadeFile` (string): Path to the XML for palm detection. Default is `"palm.xml"`. - `ROI` (`Rect`): Region Of Interest for detection. Coordinates are relative to the 320x240 processed image. Default (0,0,0,0) - full frame (corresponds to `new Rect()`). Note: `ProcessPaths(Context)` should be called on the settings object for cascade files. ### Events - `HandDetected`: Occurs when hands are detected. 
Provides `CVHandDetectedEventArgs` with an array of `CVHand` objects. - `CVHand` contains `Rect` for `Position` and `CVHandGesture` for `Gesture` (Fist or Palm). ### Sample pipeline ```mermaid graph LR; SystemVideoSourceBlock-->CVHandDetectBlock; CVHandDetectBlock-->VideoRendererBlock; ``` Note: The `CVHandDetectBlock` internally includes a `videoscale` element to resize input to 320x240 before the `handdetect` GStreamer element. ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Assuming SystemVideoSourceBlock is already created and configured as 'videoSource' var handDetectSettings = new CVHandDetectSettings { FistCascadeFile = "fist.xml", // Adjust path, default PalmCascadeFile = "palm.xml", // Adjust path, default Display = true, // Default ROI = new VisioForge.Core.Types.Rect(0, 0, 320, 240) // Example: full frame of scaled image, default is new Rect() }; // handDetectSettings.ProcessPaths(pipeline.Context); // or appropriate context var handDetectBlock = new CVHandDetectBlock(handDetectSettings); handDetectBlock.HandDetected += (s, e) => { Console.WriteLine($"Hands found: {e.Hands.Length}"); foreach (var hand in e.Hands) { Console.WriteLine($" Hand at [{hand.Position.Left},{hand.Position.Top},{hand.Position.Width},{hand.Position.Height}], Gesture: {hand.Gesture}"); } }; var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 // Connect blocks pipeline.Connect(videoSource.Output, handDetectBlock.Input0); pipeline.Connect(handDetectBlock.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux. ### Remarks Requires Haar cascade XML files for fist and palm detection. The input video is internally scaled to 320x240 for processing by the `handdetect` element. The `ProcessBusMessage` method handles GStreamer messages to fire `HandDetected`. 
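Because the hand detector processes an internally downscaled 320x240 copy of the input, an ROI expressed in source-resolution coordinates must be mapped into that 320x240 space before being assigned to `CVHandDetectSettings.ROI`. A minimal sketch of that mapping follows; the helper name is hypothetical, and the (left, top, width, height) argument order for `Rect` is assumed from the full-frame example above:

```csharp
// Map a region of interest from source-resolution coordinates into the
// 320x240 frame that the handdetect element actually processes.
// Hypothetical helper; the (left, top, width, height) Rect argument order
// is an assumption based on the full-frame example above.
VisioForge.Core.Types.Rect MapRoiToProcessedFrame(
    VisioForge.Core.Types.Rect sourceRoi, int sourceWidth, int sourceHeight)
{
    const int processedWidth = 320;
    const int processedHeight = 240;

    double scaleX = (double)processedWidth / sourceWidth;
    double scaleY = (double)processedHeight / sourceHeight;

    return new VisioForge.Core.Types.Rect(
        (int)(sourceRoi.Left * scaleX),
        (int)(sourceRoi.Top * scaleY),
        (int)(sourceRoi.Width * scaleX),
        (int)(sourceRoi.Height * scaleY));
}

// Example: the right half of a 1920x1080 frame maps to (160, 0, 160, 240).
// handDetectSettings.ROI = MapRoiToProcessedFrame(
//     new VisioForge.Core.Types.Rect(960, 0, 960, 1080), 1920, 1080);
```
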
## CV Laplace Block The CV Laplace block applies a Laplace operator to the video stream, which highlights regions of rapid intensity change, often used for edge detection. ### Block info Name: `CVLaplaceBlock` (GStreamer element: `cvlaplace`). | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Input video | Uncompressed video | 1 | | Output video | Uncompressed video | 1 | ### Settings The `CVLaplaceBlock` is configured using `CVLaplaceSettings`. Key properties: - `ApertureSize` (int): Aperture size for the Sobel operator used internally (e.g., 1, 3, 5, or 7). Default 3. - `Scale` (double): Optional scale factor for the computed Laplacian values. Default 1. - `Shift` (double): Optional delta value that is added to the results prior to storing them. Default 0. - `Mask` (bool): If true, the output is a mask; otherwise, it's the original image with the effect applied. Default is true. ### Sample pipeline ```mermaid graph LR; SystemVideoSourceBlock-->CVLaplaceBlock; CVLaplaceBlock-->VideoRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Assuming SystemVideoSourceBlock is already created and configured as 'videoSource' var laplaceSettings = new CVLaplaceSettings { ApertureSize = 3, // Example value Scale = 1.0, // Example value Shift = 0.0, // Example value Mask = true }; var laplaceBlock = new CVLaplaceBlock(laplaceSettings); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 // Connect blocks pipeline.Connect(videoSource.Output, laplaceBlock.Input0); pipeline.Connect(laplaceBlock.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux. ### Remarks Ensure the VisioForge OpenCV NuGet package is referenced in your project. 
## CV Motion Cells Block The CV Motion Cells block detects motion in a video stream by dividing the frame into a grid of cells and analyzing changes within these cells. ### Block info Name: `CVMotionCellsBlock` (GStreamer element: `motioncells`). | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Input video | Uncompressed video | 1 | | Output video | Uncompressed video | 1 | ### Settings The `CVMotionCellsBlock` is configured using `CVMotionCellsSettings`. Key properties: - `CalculateMotion` (bool): Enable or disable motion calculation. Default `true`. - `CellsColor` (`SKColor`): Color to draw motion cells if `Display` is true. Default `SKColors.Red`. - `DataFile` (string): Path to a data file for loading/saving cell configuration. Extension is handled separately by `DataFileExtension`. - `DataFileExtension` (string): Extension for the data file (e.g., "dat"). - `Display` (bool): If `true`, draws the grid and motion indication on the output video. Default `true`. - `Gap` (`TimeSpan`): Interval after which motion is considered finished and a "motion finished" bus message is posted. Default `TimeSpan.FromSeconds(5)`. (Note: This is different from a pixel gap between cells). - `GridSize` (`Size`): Number of cells in the grid (Width x Height). Default `new Size(10, 10)`. - `MinimumMotionFrames` (int): Minimum number of frames motion must be detected in a cell to trigger. Default 1. - `MotionCellsIdx` (string): Comma-separated string of cell indices (e.g., "0:0,1:1") to monitor for motion. - `MotionCellBorderThickness` (int): Thickness of the border for cells with detected motion. Default 1. - `MotionMaskCellsPos` (string): String defining cell positions for a motion mask. - `MotionMaskCoords` (string): String defining coordinates for a motion mask. - `PostAllMotion` (bool): Post all motion events. Default `false`. - `PostNoMotion` (`TimeSpan`): Time after which a "no motion" event is posted if no motion is detected. 
Default `TimeSpan.Zero` (disabled). - `Sensitivity` (double): Motion sensitivity. Expected range might be 0.0 to 1.0. Default `0.5`. - `Threshold` (double): Threshold for motion detection, representing the fraction of cells that need to have moved. Default `0.01`. - `UseAlpha` (bool): Use alpha channel for drawing. Default `true`. ### Events - `MotionDetected`: Occurs when motion is detected or changes state. Provides `CVMotionCellsEventArgs`: - `Cells`: String indicating which cells have motion (e.g., "0:0,1:2"). - `StartedTime`: Timestamp when motion began in the current event scope. - `FinishedTime`: Timestamp when motion finished (if applicable to the event). - `CurrentTime`: Timestamp of the current frame related to the event. - `IsMotion`: Boolean indicating if the event signifies motion (`true`) or no motion (`false`). ### Sample pipeline ```mermaid graph LR; SystemVideoSourceBlock-->CVMotionCellsBlock; CVMotionCellsBlock-->VideoRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Assuming SystemVideoSourceBlock is already created and configured as 'videoSource' var motionCellsSettings = new CVMotionCellsSettings { GridSize = new VisioForge.Core.Types.Size(8, 6), // Example: 8x6 grid, default is new Size(10,10) Sensitivity = 0.75, // Example value, C# default is 0.5. Represents sensitivity. Threshold = 0.05, // Example value, C# default is 0.01. Represents fraction of moved cells. Display = true, // Default is true CellsColor = SKColors.Aqua, // Example color, default is SKColors.Red PostNoMotion = TimeSpan.FromSeconds(5) // Post no_motion after 5s of inactivity, default is TimeSpan.Zero }; var motionCellsBlock = new CVMotionCellsBlock(motionCellsSettings); motionCellsBlock.MotionDetected += (s, e) => { if (e.IsMotion) { Console.WriteLine($"Motion DETECTED at {e.CurrentTime}. Cells: {e.Cells}. Started: {e.StartedTime}"); } else { Console.WriteLine($"Motion FINISHED or NO MOTION at {e.CurrentTime}. 
Finished: {e.FinishedTime}"); } }; var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 // Connect blocks pipeline.Connect(videoSource.Output, motionCellsBlock.Input0); pipeline.Connect(motionCellsBlock.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux. ### Remarks The `ProcessBusMessage` method handles GStreamer messages to fire `MotionDetected`. Event structure provides timestamps for motion start, finish, and current event time. ## CV Smooth Block The CV Smooth block applies various smoothing (blurring) filters to the video stream. ### Block info Name: `CVSmoothBlock` (GStreamer element: `cvsmooth`). | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Input video | Uncompressed video | 1 | | Output video | Uncompressed video | 1 | ### Settings The `CVSmoothBlock` is configured using `CVSmoothSettings`. Key properties: - `Type` (`CVSmoothType` enum): Type of smoothing filter to apply (`Blur`, `Gaussian`, `Median`, `Bilateral`). Default `CVSmoothType.Gaussian`. - `KernelWidth` (int): Width of the kernel for `Blur`, `Gaussian`, `Median` filters. Default 3. - `KernelHeight` (int): Height of the kernel for `Blur`, `Gaussian`, `Median` filters. Default 3. - `Width` (int): Width of the area to blur. Default `int.MaxValue` (full frame). - `Height` (int): Height of the area to blur. Default `int.MaxValue` (full frame). - `PositionX` (int): X position for the blur area. Default 0. - `PositionY` (int): Y position for the blur area. Default 0. - `Color` (double): Sigma for color space (for Bilateral filter) or standard deviation (for Gaussian if `SpatialSigma` is 0). Default 0. - `SpatialSigma` (double): Sigma for coordinate space (for Bilateral and Gaussian filters). For Gaussian, if 0, it's calculated from `KernelWidth`/`KernelHeight`. Default 0. 
### Sample pipeline ```mermaid graph LR; SystemVideoSourceBlock-->CVSmoothBlock; CVSmoothBlock-->VideoRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Assuming SystemVideoSourceBlock is already created and configured as 'videoSource' var smoothSettings = new CVSmoothSettings { Type = CVSmoothType.Gaussian, // Example: Gaussian blur, also the default KernelWidth = 5, // Kernel width, default is 3 KernelHeight = 5, // Kernel height, default is 3 SpatialSigma = 1.5 // Sigma for Gaussian. If 0 (default), it's calculated from kernel size. }; var smoothBlock = new CVSmoothBlock(smoothSettings); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 // Connect blocks pipeline.Connect(videoSource.Output, smoothBlock.Input0); pipeline.Connect(smoothBlock.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux. ### Remarks Ensure the VisioForge OpenCV NuGet package is referenced in your project. The specific parameters used by the GStreamer element (`color`, `spatial`, `kernel-width`, `kernel-height`) depend on the chosen `Type`. For kernel dimensions, use `KernelWidth` and `KernelHeight`. `Width` and `Height` define the area to apply the blur if not the full frame. ## CV Sobel Block The CV Sobel block applies a Sobel operator to the video stream, which is used to calculate the derivative of an image intensity function, typically for edge detection. ### Block info Name: `CVSobelBlock` (GStreamer element: `cvsobel`). | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Input video | Uncompressed video | 1 | | Output video | Uncompressed video | 1 | ### Settings The `CVSobelBlock` is configured using `CVSobelSettings`. Key properties: - `XOrder` (int): Order of the derivative x. Default 1. - `YOrder` (int): Order of the derivative y. Default 1. 
- `ApertureSize` (int): Size of the extended Sobel kernel (1, 3, 5, or 7). Default 3. - `Mask` (bool): If true, the output is a mask; otherwise, it's the original image with the effect applied. Default is true. ### Sample pipeline ```mermaid graph LR; SystemVideoSourceBlock-->CVSobelBlock; CVSobelBlock-->VideoRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Assuming SystemVideoSourceBlock is already created and configured as 'videoSource' var sobelSettings = new CVSobelSettings { XOrder = 1, // Default is 1. Used for order of the derivative X. YOrder = 0, // Example: Use 0 for Y-order to primarily detect vertical edges. C# class default is 1. ApertureSize = 3, // Default is 3. Size of the extended Sobel kernel. Mask = true // Default is true. Output as a mask. }; var sobelBlock = new CVSobelBlock(sobelSettings); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 // Connect blocks pipeline.Connect(videoSource.Output, sobelBlock.Input0); pipeline.Connect(sobelBlock.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux. ### Remarks Ensure the VisioForge OpenCV NuGet package is referenced in your project. ## CV Template Match Block The CV Template Match block searches for occurrences of a template image within the video stream. ### Block info Name: `CVTemplateMatchBlock` (GStreamer element: `templatematch`). | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Input video | Uncompressed video | 1 | | Output video | Uncompressed video | 1 | ### Settings The `CVTemplateMatchBlock` is configured using `CVTemplateMatchSettings`. Key properties: - `TemplateImage` (string): Path to the template image file (e.g., PNG, JPG) to search for. - `Method` (`CVTemplateMatchMethod` enum): The comparison method to use (e.g., `Sqdiff`, `CcorrNormed`, `CcoeffNormed`). 
Default `CVTemplateMatchMethod.Correlation`. - `Display` (bool): If `true`, draws a rectangle around the best match on the output video. Default `true`. ### Events - `TemplateMatch`: Occurs when a template match is found. Provides `CVTemplateMatchEventArgs`: - `Rect`: A `Types.Rect` object representing the location (x, y, width, height) of the best match. - `Result`: A double value representing the quality or result of the match, depending on the method used. ### Sample pipeline ```mermaid graph LR; SystemVideoSourceBlock-->CVTemplateMatchBlock; CVTemplateMatchBlock-->VideoRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Assuming SystemVideoSourceBlock is already created and configured as 'videoSource' // Ensure "template.png" exists and is accessible. var templateMatchSettings = new CVTemplateMatchSettings("path/to/your/template.png") // Adjust path as needed { // Method: Specifies the comparison method. // Example: CVTemplateMatchMethod.CcoeffNormed is often a good choice. // C# class default is CVTemplateMatchMethod.Correlation. Method = CVTemplateMatchMethod.CcoeffNormed, // Display: If true, draws a rectangle around the best match. // C# class default is true. Display = true }; var templateMatchBlock = new CVTemplateMatchBlock(templateMatchSettings); templateMatchBlock.TemplateMatch += (s, e) => { Console.WriteLine($"Template matched at [{e.Rect.Left},{e.Rect.Top},{e.Rect.Width},{e.Rect.Height}] with result: {e.Result}"); }; var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 // Connect blocks pipeline.Connect(videoSource.Output, templateMatchBlock.Input0); pipeline.Connect(templateMatchBlock.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux. ### Remarks Ensure the VisioForge OpenCV NuGet package and a valid template image file are available. 
The `ProcessBusMessage` method handles GStreamer messages to fire the `TemplateMatch` event. ## CV Text Overlay Block The CV Text Overlay block renders text onto the video stream using OpenCV drawing functions. ### Block info Name: `CVTextOverlayBlock` (GStreamer element: `opencvtextoverlay`). | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Input video | Uncompressed video | 1 | | Output video | Uncompressed video | 1 | ### Settings The `CVTextOverlayBlock` is configured using `CVTextOverlaySettings`. Key properties: - `Text` (string): The text string to overlay. Default: `"Default text"`. - `X` (int): X-coordinate of the bottom-left corner of the text string. Default: `50`. - `Y` (int): Y-coordinate of the bottom-left corner of the text string (from the top, OpenCV origin is top-left, GStreamer textoverlay might be bottom-left). Default: `50`. - `FontWidth` (double): Font scale factor that is multiplied by the font-specific base size. Default: `1.0`. - `FontHeight` (double): Font scale factor (similar to FontWidth, though GStreamer element usually has one `font-scale` or relies on point size). Default: `1.0`. - `FontThickness` (int): Thickness of the lines used to draw a text. Default: `1`. - `Color` (`SKColor`): Color of the text. Default: `SKColors.Black`. ### Sample pipeline ```mermaid graph LR; SystemVideoSourceBlock-->CVTextOverlayBlock; CVTextOverlayBlock-->VideoRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Assuming SystemVideoSourceBlock is already created and configured as 'videoSource' var textOverlaySettings = new CVTextOverlaySettings { Text = "VisioForge MediaBlocks.Net ROCKS!", // Default: "Default text" X = 20, // X position of the text start. Default: 50 Y = 40, // Y position of the text baseline (from top). Default: 50 FontWidth = 1.2, // Font scale. Default: 1.0 FontHeight = 1.2, // Font scale (usually FontWidth is sufficient for opencvtextoverlay). 
Default: 1.0 FontThickness = 2, // Default: 1 Color = SKColors.Blue // Default: SKColors.Black }; var textOverlayBlock = new CVTextOverlayBlock(textOverlaySettings); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 // Connect blocks pipeline.Connect(videoSource.Output, textOverlayBlock.Input0); pipeline.Connect(textOverlayBlock.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux. ### Remarks Ensure the VisioForge OpenCV NuGet package is referenced. The GStreamer properties `colorR`, `colorG`, `colorB` are set based on the `Color` property. ## CV Tracker Block The CV Tracker block implements various object tracking algorithms to follow an object defined by an initial bounding box in a video stream. ### Block info Name: `CVTrackerBlock` (GStreamer element: `cvtracker`). | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Input video | Uncompressed video | 1 | | Output video | Uncompressed video | 1 | ### Settings The `CVTrackerBlock` is configured using `CVTrackerSettings`. Key properties: - `Algorithm` (`CVTrackerAlgorithm` enum): Specifies the tracking algorithm (`Boosting`, `CSRT`, `KCF`, `MedianFlow`, `MIL`, `MOSSE`, `TLD`). Default: `CVTrackerAlgorithm.MedianFlow`. - `InitialRect` (`Rect`): The initial bounding box (Left, Top, Width, Height) of the object to track. Default: `new Rect(50, 50, 100, 100)`. - `DrawRect` (bool): If `true`, draws a rectangle around the tracked object on the output video. Default: `true`. 
### Sample pipeline ```mermaid graph LR; SystemVideoSourceBlock-->CVTrackerBlock; CVTrackerBlock-->VideoRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Assuming SystemVideoSourceBlock is already created and configured as 'videoSource' var trackerSettings = new CVTrackerSettings { Algorithm = CVTrackerAlgorithm.CSRT, // CSRT is often a good general-purpose tracker. Default: CVTrackerAlgorithm.MedianFlow InitialRect = new VisioForge.Core.Types.Rect(150, 120, 80, 80), // Define your initial object ROI. Default: new Rect(50, 50, 100, 100) DrawRect = true // Default: true }; var trackerBlock = new CVTrackerBlock(trackerSettings); // Note: The tracker initializes with InitialRect. // To re-initialize tracking on a new object/location at runtime: // 1. Pause or Stop the pipeline. // 2. Update trackerBlock.Settings.InitialRect (or create new CVTrackerSettings). // It's generally safer to update settings on a stopped/paused pipeline, // or if the block/element supports dynamic property changes, that might be an option. // Directly modifying `trackerBlock.Settings.InitialRect` might not re-initialize the underlying GStreamer element. // You may need to remove and re-add the block, or check SDK documentation for live update capabilities. // 3. Resume/Start the pipeline. var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 // Connect blocks pipeline.Connect(videoSource.Output, trackerBlock.Input0); pipeline.Connect(trackerBlock.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux. ### Remarks Ensure the VisioForge OpenCV NuGet package is referenced. The choice of tracking algorithm can significantly impact performance and accuracy. Some algorithms (like CSRT, KCF) are generally more robust than older ones (like Boosting, MedianFlow). Some trackers might require OpenCV contrib modules to be available in your OpenCV build/distribution. 
---END OF PAGE--- # Local File: .\dotnet\mediablocks\OpenGL\index.md --- title: .Net Media OpenGL Video Effects Guide description: Explore a comprehensive guide to OpenGL video effects available in VisioForge Media Blocks SDK .Net. Learn about various effects, their settings, and related OpenGL functionalities. sidebar_label: OpenGL Effects --- # OpenGL Video Effects - VisioForge Media Blocks SDK .Net [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) OpenGL video effects in VisioForge Media Blocks SDK .Net allow for powerful, hardware-accelerated manipulation of video streams. These effects can be applied to video content processed within an OpenGL context, typically via blocks like `GLVideoEffectsBlock` or custom OpenGL rendering pipelines. This guide covers the available effects, their configuration settings, and other related OpenGL types. ## Base Effect: `GLBaseVideoEffect` All OpenGL video effects inherit from the `GLBaseVideoEffect` class, which provides common properties and events. | Property | Type | Description | |----------|-----------------------|--------------------------------------------------| | `Name` | `string` | The internal name of the effect (read-only). | | `ID` | `GLVideoEffectID` | The unique identifier for the effect (read-only). | | `Index` | `int` | The index of the effect in a chain. | **Events:** - `OnUpdate`: Occurs when effect properties need to be updated in the pipeline. Call `OnUpdateCall()` to trigger it. ## Available Video Effects This section details the various OpenGL video effects you can use. These effects are typically added to a `GLVideoEffectsBlock` or a similar OpenGL processing element. ### Alpha Effect (`GLAlphaVideoEffect`) Replaces a selected color with an alpha channel or sets/adjusts the existing alpha channel. 
**Properties:** | Property | Type | Default Value | Description | |--------------------|--------------------------|------------------|--------------------------------------------------------| | `Alpha` | `double` | `1.0` | The value for the alpha channel. | | `Angle` | `float` | `20` | The size of the colorcube to change (sensitivity radius for color matching). | | `BlackSensitivity` | `uint` | `100` | The sensitivity to dark colors. | | `Mode` | `GLAlphaVideoEffectMode` | `Set` | The method used for alpha modification. | | `NoiseLevel` | `float` | `2` | The size of noise radius (pixels to ignore around the matched color). | | `CustomColor` | `SKColor` | `SKColors.Green` | Custom color value for `Custom` chroma key mode. | | `WhiteSensitivity` | `uint` | `100` | The sensitivity to bright colors. | **Associated Enum: `GLAlphaVideoEffectMode`** Defines the mode of operation for the Alpha video effect. | Value | Description | |----------|----------------------------------------| | `Set` | Set/adjust alpha channel directly using the `Alpha` property. | | `Green` | Chroma Key on pure green. | | `Blue` | Chroma Key on pure blue. | | `Custom` | Chroma Key on the color specified by `CustomColor`. | ### Blur Effect (`GLBlurVideoEffect`) Applies a blur effect using a 9x9 separable convolution. This effect does not have additional configurable properties beyond those inherited from `GLBaseVideoEffect`. ### Bulge Effect (`GLBulgeVideoEffect`) Creates a bulge distortion on the video. This effect does not have additional configurable properties beyond those inherited from `GLBaseVideoEffect`. ### Color Balance Effect (`GLColorBalanceVideoEffect`) Adjusts the color balance of the video, including brightness, contrast, hue, and saturation. 
**Properties:** | Property | Type | Default Value | Description | |--------------|----------|---------------|--------------------------------------------------| | `Brightness` | `double` | `0` | Adjusts brightness (-1.0 to 1.0, 0 means no change). | | `Contrast` | `double` | `1` | Adjusts contrast (0.0 to infinity, 1 means no change). | | `Hue` | `double` | `0` | Adjusts hue (-1.0 to 1.0, 0 means no change). | | `Saturation` | `double` | `1` | Adjusts saturation (0.0 to infinity, 1 means no change). | ### Deinterlace Effect (`GLDeinterlaceVideoEffect`) Applies a deinterlacing filter to the video. **Properties:** | Property | Type | Default Value | Description | |----------|-----------------------|-----------------|-------------------------------------| | `Method` | `GLDeinterlaceMethod` | `VerticalBlur` | The deinterlacing method to use. | **Associated Enum: `GLDeinterlaceMethod`** Defines the method for the Deinterlace video effect. | Value | Description | |----------------|-----------------------------------------| | `VerticalBlur` | Vertical blur method. | | `MAAD` | Motion Adaptive: Advanced Detection. | ### Fish Eye Effect (`GLFishEyeVideoEffect`) Applies a fish-eye lens distortion effect. This effect does not have additional configurable properties beyond those inherited from `GLBaseVideoEffect`. ### Flip Effect (`GLFlipVideoEffect`) Flips or rotates the video. **Properties:** | Property | Type | Default Value | Description | |----------|---------------------|---------------|-------------------------------------| | `Method` | `GLFlipVideoMethod` | `None` | The flip or rotation method to use. | **Associated Enum: `GLFlipVideoMethod`** Defines the video flip or rotation method. | Value | Description | |----------------------|----------------------------------------------| | `None` | No rotation. | | `Clockwise` | Rotate clockwise 90 degrees. | | `Rotate180` | Rotate 180 degrees. | | `CounterClockwise` | Rotate counter-clockwise 90 degrees. 
| | `HorizontalFlip` | Flip horizontally. | | `VerticalFlip` | Flip vertically. | | `UpperLeftDiagonal` | Flip across upper left/lower right diagonal. | | `UpperRightDiagonal` | Flip across upper right/lower left diagonal. | ### Glow Lighting Effect (`GLGlowLightingVideoEffect`) Adds a glow lighting effect to the video. This effect does not have additional configurable properties beyond those inherited from `GLBaseVideoEffect`. ### Grayscale Effect (`GLGrayscaleVideoEffect`) Converts the video to grayscale. This effect does not have additional configurable properties beyond those inherited from `GLBaseVideoEffect`. ### Heat Effect (`GLHeatVideoEffect`) Applies a heat signature-like effect to the video. This effect does not have additional configurable properties beyond those inherited from `GLBaseVideoEffect`. ### Laplacian Effect (`GLLaplacianVideoEffect`) Applies a Laplacian edge detection filter. **Properties:** | Property | Type | Default Value | Description | |----------|---------|---------------|-------------------------------------------------------------------| | `Invert` | `bool` | `false` | If `true`, inverts colors to get dark edges on a bright background. | ### Light Tunnel Effect (`GLLightTunnelVideoEffect`) Creates a light tunnel visual effect. This effect does not have additional configurable properties beyond those inherited from `GLBaseVideoEffect`. ### Luma Cross Processing Effect (`GLLumaCrossProcessingVideoEffect`) Applies a luma cross-processing (often "xpro") effect. This effect does not have additional configurable properties beyond those inherited from `GLBaseVideoEffect`. ### Mirror Effect (`GLMirrorVideoEffect`) Applies a mirror effect to the video. This effect does not have additional configurable properties beyond those inherited from `GLBaseVideoEffect`. ### Resize Effect (`GLResizeVideoEffect`) Resizes the video to the specified dimensions. 
**Properties:** | Property | Type | Description | |----------|-------|----------------------------------------| | `Width` | `int` | The target width for the video resize. | | `Height` | `int` | The target height for the video resize.| ### Sepia Effect (`GLSepiaVideoEffect`) Applies a sepia tone effect to the video. This effect does not have additional configurable properties beyond those inherited from `GLBaseVideoEffect`. ### Sin City Effect (`GLSinCityVideoEffect`) Applies a "Sin City" movie style effect (grayscale with red highlights). This effect does not have additional configurable properties beyond those inherited from `GLBaseVideoEffect`. ### Sobel Effect (`GLSobelVideoEffect`) Applies a Sobel edge detection filter. **Properties:** | Property | Type | Default Value | Description | |----------|---------|---------------|-------------------------------------------------------------------| | `Invert` | `bool` | `false` | If `true`, inverts colors to get dark edges on a bright background. | ### Square Effect (`GLSquareVideoEffect`) Applies a "square" distortion or pixelation effect. This effect does not have additional configurable properties beyond those inherited from `GLBaseVideoEffect`. ### Squeeze Effect (`GLSqueezeVideoEffect`) Applies a squeeze distortion effect. This effect does not have additional configurable properties beyond those inherited from `GLBaseVideoEffect`. ### Stretch Effect (`GLStretchVideoEffect`) Applies a stretch distortion effect. This effect does not have additional configurable properties beyond those inherited from `GLBaseVideoEffect`. ### Transformation Effect (`GLTransformationVideoEffect`) Applies 3D transformations to the video, including rotation, scaling, and translation. 
**Properties:** | Property | Type | Default Value | Description | |----------------|---------|---------------|-----------------------------------------------------------------------| | `FOV` | `float` | `90.0f` | Field of view angle in degrees for perspective projection. | | `Ortho` | `bool` | `false` | If `true`, uses orthographic projection; otherwise, perspective. | | `PivotX` | `float` | `0.0f` | X-coordinate of the rotation pivot point (0 is center). | | `PivotY` | `float` | `0.0f` | Y-coordinate of the rotation pivot point (0 is center). | | `PivotZ` | `float` | `0.0f` | Z-coordinate of the rotation pivot point (0 is center). | | `RotationX` | `float` | `0.0f` | Rotation around the X-axis in degrees. | | `RotationY` | `float` | `0.0f` | Rotation around the Y-axis in degrees. | | `RotationZ` | `float` | `0.0f` | Rotation around the Z-axis in degrees. | | `ScaleX` | `float` | `1.0f` | Scale multiplier for the X-axis. | | `ScaleY` | `float` | `1.0f` | Scale multiplier for the Y-axis. | | `TranslationX` | `float` | `0.0f` | Translation along the X-axis (universal coordinates [0-1]). | | `TranslationY` | `float` | `0.0f` | Translation along the Y-axis (universal coordinates [0-1]). | | `TranslationZ` | `float` | `0.0f` | Translation along the Z-axis (universal coordinates [0-1], depth). | ### Twirl Effect (`GLTwirlVideoEffect`) Applies a twirl distortion effect. This effect does not have additional configurable properties beyond those inherited from `GLBaseVideoEffect`. ### X-Ray Effect (`GLXRayVideoEffect`) Applies an X-ray like visual effect. This effect does not have additional configurable properties beyond those inherited from `GLBaseVideoEffect`. ## OpenGL Effect Identification: `GLVideoEffectID` Enum This enumeration lists all available OpenGL video effect types, used by `GLBaseVideoEffect.ID`. | Value | Description | |------------------|-------------------------------------------| | `ColorBalance` | The color balance effect. 
| | `Grayscale` | The grayscale effect. | | `Resize` | The resize effect. | | `Deinterlace` | The deinterlace effect. | | `Flip` | The flip effect. | | `Blur` | Blur with 9x9 separable convolution effect. | | `FishEye` | The fish eye effect. | | `GlowLighting` | The glow lighting effect. | | `Heat` | The heat signature effect. | | `LumaX` | The luma cross processing effect. | | `Mirror` | The mirror effect. | | `Sepia` | The sepia effect. | | `Square` | The square effect. | | `XRay` | The X-ray effect. | | `Stretch` | The stretch effect. | | `LightTunnel` | The light tunnel effect. | | `Twirl` | The twirl effect. | | `Squeeze` | The squeeze effect. | | `SinCity` | The sin city movie gray-red effect. | | `Bulge` | The bulge effect. | | `Sobel` | The sobel effect. | | `Laplacian` | The laplacian effect. | | `Alpha` | The alpha channels effect. | | `Transformation` | The transformation effect. | ## OpenGL Rendering and View Configuration These types assist in configuring how video is rendered or viewed in an OpenGL context, especially for specialized scenarios like VR or custom display setups. ### Equirectangular View Settings (`GLEquirectangularViewSettings`) Manages settings for rendering equirectangular (360-degree) video, commonly used in VR applications. Implements `IVRVideoControl`. **Properties:** | Property | Type | Default | Description | |-----------------|--------------|-------------------|------------------------------------------------| | `VideoWidth` | `int` | (readonly) | Width of the source video. | | `VideoHeight` | `int` | (readonly) | Height of the source video. | | `FieldOfView` | `float` | `80.0f` | Field of view in degrees. | | `Yaw` | `float` | `0.0f` | Yaw (rotation around Y-axis) in degrees. | | `Pitch` | `float` | `0.0f` | Pitch (rotation around X-axis) in degrees. | | `Roll` | `float` | `0.0f` | Roll (rotation around Z-axis) in degrees. | | `Mode` | `VRMode` | `Equirectangular` | The VR mode (supports `Equirectangular`). 
| **Methods:** - `IsModeSupported(VRMode mode)`: Checks if the specified `VRMode` is supported. **Events:** - `SettingsChanged`: Occurs when any view setting is changed. ### Video Renderer Settings (`GLVideoRendererSettings`) Configures general properties for an OpenGL-based video renderer. **Properties:** | Property | Type | Default | Description | |--------------------|-------------------------------|-------------|----------------------------------------------------------------------| | `ForceAspectRatio` | `bool` | `true` | Whether scaling will respect the original aspect ratio. | | `IgnoreAlpha` | `bool` | `true` | Whether alpha channel will be ignored (treated as black). | | `PixelAspectRatio` | `System.Tuple` | `(0, 1)` | Pixel aspect ratio of the display device (numerator, denominator). | | `Rotation` | `GLVideoRendererRotateMethod` | `None` | Specifies the rotation applied to the video. | **Associated Enum: `GLVideoRendererRotateMethod`** Defines rotation methods for the OpenGL video renderer. | Value | Description | |------------------|-----------------------------------------| | `None` | No rotation. | | `_90C` | Rotate 90 degrees clockwise. | | `_180` | Rotate 180 degrees. | | `_90CC` | Rotate 90 degrees counter-clockwise. | | `FlipHorizontal` | Flip horizontally. | | `FlipVertical` | Flip vertically. | ## Custom OpenGL Shaders Allows for the application of custom GLSL shaders to the video stream. ### Shader Definition (`GLShader`) Represents a pair of vertex and fragment shaders. **Properties:** | Property | Type | Description | |------------------|----------|-----------------------------------------------| | `VertexShader` | `string` | The GLSL source code for the vertex shader. | | `FragmentShader` | `string` | The GLSL source code for the fragment shader. | **Constructors:** - `GLShader()` - `GLShader(string vertexShader, string fragmentShader)` ### Shader Settings (`GLShaderSettings`) Configures custom GLSL shaders for use in the pipeline. 
**Properties:** | Property | Type | Description | |------------|--------------------------------------|--------------------------------------------------| | `Vertex` | `string` | The GLSL source code for the vertex shader. | | `Fragment` | `string` | The GLSL source code for the fragment shader. | | `Uniforms` | `System.Collections.Generic.Dictionary` | A dictionary of uniform variables (parameters) to be passed to the shaders. | **Constructors:** - `GLShaderSettings()` - `GLShaderSettings(string vertex, string fragment)` - `GLShaderSettings(GLShader shader)` ## Image Overlays in OpenGL Provides settings for overlaying static images onto a video stream within an OpenGL context. ### Overlay Settings (`GLOverlaySettings`) Defines the properties of an image overlay. **Properties:** | Property | Type | Default | Description | |------------|----------|---------|---------------------------------------------------| | `Filename` | `string` | (N/A) | Path to the image file (read-only after init). | | `Data` | `byte[]` | (N/A) | Image data as a byte array (read-only after init).| | `X` | `int` | | X-coordinate of the overlay's top-left corner. | | `Y` | `int` | | Y-coordinate of the overlay's top-left corner. | | `Width` | `int` | | Width of the overlay. | | `Height` | `int` | | Height of the overlay. | | `Alpha` | `double` | `1.0` | Opacity of the overlay (0.0 transparent to 1.0 opaque). | **Constructor:** - `GLOverlaySettings(string filename)` ## OpenGL Video Mixing These types are used to configure an OpenGL-based video mixer, allowing multiple video streams to be combined and composited. ### Mixer Settings (`GLVideoMixerSettings`) Extends `VideoMixerBaseSettings` for OpenGL-specific video mixing. It manages a list of `GLVideoMixerStream` objects and inherits properties like `Width`, `Height`, and `FrameRate`. **Methods:** - `AddStream(GLVideoMixerStream stream)`: Adds a stream to the mixer. - `RemoveStream(GLVideoMixerStream stream)`: Removes a stream from the mixer. 
- `SetStream(int index, GLVideoMixerStream stream)`: Replaces a stream at a specific index. **Constructors:** - `GLVideoMixerSettings(int width, int height, VideoFrameRate frameRate)` - `GLVideoMixerSettings(int width, int height, VideoFrameRate frameRate, List<GLVideoMixerStream> streams)` ### Mixer Stream (`GLVideoMixerStream`) Extends `VideoMixerStream` and defines properties for an individual stream within the OpenGL video mixer. Inherits `Rectangle`, `ZOrder`, and `Alpha` from `VideoMixerStream`. **Properties:** | Property | Type | Default | Description | |---------------------------------|-------------------------------|------------------------------|--------------------------------------------------| | `Crop` | `Rect` | (N/A) | Crop rectangle for the input stream. | | `BlendConstantColorAlpha` | `double` | `0` | Alpha component for constant blend color. | | `BlendConstantColorBlue` | `double` | `0` | Blue component for constant blend color. | | `BlendConstantColorGreen` | `double` | `0` | Green component for constant blend color. | | `BlendConstantColorRed` | `double` | `0` | Red component for constant blend color. | | `BlendEquationAlpha` | `GLVideoMixerBlendEquation` | `Add` | Blend equation for the alpha channel. | | `BlendEquationRGB` | `GLVideoMixerBlendEquation` | `Add` | Blend equation for RGB channels. | | `BlendFunctionDestinationAlpha` | `GLVideoMixerBlendFunction` | `OneMinusSourceAlpha` | Blend function for destination alpha. | | `BlendFunctionDesctinationRGB` | `GLVideoMixerBlendFunction` | `OneMinusSourceAlpha` | Blend function for destination RGB. | | `BlendFunctionSourceAlpha` | `GLVideoMixerBlendFunction` | `One` | Blend function for source alpha. | | `BlendFunctionSourceRGB` | `GLVideoMixerBlendFunction` | `SourceAlpha` | Blend function for source RGB. 
| **Constructor:** - `GLVideoMixerStream(Rect rectangle, uint zorder, double alpha = 1.0)` ### Blend Equation (`GLVideoMixerBlendEquation` Enum) Specifies how source and destination colors are combined during blending. | Value | Description | |-------------------|-------------------------------------------------| | `Add` | Source + Destination | | `Subtract` | Source - Destination | | `ReverseSubtract` | Destination - Source | ### Blend Function (`GLVideoMixerBlendFunction` Enum) Defines factors for source and destination colors in blending operations. (Rs, Gs, Bs, As are source color components; Rd, Gd, Bd, Ad are destination; Rc, Gc, Bc, Ac are constant color components). | Value | Description | |----------------------------|---------------------------------------------| | `Zero` | Factor is (0, 0, 0, 0). | | `One` | Factor is (1, 1, 1, 1). | | `SourceColor` | Factor is (Rs, Gs, Bs, As). | | `OneMinusSourceColor` | Factor is (1-Rs, 1-Gs, 1-Bs, 1-As). | | `DestinationColor` | Factor is (Rd, Gd, Bd, Ad). | | `OneMinusDestinationColor` | Factor is (1-Rd, 1-Gd, 1-Bd, 1-Ad). | | `SourceAlpha` | Factor is (As, As, As, As). | | `OneMinusSourceAlpha` | Factor is (1-As, 1-As, 1-As, 1-As). | | `DestinationAlpha` | Factor is (Ad, Ad, Ad, Ad). | | `OneMinusDestinationAlpha` | Factor is (1-Ad, 1-Ad, 1-Ad, 1-Ad). | | `ConstantColor` | Factor is (Rc, Gc, Bc, Ac). | | `OneMinusContantColor` | Factor is (1-Rc, 1-Gc, 1-Bc, 1-Ac). | | `ConstantAlpha` | Factor is (Ac, Ac, Ac, Ac). | | `OneMinusContantAlpha` | Factor is (1-Ac, 1-Ac, 1-Ac, 1-Ac). | | `SourceAlphaSaturate` | Factor is (min(As, 1-Ad), min(As, 1-Ad), min(As, 1-Ad), 1). | ## Virtual Test Sources for OpenGL These settings classes are used to configure virtual sources that generate test patterns directly within an OpenGL context. ### Virtual Video Source Settings (`GLVirtualVideoSourceSettings`) Configures a source block (`GLVirtualVideoSourceBlock`) that produces test video data. 
Implements `IMediaPlayerBaseSourceSettings` and `IVideoCaptureBaseVideoSourceSettings`. **Properties:** | Property | Type | Default | Description | |-------------|----------------------------|------------------------|--------------------------------------------------| | `Width` | `int` | `1280` | Width of the output video. | | `Height` | `int` | `720` | Height of the output video. | | `FrameRate` | `VideoFrameRate` | `30/1` (30 fps) | Frame rate of the output video. | | `IsLive` | `bool` | `true` | Indicates if the source is live. | | `Mode` | `GLVirtualVideoSourceMode` | (N/A - must be set) | Specifies the type of test pattern to generate. | **Associated Enum: `GLVirtualVideoSourceMode`** Defines the test pattern generated by `GLVirtualVideoSourceBlock`. | Value | Description | |---------------|------------------------------| | `SMPTE` | SMPTE 100% color bars. | | `Snow` | Random (television snow). | | `Black` | 100% Black. | | `White` | 100% White. | | `Red` | Solid Red color. | | `Green` | Solid Green color. | | `Blue` | Solid Blue color. | | `Checkers1` | Checkerboard pattern (1px). | | `Checkers2` | Checkerboard pattern (2px). | | `Checkers4` | Checkerboard pattern (4px). | | `Checkers8` | Checkerboard pattern (8px). | | `Circular` | Circular pattern. | | `Blink` | Blinking pattern. | | `Mandelbrot` | Mandelbrot fractal. | **Methods:** - `Task ReadInfoAsync()`: Asynchronously reads media information (returns synthetic info based on settings). - `MediaBlock CreateBlock()`: Creates a `GLVirtualVideoSourceBlock` instance configured with these settings. ---END OF PAGE--- # Local File: .\dotnet\mediablocks\Outputs\index.md --- title: .Net Media Output Blocks Guide description: Explore a complete guide to .Net Media SDK output blocks. Learn about file and network sinks for your media processing pipelines. 
sidebar_label: Outputs --- # Output Blocks - VisioForge Media Blocks SDK .Net [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) Output blocks, also known as sinks, are responsible for writing media data to files, network streams, or other destinations. They are typically the last blocks in any media processing chain. VisioForge Media Blocks SDK .Net provides a comprehensive collection of output blocks for various formats and protocols. This guide covers file output blocks like MP4, AVI, WebM, MKV, and network streaming blocks for protocols such as RTMP (used by YouTube and Facebook Live). ## AVI Output Block The `AVIOutputBlock` is used to create AVI files. It combines video and audio encoders with a file sink to produce `.avi` files. ### Block info Name: `AVIOutputBlock`. | Pin direction | Media type | Expected Encoders | | --- | :---: | :---: | | Input Video | various | H264 (default), other `IVideoEncoder` compatible encoders | | Input Audio | various | AAC (default), MP3, other `IAudioEncoder` compatible encoders | ### Settings The `AVIOutputBlock` is configured using `AVISinkSettings` along with settings for the chosen video and audio encoders (e.g., `IH264EncoderSettings` and `IAACEncoderSettings` or `MP3EncoderSettings`). Key `AVISinkSettings` properties: - `Filename` (string): The path to the output AVI file. Constructors: - `AVIOutputBlock(string filename)`: Uses default H264 video and AAC audio encoders. - `AVIOutputBlock(AVISinkSettings sinkSettings, IH264EncoderSettings h264settings, IAACEncoderSettings aacSettings)`: Uses specified H264 video and AAC audio encoders. - `AVIOutputBlock(AVISinkSettings sinkSettings, IH264EncoderSettings h264settings, MP3EncoderSettings mp3Settings)`: Uses specified H264 video and MP3 audio encoders. 
### The sample pipeline ```mermaid graph LR; VideoSource-->VideoEncoder; AudioSource-->AudioEncoder; VideoEncoder-->AVIOutputBlock; AudioEncoder-->AVIOutputBlock; ``` Alternatively, because the `AVIOutputBlock` instantiates its encoders internally based on the provided settings, source outputs can be connected to it directly: ```mermaid graph LR; UniversalSourceBlock--Video Output-->AVIOutputBlock; UniversalSourceBlock--Audio Output-->AVIOutputBlock; ``` ### Sample code ```csharp // create pipeline var pipeline = new MediaBlocksPipeline(); // create video source (example: virtual source) var videoSource = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings()); // create audio source (example: virtual source) var audioSource = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); // create AVI output block // This constructor uses default H264 video and AAC audio encoders internally. var aviOutput = new AVIOutputBlock("output.avi"); // Create inputs for the AVI output block var videoInputPad = aviOutput.CreateNewInput(MediaBlockPadMediaType.Video); var audioInputPad = aviOutput.CreateNewInput(MediaBlockPadMediaType.Audio); // connect video path pipeline.Connect(videoSource.Output, videoInputPad); // connect audio path pipeline.Connect(audioSource.Output, audioInputPad); // start pipeline await pipeline.StartAsync(); // ... later, to stop ... // await pipeline.StopAsync(); ``` ### Remarks The `AVIOutputBlock` internally manages encoder instances (like `H264Encoder` and `AACEncoder` or `MP3Encoder`) based on the provided settings. Ensure the necessary GStreamer plugins and SDK components for these encoders and the AVI muxer are available. To check availability: `AVIOutputBlock.IsAvailable(IH264EncoderSettings h264settings, IAACEncoderSettings aacSettings)` ### Platforms Primarily Windows. Availability on other platforms depends on GStreamer plugin support for AVI muxing and the chosen encoders.
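Putting the remarks above together, a guarded setup can probe availability before constructing the block. The sketch below uses the custom-encoder constructor rather than the default one; it assumes `AVISinkSettings` takes the filename in its constructor like the other sink settings shown in this guide, and reuses the default-settings helpers (`H264EncoderBlock.GetDefaultSettings()`, `AACEncoderBlock.GetDefaultSettings()`) that appear in the other samples:

```csharp
// Sketch: check that the encoders and AVI muxer are present before building the branch.
var h264Settings = H264EncoderBlock.GetDefaultSettings();
var aacSettings = AACEncoderBlock.GetDefaultSettings();

if (AVIOutputBlock.IsAvailable(h264Settings, aacSettings))
{
    // Assumption: AVISinkSettings accepts the output filename in its constructor.
    var sinkSettings = new AVISinkSettings("output.avi");
    var aviOutput = new AVIOutputBlock(sinkSettings, h264Settings, aacSettings);

    // Create inputs and connect sources as in the sample above.
    var videoInputPad = aviOutput.CreateNewInput(MediaBlockPadMediaType.Video);
    var audioInputPad = aviOutput.CreateNewInput(MediaBlockPadMediaType.Audio);
}
else
{
    // Report that the required GStreamer plugins or SDK components are missing.
}
```

Checking availability up front avoids a pipeline start failure on systems where the AVI muxer or the chosen encoders are not installed.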
## Facebook Live Output Block The `FacebookLiveOutputBlock` is designed to stream video and audio to Facebook Live using RTMP. It internally uses H.264 video and AAC audio encoders. ### Block info Name: `FacebookLiveOutputBlock`. | Pin direction | Media type | Expected Encoders | | --- | :---: | :---: | | Input Video | various | H.264 (internal) | | Input Audio | various | AAC (internal) | ### Settings The `FacebookLiveOutputBlock` is configured using `FacebookLiveSinkSettings`, `IH264EncoderSettings`, and `IAACEncoderSettings`. Key `FacebookLiveSinkSettings` properties: - `Url` (string): The RTMP URL provided by Facebook Live for streaming. Constructor: - `FacebookLiveOutputBlock(FacebookLiveSinkSettings sinkSettings, IH264EncoderSettings h264settings, IAACEncoderSettings aacSettings)` ### The sample pipeline ```mermaid graph LR; VideoSource-->FacebookLiveOutputBlock; AudioSource-->FacebookLiveOutputBlock; ``` ### Sample code ```csharp // create pipeline var pipeline = new MediaBlocksPipeline(); // create video source (e.g., SystemVideoSourceBlock) var videoSource = new SystemVideoSourceBlock(videoSourceSettings); // Assuming videoSourceSettings are configured // create audio source (e.g., SystemAudioSourceBlock) var audioSource = new SystemAudioSourceBlock(audioSourceSettings); // Assuming audioSourceSettings are configured // configure Facebook Live sink settings var fbSinkSettings = new FacebookLiveSinkSettings("rtmp://your-facebook-live-url/your-stream-key"); // configure H.264 encoder settings (use defaults or customize) var h264Settings = H264EncoderBlock.GetDefaultSettings(); h264Settings.Bitrate = 4000000; // Example: 4 Mbps // configure AAC encoder settings (use defaults or customize) var aacSettings = AACEncoderBlock.GetDefaultSettings(); aacSettings.Bitrate = 128000; // Example: 128 Kbps // create Facebook Live output block var facebookOutput = new FacebookLiveOutputBlock(fbSinkSettings, h264Settings, aacSettings); // Create inputs for the Facebook Live 
output block var videoInputPad = facebookOutput.CreateNewInput(MediaBlockPadMediaType.Video); var audioInputPad = facebookOutput.CreateNewInput(MediaBlockPadMediaType.Audio); // connect video path pipeline.Connect(videoSource.Output, videoInputPad); // connect audio path pipeline.Connect(audioSource.Output, audioInputPad); // start pipeline await pipeline.StartAsync(); // ... later, to stop ... // await pipeline.StopAsync(); ``` ### Remarks This block encapsulates the necessary H.264 and AAC encoders and the RTMP sink (`FacebookLiveSink`). Ensure that the `FacebookLiveSink`, `H264Encoder`, and `AACEncoder` components are available; `FacebookLiveSink.IsAvailable()` can be used to check the sink. ### Platforms Windows, macOS, Linux, iOS, Android (Platform availability depends on GStreamer RTMP support and encoder availability). ## FLAC Output Block The `FLACOutputBlock` is used for creating FLAC (Free Lossless Audio Codec) audio files. It takes uncompressed audio data, encodes it using a FLAC encoder, and saves it to a `.flac` file. ### Block info Name: `FLACOutputBlock`. | Pin direction | Media type | Expected Encoders | | --- | :---: | :---: | | Input Audio | uncompressed audio | FLAC (internal) | ### Settings The `FLACOutputBlock` is configured with a filename and `FLACEncoderSettings`. Key `FLACEncoderSettings` properties (refer to `FLACEncoderSettings` documentation for full details): - Quality level, compression level, etc.
Constructor: - `FLACOutputBlock(string filename, FLACEncoderSettings settings)` ### The sample pipeline ```mermaid graph LR; AudioSource-->FLACOutputBlock; ``` ### Sample code ```csharp // create pipeline var pipeline = new MediaBlocksPipeline(); // create audio source (example: virtual audio source) var audioSource = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); // configure FLAC encoder settings var flacSettings = new FLACEncoderSettings(); // flacSettings.Quality = 8; // Example: Set quality level (0-8, default is 5) // create FLAC output block var flacOutput = new FLACOutputBlock("output.flac", flacSettings); // connect audio path pipeline.Connect(audioSource.Output, flacOutput.Input); // start pipeline await pipeline.StartAsync(); // ... later, to stop ... // await pipeline.StopAsync(); ``` ### Remarks This block combines a `FLACEncoder` and a `FileSink` internally. To check if the block and its dependencies are available: `FLACOutputBlock.IsAvailable()` (This checks for `FLACEncoder` and `FileSink` availability). ### Platforms Windows, macOS, Linux, iOS, Android (Platform availability depends on GStreamer FLAC encoder and file sink support). ## M4A Output Block The `M4AOutputBlock` creates M4A (MPEG-4 Audio) files, commonly used for AAC encoded audio. It uses an AAC audio encoder and an MP4 sink to produce `.m4a` files. ### Block info Name: `M4AOutputBlock`. | Pin direction | Media type | Expected Encoders | | --- | :---: | :---: | | Input Audio | various | AAC (internal) | ### Settings The `M4AOutputBlock` is configured using `MP4SinkSettings` and `IAACEncoderSettings`. Key `MP4SinkSettings` properties: - `Filename` (string): The path to the output M4A file. Key `IAACEncoderSettings` properties (refer to `AACEncoderSettings` for details): - Bitrate, profile, etc. Constructors: - `M4AOutputBlock(string filename)`: Uses default AAC encoder settings. 
- `M4AOutputBlock(MP4SinkSettings sinkSettings, IAACEncoderSettings aacSettings)`: Uses specified AAC encoder settings. ### The sample pipeline ```mermaid graph LR; AudioSource-->M4AOutputBlock; ``` ### Sample code ```csharp // create pipeline var pipeline = new MediaBlocksPipeline(); // create audio source (example: virtual audio source) var audioSource = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); // configure M4A output block with default AAC settings var m4aOutput = new M4AOutputBlock("output.m4a"); // Or, with custom AAC settings: // var sinkSettings = new MP4SinkSettings("output.m4a"); // var aacSettings = AACEncoderBlock.GetDefaultSettings(); // aacSettings.Bitrate = 192000; // Example: 192 Kbps // var m4aOutput = new M4AOutputBlock(sinkSettings, aacSettings); // Create input for the M4A output block var audioInputPad = m4aOutput.CreateNewInput(MediaBlockPadMediaType.Audio); // connect audio path pipeline.Connect(audioSource.Output, audioInputPad); // start pipeline await pipeline.StartAsync(); // ... later, to stop ... // await pipeline.StopAsync(); ``` ### Remarks The `M4AOutputBlock` internally manages an `AACEncoder` and an `MP4Sink`. To check availability: `M4AOutputBlock.IsAvailable(IAACEncoderSettings aacSettings)` ### Platforms Windows, macOS, Linux, iOS, Android (Platform availability depends on GStreamer MP4 muxer and AAC encoder support). ## MKV Output Block The `MKVOutputBlock` is used to create Matroska (MKV) files. MKV is a flexible container format that can hold various video, audio, and subtitle streams. This block combines specified video and audio encoders with an MKV sink. ### Block info Name: `MKVOutputBlock`. 
| Pin direction | Media type | Expected Encoders | | --- | :---: | :---: | | Input Video | various | `IVideoEncoder` (e.g., H.264, HEVC, VPX, AV1) | | Input Audio | various | `IAudioEncoder` (e.g., AAC, MP3, Vorbis, Opus, Speex) | ### Settings The `MKVOutputBlock` is configured using `MKVSinkSettings`, along with settings for the chosen video (`IVideoEncoder`) and audio (`IAudioEncoder`) encoders. Key `MKVSinkSettings` properties: - `Filename` (string): The path to the output MKV file. Constructors: - `MKVOutputBlock(MKVSinkSettings sinkSettings, IVideoEncoder videoSettings, IAudioEncoder audioSettings)` ### The sample pipeline ```mermaid graph LR; VideoSource-->VideoEncoder; AudioSource-->AudioEncoder; VideoEncoder-->MKVOutputBlock; AudioEncoder-->MKVOutputBlock; ``` More directly, if `MKVOutputBlock` handles encoder instantiation internally: ```mermaid graph LR; VideoSource-->MKVOutputBlock; AudioSource-->MKVOutputBlock; ``` ### Sample code ```csharp // create pipeline var pipeline = new MediaBlocksPipeline(); // create video source (example: virtual source) var videoSource = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings()); // create audio source (example: virtual source) var audioSource = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); // configure MKV sink settings var mkvSinkSettings = new MKVSinkSettings("output.mkv"); // configure video encoder (example: H.264) var h264Settings = H264EncoderBlock.GetDefaultSettings(); // h264Settings.Bitrate = 5000000; // Example // configure audio encoder (example: AAC) var aacSettings = AACEncoderBlock.GetDefaultSettings(); // aacSettings.Bitrate = 128000; // Example // create MKV output block var mkvOutput = new MKVOutputBlock(mkvSinkSettings, h264Settings, aacSettings); // Create inputs for the MKV output block var videoInputPad = mkvOutput.CreateNewInput(MediaBlockPadMediaType.Video); var audioInputPad = mkvOutput.CreateNewInput(MediaBlockPadMediaType.Audio); // connect video path 
pipeline.Connect(videoSource.Output, videoInputPad); // connect audio path pipeline.Connect(audioSource.Output, audioInputPad); // start pipeline await pipeline.StartAsync(); // ... later, to stop ... // await pipeline.StopAsync(); ``` ### Remarks The `MKVOutputBlock` internally manages the specified video and audio encoder instances (e.g., `H264Encoder`, `HEVCEncoder`, `AACEncoder`, `VorbisEncoder`, etc.) and an `MKVSink`. Supported video encoders include H.264, HEVC, VPX (VP8/VP9), AV1. Supported audio encoders include AAC, MP3, Vorbis, Opus, Speex. To check availability (example with H.264 and AAC): `MKVOutputBlock.IsAvailable(IH264EncoderSettings h264settings, IAACEncoderSettings aacSettings)` ### Platforms Windows, macOS, Linux, iOS, Android (Platform availability depends on GStreamer MKV muxer and chosen encoder support). ## MP3 Output Block The `MP3OutputBlock` is used for creating MP3 audio files. It encodes uncompressed audio data using an MP3 encoder and saves it to an `.mp3` file. ### Block info Name: `MP3OutputBlock`. | Pin direction | Media type | Expected Encoders | | --- | :---: | :---: | | Input Audio | uncompressed audio | MP3 (internal) | ### Settings The `MP3OutputBlock` is configured with a filename and `MP3EncoderSettings`. Key `MP3EncoderSettings` properties (refer to `MP3EncoderSettings` documentation for full details): - Bitrate, quality, channel mode, etc. 
Constructor: - `MP3OutputBlock(string filename, MP3EncoderSettings mp3Settings)` ### The sample pipeline ```mermaid graph LR; AudioSource-->MP3OutputBlock; ``` ### Sample code ```csharp // create pipeline var pipeline = new MediaBlocksPipeline(); // create audio source (example: virtual audio source) var audioSource = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); // configure MP3 encoder settings var mp3Settings = new MP3EncoderSettings(); // mp3Settings.Bitrate = 192; // Example: Set bitrate to 192 kbps // mp3Settings.Quality = MP3Quality.Best; // Example: Set quality // create MP3 output block var mp3Output = new MP3OutputBlock("output.mp3", mp3Settings); // connect audio path pipeline.Connect(audioSource.Output, mp3Output.Input); // start pipeline await pipeline.StartAsync(); // ... later, to stop ... // await pipeline.StopAsync(); ``` ### Remarks This block combines an `MP3Encoder` and a `FileSink` internally. To check if the block and its dependencies are available: `MP3OutputBlock.IsAvailable()` (This checks for `MP3Encoder` and `FileSink` availability). ### Platforms Windows, macOS, Linux, iOS, Android (Platform availability depends on GStreamer MP3 encoder (e.g., LAME) and file sink support). ## MP4 Output Block The `MP4OutputBlock` is used for creating MP4 files. It can combine various video and audio encoders with an MP4 sink to produce `.mp4` files. ### Block info Name: `MP4OutputBlock`. | Pin direction | Media type | Expected Encoders | | --- | :---: | :---: | | Input Video | various | `IVideoEncoder` (e.g., H.264, HEVC) | | Input Audio | various | `IAudioEncoder` (e.g., AAC, MP3) | ### Settings The `MP4OutputBlock` is configured using `MP4SinkSettings`, along with settings for the chosen video (`IVideoEncoder`, typically `IH264EncoderSettings` or `IHEVCEncoderSettings`) and audio (`IAudioEncoder`, typically `IAACEncoderSettings` or `MP3EncoderSettings`) encoders. 
Key `MP4SinkSettings` properties: - `Filename` (string): The path to the output MP4 file. - Can also be `MP4SplitSinkSettings` for recording in segments. Constructors: - `MP4OutputBlock(string filename)`: Uses default H.264 video and AAC audio encoders. - `MP4OutputBlock(MP4SinkSettings sinkSettings, IH264EncoderSettings h264settings, IAACEncoderSettings aacSettings)` - `MP4OutputBlock(MP4SinkSettings sinkSettings, IH264EncoderSettings h264settings, MP3EncoderSettings mp3Settings)` - `MP4OutputBlock(MP4SinkSettings sinkSettings, IHEVCEncoderSettings hevcSettings, IAACEncoderSettings aacSettings)` - `MP4OutputBlock(MP4SinkSettings sinkSettings, IHEVCEncoderSettings hevcSettings, MP3EncoderSettings mp3Settings)` ### The sample pipeline ```mermaid graph LR; VideoSource-->VideoEncoder; AudioSource-->AudioEncoder; VideoEncoder-->MP4OutputBlock; AudioEncoder-->MP4OutputBlock; ``` If `MP4OutputBlock` uses its default internal encoders: ```mermaid graph LR; VideoSource-->MP4OutputBlock; AudioSource-->MP4OutputBlock; ``` ### Sample code ```csharp // create pipeline var pipeline = new MediaBlocksPipeline(); // create video source (example: virtual source) var videoSource = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings()); // create audio source (example: virtual source) var audioSource = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); // create MP4 output block with default H.264 video and AAC audio encoders var mp4Output = new MP4OutputBlock("output.mp4"); // Or, with custom H.264 and AAC settings: // var sinkSettings = new MP4SinkSettings("output.mp4"); // var h264Settings = H264EncoderBlock.GetDefaultSettings(); // h264Settings.Bitrate = 8000000; // Example: 8 Mbps // var aacSettings = AACEncoderBlock.GetDefaultSettings(); // aacSettings.Bitrate = 192000; // Example: 192 Kbps // var mp4Output = new MP4OutputBlock(sinkSettings, h264Settings, aacSettings); // Create inputs for the MP4 output block var videoInputPad = 
mp4Output.CreateNewInput(MediaBlockPadMediaType.Video); var audioInputPad = mp4Output.CreateNewInput(MediaBlockPadMediaType.Audio); // connect video path pipeline.Connect(videoSource.Output, videoInputPad); // connect audio path pipeline.Connect(audioSource.Output, audioInputPad); // start pipeline await pipeline.StartAsync(); // ... later, to stop ... // await pipeline.StopAsync(); ``` ### Remarks The `MP4OutputBlock` internally manages video (e.g., `H264Encoder`, `HEVCEncoder`) and audio (e.g., `AACEncoder`, `MP3Encoder`) encoder instances along with an `MP4Sink`. To check availability (example with H.264 and AAC): `MP4OutputBlock.IsAvailable(IH264EncoderSettings h264settings, IAACEncoderSettings aacSettings)` ### Platforms Windows, macOS, Linux, iOS, Android (Platform availability depends on GStreamer MP4 muxer and chosen encoder support). ## OGG Opus Output Block The `OGGOpusOutputBlock` is used for creating Ogg Opus audio files. It encodes uncompressed audio data using an Opus encoder and multiplexes it into an Ogg container, saving to an `.opus` or `.ogg` file. ### Block info Name: `OGGOpusOutputBlock`. | Pin direction | Media type | Expected Encoders | | --- | :---: | :---: | | Input Audio | uncompressed audio | Opus (internal) | ### Settings The `OGGOpusOutputBlock` is configured with a filename and `OPUSEncoderSettings`. Key `OPUSEncoderSettings` properties (refer to `OPUSEncoderSettings` documentation for full details): - Bitrate, complexity, frame duration, audio type (voice/music), etc. 
Constructor: - `OGGOpusOutputBlock(string filename, OPUSEncoderSettings settings)` ### The sample pipeline ```mermaid graph LR; AudioSource-->OGGOpusOutputBlock; ``` ### Sample code ```csharp // create pipeline var pipeline = new MediaBlocksPipeline(); // create audio source (example: virtual audio source) var audioSource = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); // configure Opus encoder settings var opusSettings = new OPUSEncoderSettings(); // opusSettings.Bitrate = 64000; // Example: Set bitrate to 64 kbps // opusSettings.AudioType = OpusEncoderAudioType.Music; // Example // create OGG Opus output block var oggOpusOutput = new OGGOpusOutputBlock("output.opus", opusSettings); // connect audio path pipeline.Connect(audioSource.Output, oggOpusOutput.Input); // start pipeline await pipeline.StartAsync(); // ... later, to stop ... // await pipeline.StopAsync(); ``` ### Remarks This block combines an `OPUSEncoder` and an `OGGSink` internally. To check if the block and its dependencies are available: `OGGOpusOutputBlock.IsAvailable()` (This checks for `OGGSink`, `OPUSEncoder`, and `FileSink` - though `FileSink` might be implicitly part of `OGGSink` logic for file output). ### Platforms Windows, macOS, Linux, iOS, Android (Platform availability depends on GStreamer Ogg muxer and Opus encoder support). ## OGG Speex Output Block The `OGGSpeexOutputBlock` is used for creating Ogg Speex audio files, typically for voice. It encodes uncompressed audio data using a Speex encoder, multiplexes it into an Ogg container, and saves to an `.spx` or `.ogg` file. ### Block info Name: `OGGSpeexOutputBlock`. | Pin direction | Media type | Expected Encoders | | --- | :---: | :---: | | Input Audio | uncompressed audio | Speex (internal) | ### Settings The `OGGSpeexOutputBlock` is configured with a filename and `SpeexEncoderSettings`. 
Key `SpeexEncoderSettings` properties (refer to `SpeexEncoderSettings` documentation for full details): - Quality, complexity, encoding mode (VBR/ABR/CBR), etc. Constructor: - `OGGSpeexOutputBlock(string filename, SpeexEncoderSettings settings)` ### The sample pipeline ```mermaid graph LR; AudioSource-->OGGSpeexOutputBlock; ``` ### Sample code ```csharp // create pipeline var pipeline = new MediaBlocksPipeline(); // create audio source (example: virtual audio source) var audioSource = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); // configure Speex encoder settings var speexSettings = new SpeexEncoderSettings(); // speexSettings.Quality = 8; // Example: Set quality (0-10) // speexSettings.Mode = SpeexEncoderMode.VBR; // Example: Use Variable Bitrate // create OGG Speex output block var oggSpeexOutput = new OGGSpeexOutputBlock("output.spx", speexSettings); // connect audio path pipeline.Connect(audioSource.Output, oggSpeexOutput.Input); // start pipeline await pipeline.StartAsync(); // ... later, to stop ... // await pipeline.StopAsync(); ``` ### Remarks This block combines a `SpeexEncoder` and an `OGGSink` internally. To check if the block and its dependencies are available: `OGGSpeexOutputBlock.IsAvailable()` (This checks for `OGGSink`, `SpeexEncoder`, and `FileSink`, though `FileSink` might be checked implicitly as part of the `OGGSink` file-output logic). ### Platforms Windows, macOS, Linux, iOS, Android (Platform availability depends on GStreamer Ogg muxer and Speex encoder support). ## OGG Vorbis Output Block The `OGGVorbisOutputBlock` is used for creating Ogg Vorbis audio files. It encodes uncompressed audio data using a Vorbis encoder, multiplexes it into an Ogg container, and saves to an `.ogg` file. ### Block info Name: `OGGVorbisOutputBlock`.
| Pin direction | Media type | Expected Encoders | | --- | :---: | :---: | | Input Audio | uncompressed audio | Vorbis (internal) | ### Settings The `OGGVorbisOutputBlock` is configured with a filename and `VorbisEncoderSettings`. Key `VorbisEncoderSettings` properties (refer to `VorbisEncoderSettings` documentation for full details): - Quality, bitrate, managed/unmanaged bitrate settings, etc. Constructor: - `OGGVorbisOutputBlock(string filename, VorbisEncoderSettings settings)` ### The sample pipeline ```mermaid graph LR; AudioSource-->OGGVorbisOutputBlock; ``` ### Sample code ```csharp // create pipeline var pipeline = new MediaBlocksPipeline(); // create audio source (example: virtual audio source) var audioSource = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); // configure Vorbis encoder settings var vorbisSettings = new VorbisEncoderSettings(); // vorbisSettings.Quality = 0.8f; // Example: Set quality (0.0 to 1.0) // vorbisSettings.Bitrate = 128000; // Example if not using quality-based encoding // create OGG Vorbis output block var oggVorbisOutput = new OGGVorbisOutputBlock("output.ogg", vorbisSettings); // connect audio path pipeline.Connect(audioSource.Output, oggVorbisOutput.Input); // start pipeline await pipeline.StartAsync(); // ... later, to stop ... // await pipeline.StopAsync(); ``` ### Remarks This block combines a `VorbisEncoder` and an `OGGSink` internally. To check if the block and its dependencies are available: `OGGVorbisOutputBlock.IsAvailable()` (This checks for `OGGSink`, `VorbisEncoder`, and `FileSink`, though `FileSink` might be checked implicitly as part of the `OGGSink` file-output logic). ### Platforms Windows, macOS, Linux, iOS, Android (Platform availability depends on GStreamer Ogg muxer and Vorbis encoder support). ## WebM Output Block The `WebMOutputBlock` is used for creating WebM files, typically containing VP8 or VP9 video and Vorbis audio. It combines a VPX video encoder and a Vorbis audio encoder with a WebM sink.
### Block info Name: `WebMOutputBlock`. | Pin direction | Media type | Expected Encoders | | --- | :---: | :---: | | Input Video | various | VPX (VP8/VP9 - internal) | | Input Audio | various | Vorbis (internal) | ### Settings The `WebMOutputBlock` is configured using `WebMSinkSettings`, `IVPXEncoderSettings` (for VP8 or VP9), and `VorbisEncoderSettings`. Key `WebMSinkSettings` properties: - `Filename` (string): The path to the output WebM file. Key `IVPXEncoderSettings` properties (refer to `VPXEncoderSettings` for details): - Bitrate, quality, speed, threads, etc. Key `VorbisEncoderSettings` properties: - Quality, bitrate, etc. Constructor: - `WebMOutputBlock(WebMSinkSettings sinkSettings, IVPXEncoderSettings videoEncoderSettings, VorbisEncoderSettings vorbisSettings)` ### The sample pipeline ```mermaid graph LR; VideoSource-->WebMOutputBlock; AudioSource-->WebMOutputBlock; ``` ### Sample code ```csharp // create pipeline var pipeline = new MediaBlocksPipeline(); // create video source (example: virtual source) var videoSource = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings()); // create audio source (example: virtual source) var audioSource = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); // configure WebM sink settings var webmSinkSettings = new WebMSinkSettings("output.webm"); // configure VPX encoder settings (example: VP9) var vp9Settings = new VPXEncoderSettings(VPXEncoderMode.VP9); // vp9Settings.Bitrate = 2000000; // Example: 2 Mbps // vp9Settings.Speed = VP9Speed.Fast; // Example // configure Vorbis encoder settings var vorbisSettings = new VorbisEncoderSettings(); // vorbisSettings.Quality = 0.7f; // Example: Set quality // create WebM output block var webmOutput = new WebMOutputBlock(webmSinkSettings, vp9Settings, vorbisSettings); // Create inputs for the WebM output block var videoInputPad = webmOutput.CreateNewInput(MediaBlockPadMediaType.Video); var audioInputPad = webmOutput.CreateNewInput(MediaBlockPadMediaType.Audio); 
// connect video path pipeline.Connect(videoSource.Output, videoInputPad); // connect audio path pipeline.Connect(audioSource.Output, audioInputPad); // start pipeline await pipeline.StartAsync(); // ... later, to stop ... // await pipeline.StopAsync(); ``` ### Remarks The `WebMOutputBlock` internally manages a `VPXEncoder` (for VP8/VP9), a `VorbisEncoder`, and a `WebMSink`. To check availability: `WebMOutputBlock.IsAvailable(IVPXEncoderSettings videoEncoderSettings)` ### Platforms Windows, macOS, Linux, iOS, Android (Platform availability depends on GStreamer WebM muxer, VPX encoder, and Vorbis encoder support). ## Separate Output Block The `SeparateOutputBlock` provides a flexible way to configure custom output pipelines, allowing you to specify distinct video and audio encoders, processors, and a final writer/sink. It uses bridge sources (`BridgeVideoSourceBlock`, `BridgeAudioSourceBlock`) to tap into the main pipeline, enabling recording independently from preview or other processing chains. ### Block info Name: `SeparateOutputBlock`. This block itself doesn't have direct input pads in the traditional sense; it orchestrates a sub-pipeline. ### Settings The `SeparateOutputBlock` is configured using the `SeparateOutput` settings object. Key `SeparateOutput` properties: - `Sink` (`MediaBlock`): The final sink/muxer for the output (e.g., `MP4OutputBlock`, `FileSink`). Must implement `IMediaBlockDynamicInputs` if separate encoders are used, or `IMediaBlockSinkAllInOne` if it handles encoding internally. - `VideoEncoder` (`MediaBlock`): An optional video encoder block. - `AudioEncoder` (`MediaBlock`): An optional audio encoder block. - `VideoProcessor` (`MediaBlock`): An optional video processing block to insert before the video encoder. - `AudioProcessor` (`MediaBlock`): An optional audio processing block to insert before the audio encoder. 
- `Writer` (`MediaBlock`): An optional writer block that takes the output of the `Sink` (e.g., for custom file writing or network streaming logic if the `Sink` is just a muxer). - `GetFilename()`: Method to retrieve the configured output filename if applicable. Constructor: - `SeparateOutputBlock(MediaBlocksPipeline pipeline, SeparateOutput settings, BridgeVideoSourceSettings bridgeVideoSourceSettings, BridgeAudioSourceSettings bridgeAudioSourceSettings)` ### The conceptual pipeline This block creates an independent processing branch. For video: ```mermaid graph LR; MainVideoPath --> BridgeVideoSink; BridgeVideoSourceBlock --> OptionalVideoProcessor --> VideoEncoder --> SinkOrWriter; ``` For audio: ```mermaid graph LR; MainAudioPath --> BridgeAudioSink; BridgeAudioSourceBlock --> OptionalAudioProcessor --> AudioEncoder --> SinkOrWriter; ``` ### Sample code ```csharp // Assuming 'pipeline' is your main MediaBlocksPipeline // Assuming 'mainVideoSourceOutputPad' and 'mainAudioSourceOutputPad' are outputs from your main sources // 1. Configure Bridge Sinks in your main pipeline var bridgeVideoSinkSettings = new BridgeVideoSinkSettings("sep_video_bridge"); var bridgeVideoSink = new BridgeVideoSinkBlock(bridgeVideoSinkSettings); pipeline.Connect(mainVideoSourceOutputPad, bridgeVideoSink.Input); var bridgeAudioSinkSettings = new BridgeAudioSinkSettings("sep_audio_bridge"); var bridgeAudioSink = new BridgeAudioSinkBlock(bridgeAudioSinkSettings); pipeline.Connect(mainAudioSourceOutputPad, bridgeAudioSink.Input); // 2. Configure Bridge Sources for the SeparateOutputBlock's sub-pipeline var bridgeVideoSourceSettings = new BridgeVideoSourceSettings("sep_video_bridge"); var bridgeAudioSourceSettings = new BridgeAudioSourceSettings("sep_audio_bridge"); // 3. 
Configure encoders and sink for the SeparateOutput var h264Settings = H264EncoderBlock.GetDefaultSettings(); var videoEncoder = new H264EncoderBlock(h264Settings); var aacSettings = AACEncoderBlock.GetDefaultSettings(); var audioEncoder = new AACEncoderBlock(aacSettings); var mp4SinkSettings = new MP4SinkSettings("separate_output.mp4"); var mp4Sink = new MP4OutputBlock(mp4SinkSettings, h264Settings, aacSettings); // Using MP4OutputBlock which handles muxing. // Alternatively, use a raw MP4Sink and connect encoders to it. // 4. Configure SeparateOutput settings var separateOutputSettings = new SeparateOutput( sink: mp4Sink, // mp4Sink will act as the final writer here videoEncoder: videoEncoder, // This is somewhat redundant if mp4Sink is MP4OutputBlock with encoders audioEncoder: audioEncoder // Same as above. Better to use a raw sink if providing encoders separately ); // A more typical setup if mp4Sink is just a muxer (e.g., new MP4Sink(mp4SinkRawSettings)): // var separateOutputSettings = new SeparateOutput( // sink: rawMp4Muxer, // videoEncoder: videoEncoder, // audioEncoder: audioEncoder // ); // 5. Create the SeparateOutputBlock (this will internally connect its components) var separateOutput = new SeparateOutputBlock(pipeline, separateOutputSettings, bridgeVideoSourceSettings, bridgeAudioSourceSettings); // 6. Build the sources, encoders, and sink used by SeparateOutputBlock // Note: Building these might be handled by the pipeline if they are added to it, // or might need to be done explicitly if they are part of a sub-graph not directly in the main pipeline's block list. // The SeparateOutputBlock's Build() method will handle building its internal sources (_videoSource, _audioSource) // and the provided encoders/sink if they haven't been built. 
// pipeline.Add(bridgeVideoSink); // pipeline.Add(bridgeAudioSink); // pipeline.Add(separateOutput); // Add the orchestrator block // Start main pipeline // await pipeline.StartAsync(); // This will also start the separate output processing via bridges // To change filename later: // separateOutput.SetFilenameOrURL("new_separate_output.mp4"); ``` ### Remarks The `SeparateOutputBlock` itself is more of an orchestrator for a sub-pipeline that's fed by bridge sinks/sources from the main pipeline. It allows for complex recording or streaming configurations that can be started/stopped or modified independently to some extent. The `VideoEncoder`, `AudioEncoder`, `Sink`, and `Writer` components must be built correctly. The `SeparateOutputBlock.Build()` method attempts to build these components. ### Platforms Depends on the components used within the `SeparateOutput` configuration (encoders, sinks, processors). Generally cross-platform if GStreamer elements are available. ## WMV Output Block The `WMVOutputBlock` is used for creating Windows Media Video (WMV) files. It uses WMV video (`WMVEncoder`) and WMA audio (`WMAEncoder`) encoders with an ASF (Advanced Systems Format) sink to produce `.wmv` files. ### Block info Name: `WMVOutputBlock`. | Pin direction | Media type | Expected Encoders | | --- | :---: | :---: | | Input Video | various | WMV (internal) | | Input Audio | various | WMA (internal) | ### Settings The `WMVOutputBlock` is configured using `ASFSinkSettings`, `WMVEncoderSettings`, and `WMAEncoderSettings`. Key `ASFSinkSettings` properties: - `Filename` (string): The path to the output WMV file. Key `WMVEncoderSettings` properties (refer to `WMVEncoderSettings` documentation): - Bitrate, GOP size, quality, etc. Key `WMAEncoderSettings` properties (refer to `WMAEncoderSettings` documentation): - Bitrate, WMA version, etc. Constructors: - `WMVOutputBlock(string filename)`: Uses default WMV video and WMA audio encoder settings. 
- `WMVOutputBlock(ASFSinkSettings sinkSettings, WMVEncoderSettings videoSettings, WMAEncoderSettings audioSettings)`: Uses specified encoder settings. ### The sample pipeline ```mermaid graph LR; VideoSource-->WMVOutputBlock; AudioSource-->WMVOutputBlock; ``` ### Sample code ```csharp // create pipeline var pipeline = new MediaBlocksPipeline(); // create video source (example: virtual source) var videoSource = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings()); // create audio source (example: virtual source) var audioSource = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); // create WMV output block with default settings var wmvOutput = new WMVOutputBlock("output.wmv"); // Or, with custom settings: // var asfSinkSettings = new ASFSinkSettings("output.wmv"); // var wmvEncSettings = WMVEncoderBlock.GetDefaultSettings(); // wmvEncSettings.Bitrate = 3000000; // Example: 3 Mbps // var wmaEncSettings = WMAEncoderBlock.GetDefaultSettings(); // wmaEncSettings.Bitrate = 160000; // Example: 160 Kbps // var wmvOutput = new WMVOutputBlock(asfSinkSettings, wmvEncSettings, wmaEncSettings); // Create inputs for the WMV output block var videoInputPad = wmvOutput.CreateNewInput(MediaBlockPadMediaType.Video); var audioInputPad = wmvOutput.CreateNewInput(MediaBlockPadMediaType.Audio); // connect video path pipeline.Connect(videoSource.Output, videoInputPad); // connect audio path pipeline.Connect(audioSource.Output, audioInputPad); // start pipeline await pipeline.StartAsync(); // ... later, to stop ... // await pipeline.StopAsync(); ``` ### Remarks The `WMVOutputBlock` internally manages `WMVEncoder`, `WMAEncoder`, and `ASFSink`. To check availability: `WMVOutputBlock.IsAvailable()` ### Platforms Primarily Windows. Availability on other platforms depends on GStreamer plugin support for ASF muxing, WMV, and WMA encoders (which may be limited outside of Windows). 
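Because ASF muxing and the WMV/WMA encoders may be unavailable outside Windows, it can be worth guarding creation of this block at runtime with the `IsAvailable()` check mentioned above. The sketch below does this; the MP4 fallback is purely illustrative, not a required pattern:

```csharp
// Guard WMV output creation at runtime; fall back to another output
// when ASF/WMV/WMA support is missing on the current platform.
// The MP4 fallback shown here is an illustrative choice only.
MediaBlock output;
if (WMVOutputBlock.IsAvailable())
{
    output = new WMVOutputBlock("output.wmv");
}
else
{
    var h264Settings = H264EncoderBlock.GetDefaultSettings();
    var aacSettings = AACEncoderBlock.GetDefaultSettings();
    output = new MP4OutputBlock(new MP4SinkSettings("output.mp4"), h264Settings, aacSettings);
}
```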
## YouTube Output Block The `YouTubeOutputBlock` is designed for streaming video and audio to YouTube Live using RTMP. It internally utilizes H.264 video and AAC audio encoders. ### Block info Name: `YouTubeOutputBlock`. | Pin direction | Media type | Expected Encoders | | --- | :---: | :---: | | Input Video | various | H.264 (internal) | | Input Audio | various | AAC (internal) | ### Settings The `YouTubeOutputBlock` is configured using `YouTubeSinkSettings`, `IH264EncoderSettings`, and `IAACEncoderSettings`. Key `YouTubeSinkSettings` properties: - `Url` (string): The RTMP URL provided by YouTube Live for streaming (e.g., "rtmp://a.rtmp.youtube.com/live2/YOUR-STREAM-KEY"). Constructor: - `YouTubeOutputBlock(YouTubeSinkSettings sinkSettings, IH264EncoderSettings h264settings, IAACEncoderSettings aacSettings)` ### The sample pipeline ```mermaid graph LR; VideoSource-->YouTubeOutputBlock; AudioSource-->YouTubeOutputBlock; ``` ### Sample code ```csharp // create pipeline var pipeline = new MediaBlocksPipeline(); // create video source (e.g., SystemVideoSourceBlock) var videoSource = new SystemVideoSourceBlock(videoSourceSettings); // Assuming videoSourceSettings are configured // create audio source (e.g., SystemAudioSourceBlock) var audioSource = new SystemAudioSourceBlock(audioSourceSettings); // Assuming audioSourceSettings are configured // configure YouTube sink settings var ytSinkSettings = new YouTubeSinkSettings("rtmp://a.rtmp.youtube.com/live2/YOUR-STREAM-KEY"); // configure H.264 encoder settings (use defaults or customize per YouTube recommendations) var h264Settings = H264EncoderBlock.GetDefaultSettings(); // h264Settings.Bitrate = 6000000; // Example: 6 Mbps for 1080p // h264Settings.UsagePreset = H264UsagePreset.None; // Adjust based on performance/quality needs // configure AAC encoder settings (use defaults or customize per YouTube recommendations) var aacSettings = AACEncoderBlock.GetDefaultSettings(); // aacSettings.Bitrate = 128000; // Example: 128 
Kbps stereo // create YouTube output block var youTubeOutput = new YouTubeOutputBlock(ytSinkSettings, h264Settings, aacSettings); // Create inputs for the YouTube output block var videoInputPad = youTubeOutput.CreateNewInput(MediaBlockPadMediaType.Video); var audioInputPad = youTubeOutput.CreateNewInput(MediaBlockPadMediaType.Audio); // connect video path pipeline.Connect(videoSource.Output, videoInputPad); // connect audio path pipeline.Connect(audioSource.Output, audioInputPad); // start pipeline await pipeline.StartAsync(); // ... later, to stop ... // await pipeline.StopAsync(); ``` ### Remarks This block encapsulates the H.264 and AAC encoders and the RTMP sink (`YouTubeSink`). Ensure that the `YouTubeSink`, `H264Encoder`, and `AACEncoder` are available. `YouTubeOutputBlock.IsAvailable(IH264EncoderSettings h264settings, IAACEncoderSettings aacSettings)` can be used to check this. It's crucial to configure encoder settings (bitrate, resolution, frame rate) according to YouTube's recommended settings for live streaming to ensure optimal quality and compatibility. ### Platforms Windows, macOS, Linux, iOS, Android (Platform availability depends on GStreamer RTMP support and H.264/AAC encoder availability). ---END OF PAGE--- # Local File: .\dotnet\mediablocks\Parsers\index.md --- title: .Net Media Parser Blocks Guide description: Explore a complete guide to .Net Media SDK parser blocks. Learn about various video and audio parsers for your media processing pipelines. sidebar_label: Parsers --- # Parser Blocks - VisioForge Media Blocks SDK .Net [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) Parser blocks are essential components in media processing pipelines. They are used to parse elementary streams, which might be raw or partially processed, to extract metadata, and to prepare the streams for further processing like decoding or multiplexing. 
VisioForge Media Blocks SDK .Net offers a variety of parser blocks for common video and audio codecs. ## Video Parser Blocks ### AV1 Parser Block The `AV1ParseBlock` is used to parse AV1 video elementary streams. It helps in identifying frame boundaries and extracting codec-specific information. #### Block info Name: `AV1ParseBlock`. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | AV1 video | 1 | | Output video | AV1 video | 1 | #### The sample pipeline ```mermaid graph LR; DataSourceBlock["Data Source (e.g., File or Network)"] --> AV1ParseBlock; AV1ParseBlock --> AV1DecoderBlock["AV1 Decoder Block"]; AV1DecoderBlock --> VideoRendererBlock["Video Renderer Block"]; ``` #### Platforms Windows, macOS, Linux, iOS, Android. --- ### H.263 Parser Block The `H263ParseBlock` is designed to parse H.263 video elementary streams. This is useful for older video conferencing and mobile video applications. #### Block info Name: `H263ParseBlock`. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | H.263 video | 1 | | Output video | H.263 video | 1 | #### The sample pipeline ```mermaid graph LR; DataSourceBlock["Data Source"] --> H263ParseBlock; H263ParseBlock --> H263DecoderBlock["H.263 Decoder Block"]; H263DecoderBlock --> VideoRendererBlock["Video Renderer Block"]; ``` #### Platforms Windows, macOS, Linux, iOS, Android. --- ### H.264 Parser Block The `H264ParseBlock` parses H.264 (AVC) video elementary streams. This is one of the most widely used video codecs. The parser helps in identifying NAL units and other stream properties. #### Block info Name: `H264ParseBlock`. 
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | H.264 video | 1 | | Output video | H.264 video | 1 | #### The sample pipeline ```mermaid graph LR; PushDataSource["Push Data Source (H.264 NALUs)"] --> H264ParseBlock; H264ParseBlock --> H264DecoderBlock["H.264 Decoder Block"]; H264DecoderBlock --> VideoRendererBlock["Video Renderer Block"]; ``` #### Platforms Windows, macOS, Linux, iOS, Android. --- ### H.265 Parser Block The `H265ParseBlock` parses H.265 (HEVC) video elementary streams. H.265 offers better compression than H.264. The parser helps in identifying NAL units and other stream properties. #### Block info Name: `H265ParseBlock`. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | H.265 video | 1 | | Output video | H.265 video | 1 | #### The sample pipeline ```mermaid graph LR; PushDataSource["Push Data Source (H.265 NALUs)"] --> H265ParseBlock; H265ParseBlock --> H265DecoderBlock["H.265 Decoder Block"]; H265DecoderBlock --> VideoRendererBlock["Video Renderer Block"]; ``` #### Platforms Windows, macOS, Linux, iOS, Android. --- ### JPEG 2000 Parser Block The `JPEG2000ParseBlock` is used to parse JPEG 2000 video streams. JPEG 2000 is a wavelet-based compression standard that can be used for still images and video. #### Block info Name: `JPEG2000ParseBlock`. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | JPEG 2000 video | 1 | | Output video | JPEG 2000 video | 1 | #### The sample pipeline ```mermaid graph LR; DataSourceBlock["Data Source"] --> JPEG2000ParseBlock; JPEG2000ParseBlock --> JPEG2000DecoderBlock["JPEG 2000 Decoder Block"]; JPEG2000DecoderBlock --> VideoRendererBlock["Video Renderer Block"]; ``` #### Platforms Windows, macOS, Linux, iOS, Android. --- ### MPEG-1/2 Video Parser Block The `MPEG12VideoParseBlock` parses MPEG-1 and MPEG-2 video elementary streams. 
These are older but still relevant video codecs, especially MPEG-2 for DVDs and broadcast. #### Block info Name: `MPEG12VideoParseBlock`. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | MPEG-1/2 video | 1 | | Output video | MPEG-1/2 video | 1 | #### The sample pipeline ```mermaid graph LR; DataSourceBlock["Data Source"] --> MPEG12VideoParseBlock; MPEG12VideoParseBlock --> MPEGVideoDecoderBlock["MPEG-1/2 Decoder Block"]; MPEGVideoDecoderBlock --> VideoRendererBlock["Video Renderer Block"]; ``` #### Platforms Windows, macOS, Linux, iOS, Android. --- ### MPEG-4 Video Parser Block The `MPEG4ParseBlock` parses MPEG-4 Part 2 video elementary streams (often referred to as DivX/Xvid in its early forms). #### Block info Name: `MPEG4ParseBlock`. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | MPEG-4 video | 1 | | Output video | MPEG-4 video | 1 | #### The sample pipeline ```mermaid graph LR; DataSourceBlock["Data Source"] --> MPEG4ParseBlock; MPEG4ParseBlock --> MPEG4DecoderBlock["MPEG-4 Decoder Block"]; MPEG4DecoderBlock --> VideoRendererBlock["Video Renderer Block"]; ``` #### Platforms Windows, macOS, Linux, iOS, Android. --- ### PNG Parser Block The `PNGParseBlock` is used to parse PNG image data. While PNG is primarily an image format, this parser can be useful in scenarios where PNG images are part of a stream or need to be processed within the Media Blocks pipeline. #### Block info Name: `PNGParseBlock`. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | PNG image data | 1 | | Output video | PNG image data | 1 | #### The sample pipeline ```mermaid graph LR; DataSourceBlock["Data Source (PNG data)"] --> PNGParseBlock; PNGParseBlock --> PNGDecoderBlock["PNG Decoder Block"]; PNGDecoderBlock --> VideoRendererBlock["Video Renderer Block (or Image Overlay)"]; ``` #### Platforms Windows, macOS, Linux, iOS, Android. 
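Wiring the PNG parser follows the same pattern as the other parsers on this page. A minimal sketch is shown below; the decoder and renderer block names are taken from the diagram above, while the source and renderer creation calls are placeholders whose exact constructors may differ:

```csharp
// Sketch only: block names follow the sample pipeline diagram above;
// source and renderer creation are placeholders for your actual blocks.
var pipeline = new MediaBlocksPipeline();

var dataSource = CreatePngDataSource();    // placeholder: a source delivering PNG image data
var pngParse = new PNGParseBlock();        // constructor parameters, if any, may differ
pipeline.Connect(dataSource.Output, pngParse.Input);

var pngDecoder = new PNGDecoderBlock();    // decoder block, as in the diagram
pipeline.Connect(pngParse.Output, pngDecoder.Input);

var videoRenderer = CreateVideoRenderer(); // placeholder: video renderer or image overlay
pipeline.Connect(pngDecoder.Output, videoRenderer.Input);

await pipeline.StartAsync();
```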
--- ### VC-1 Parser Block The `VC1ParseBlock` parses VC-1 video elementary streams. VC-1 was developed by Microsoft and was used in Blu-ray Discs and Windows Media Video. #### Block info Name: `VC1ParseBlock`. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | VC-1 video | 1 | | Output video | VC-1 video | 1 | #### The sample pipeline ```mermaid graph LR; DataSourceBlock["Data Source"] --> VC1ParseBlock; VC1ParseBlock --> VC1DecoderBlock["VC-1 Decoder Block"]; VC1DecoderBlock --> VideoRendererBlock["Video Renderer Block"]; ``` #### Platforms Windows, macOS, Linux, iOS, Android. --- ### VP9 Parser Block The `VP9ParseBlock` parses VP9 video elementary streams. VP9 is an open and royalty-free video coding format developed by Google, often used for web video. #### Block info Name: `VP9ParseBlock`. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | VP9 video | 1 | | Output video | VP9 video | 1 | #### The sample pipeline ```mermaid graph LR; DataSourceBlock["Data Source"] --> VP9ParseBlock; VP9ParseBlock --> VP9DecoderBlock["VP9 Decoder Block"]; VP9DecoderBlock --> VideoRendererBlock["Video Renderer Block"]; ``` #### Platforms Windows, macOS, Linux, iOS, Android. --- ## Audio Parser Blocks ### MPEG Audio Parser Block The `MPEGAudioParseBlock` parses MPEG audio elementary streams, which includes MP1, MP2, and MP3 audio. #### Block info Name: `MPEGAudioParseBlock`. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input audio | MPEG audio | 1 | | Output audio | MPEG audio | 1 | #### The sample pipeline ```mermaid graph LR; DataSourceBlock["Data Source (MP3 data)"] --> MPEGAudioParseBlock; MPEGAudioParseBlock --> MP3DecoderBlock["MP3 Decoder Block"]; MP3DecoderBlock --> AudioRendererBlock["Audio Renderer Block"]; ``` #### Platforms Windows, macOS, Linux, iOS, Android. 
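As with the video parsers, the audio parser sits between a data source and a decoder. A minimal sketch, assuming a source delivering raw MP3 data and the decoder/renderer blocks named in the diagram above (these creation calls are placeholders and the real constructors may differ):

```csharp
// Sketch only: block names follow the sample pipeline diagram above;
// source and renderer creation are placeholders for your actual blocks.
var pipeline = new MediaBlocksPipeline();

var dataSource = CreateMp3DataSource();     // placeholder: a source delivering raw MP3 data
var mpegAudioParse = new MPEGAudioParseBlock();
pipeline.Connect(dataSource.Output, mpegAudioParse.Input);

var mp3Decoder = new MP3DecoderBlock();     // decoder block, as in the diagram
pipeline.Connect(mpegAudioParse.Output, mp3Decoder.Input);

var audioRenderer = CreateAudioRenderer();  // placeholder: audio renderer block
pipeline.Connect(mp3Decoder.Output, audioRenderer.Input);

await pipeline.StartAsync();
```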
---END OF PAGE---

# Local File: .\dotnet\mediablocks\Sinks\index.md

---
title: .Net Media Sinks - File & Network Streaming
description: Discover .Net media sink blocks for saving or streaming audio/video. Learn about file sinks like MP4, MKV, AVI, and network sinks such as RTMP, HLS, SRT for versatile media output.
sidebar_label: Sinks
---

# Sinks

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

Sinks are blocks that save or stream data. They are the last blocks in the pipeline. Optionally, some sinks can have output pins to pass data to the next block in the pipeline.

The SDK provides many different sinks for a variety of purposes.

**File sinks**

The following file sinks are available:

- [ASF](#asf)
- [AVI](#avi)
- [File](#raw-file)
- [MKV](#mkv)
- [MOV](#mov)
- [MP4](#mp4)
- [MPEG-PS](#mpeg-ps)
- [MPEG-TS](#mpeg-ts)
- [MXF](#mxf)
- [OGG](#ogg)
- [WAV](#wav)
- [WebM](#webm)

**Network streaming**

The following network streaming sinks are available:

- [Facebook Live](#facebook-live)
- [HLS](#hls)
- [MJPEG over HTTP](#mjpeg-over-http)
- [NDI](#ndi)
- [SRT](#srt)
- [SRT MPEG-TS](#srt-mpeg-ts)
- [RTMP](#rtmp)
- [Shoutcast](#shoutcast)
- [YouTube Live](#youtube-live)

## File Sinks

### ASF

`ASF (Advanced Systems Format)`: A Microsoft digital container format used to store multimedia data, designed to be platform-independent and to support scalable media types like audio and video.

Use the `ASFSinkSettings` class to set the parameters.

#### Block info

Name: ASFSinkBlock.
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input audio | audio/x-raw | one or more | | | audio/mpeg | | | | audio/x-ac3 | | | | audio/x-alaw | | | | audio/x-mulaw | | | | audio/x-wma | | | Input video | video/x-raw | one or more | | | image/jpeg | | | | video/x-divx | | | | video/x-msmpeg | | | | video/mpeg | | | | video/x-h263 | | | | video/x-h264 | | | | video/x-dv | | | | video/x-huffyuv | | | | video/x-wmv | | | | video/x-jpc | | | | video/x-vp8 | | | | image/png | | #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->WMVEncoderBlock; UniversalSourceBlock-->WMAEncoderBlock; WMVEncoderBlock-->ASFSinkBlock; WMAEncoderBlock-->ASFSinkBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var audioEncoderBlock = new WMAEncoderBlock(new WMAEncoderSettings()); pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input); var videoEncoderBlock = new WMVEncoderBlock(new WMVEncoderSettings()); pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input); var sinkBlock = new ASFSinkBlock(new ASFSinkSettings(@"output.wmv")); pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio)); pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video)); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### AVI AVI (Audio Video Interleave) is a multimedia container format introduced by Microsoft. It enables simultaneous audio-with-video playback by alternating segments of audio and video data. Use the `AVISinkSettings` class to set the parameters. #### Block info Name: AVISinkBlock. 
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input audio | audio/x-raw | one or more | | | audio/mpeg | | | | audio/x-ac3 | | | | audio/x-alaw | | | | audio/x-mulaw | | | Input video | video/x-raw | one or more | | | image/jpeg | | | | video/x-divx | | | | video/x-msmpeg | | | | video/mpeg | | | | video/x-h263 | | | | video/x-h264 | | | | video/x-dv | | | | video/x-huffyuv | | | | image/png | | #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->MP3EncoderBlock; UniversalSourceBlock-->DIVXEncoderBlock; MP3EncoderBlock-->AVISinkBlock; DIVXEncoderBlock-->AVISinkBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var audioEncoderBlock = new MP3EncoderBlock(new MP3EncoderSettings() { Bitrate = 192 }); pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input); var videoEncoderBlock = new DIVXEncoderBlock(new DIVXEncoderSettings()); pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input); var sinkBlock = new AVISinkBlock(new AVISinkSettings(@"output.avi")); pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio)); pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video)); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### RAW File Universal output to a file. This sink is used inside all other higher-level sinks, e.g. MP4Sink. It can be used to write RAW video or audio to a file. #### Block info Name: FileSinkBlock. 
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Any stream format | 1 |

#### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->MP3EncoderBlock;
MP3EncoderBlock-->FileSinkBlock;
```

#### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp3";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var mp3EncoderBlock = new MP3EncoderBlock(new MP3EncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, mp3EncoderBlock.Input);

var fileSinkBlock = new FileSinkBlock(@"output.mp3");
pipeline.Connect(mp3EncoderBlock.Output, fileSinkBlock.Input);

await pipeline.StartAsync();
```

#### Platforms

Windows, macOS, Linux, iOS, Android.

### MKV

MKV (Matroska) is an open-standard, free container format, similar to MP4 and AVI but with more flexibility and advanced features.

Use the `MKVSinkSettings` class to set the parameters.

#### Block info

Name: MKVSinkBlock.
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input audio | audio/x-raw | one or more | | | audio/mpeg | | | | audio/x-ac3 | | | | audio/x-alaw | | | | audio/x-mulaw | | | | audio/x-wma | | | | audio/x-vorbis | | | | audio/x-opus | | | | audio/x-flac | | | Input video | video/x-raw | one or more | | | image/jpeg | | | | video/x-divx | | | | video/x-msmpeg | | | | video/mpeg | | | | video/x-h263 | | | | video/x-h264 | | | | video/x-h265 | | | | video/x-dv | | | | video/x-huffyuv | | | | video/x-wmv | | | | video/x-jpc | | | | video/x-vp8 | | | | video/x-vp9 | | | | video/x-theora | | | | image/png | | #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->VorbisEncoderBlock; UniversalSourceBlock-->VP9EncoderBlock; VorbisEncoderBlock-->MKVSinkBlock; VP9EncoderBlock-->MKVSinkBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var audioEncoderBlock = new VorbisEncoderBlock(new VorbisEncoderSettings() { Bitrate = 192 }); pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input); var videoEncoderBlock = new VP9EncoderBlock(new VP9EncoderSettings() { Bitrate = 2000 }); pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input); var sinkBlock = new MKVSinkBlock(new MKVSinkSettings(@"output.mkv")); pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio)); pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video)); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### MOV MOV (QuickTime File Format) is a multimedia container format developed by Apple for storing video, audio, and other time-based media. It supports various codecs and is widely used for multimedia content on Apple platforms, and also in professional video editing. 
Use the `MOVSinkSettings` class to set the parameters. #### Block info Name: MOVSinkBlock. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input audio | audio/x-raw | one or more | | | audio/mpeg | | | | audio/x-ac3 | | | | audio/x-alaw | | | | audio/x-mulaw | | | | audio/AAC | | | Input video | video/x-raw | one or more | | | image/jpeg | | | | video/x-divx | | | | video/x-msmpeg | | | | video/mpeg | | | | video/x-h263 | | | | video/x-h264 | | | | video/x-h265 | | | | video/x-dv | | | | video/x-huffyuv | | | | image/png | | #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->AACEncoderBlock; UniversalSourceBlock-->H264EncoderBlock; AACEncoderBlock-->MOVSinkBlock; H264EncoderBlock-->MOVSinkBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var audioEncoderBlock = new AACEncoderBlock(new AACEncoderSettings() { Bitrate = 192 }); pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input); var videoEncoderBlock = new H264EncoderBlock(new OpenH264EncoderSettings()); pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input); var sinkBlock = new MOVSinkBlock(new MOVSinkSettings(@"output.mov")); pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio)); pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video)); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### MP4 MP4 (MPEG-4 Part 14) is a digital multimedia container format used to store video, audio, and other data such as subtitles and images. It's widely used for sharing video content online and is compatible with a wide range of devices and platforms. Use the `MP4SinkSettings` class to set the parameters. #### Block info Name: MP4SinkBlock. 
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input audio | audio/x-raw | one or more | | | audio/mpeg | | | | audio/x-ac3 | | | | audio/x-alaw | | | | audio/x-mulaw | | | | audio/AAC | | | Input video | video/x-raw | one or more | | | image/jpeg | | | | video/x-divx | | | | video/x-msmpeg | | | | video/mpeg | | | | video/x-h263 | | | | video/x-h264 | | | | video/x-h265 | | | | video/x-dv | | | | video/x-huffyuv | | | | image/png | | #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->AACEncoderBlock; UniversalSourceBlock-->H264EncoderBlock; AACEncoderBlock-->MP4SinkBlock; H264EncoderBlock-->MP4SinkBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var audioEncoderBlock = new AACEncoderBlock(new AACEncoderSettings() { Bitrate = 192 }); pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input); var videoEncoderBlock = new H264EncoderBlock(new OpenH264EncoderSettings()); pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input); var sinkBlock = new MP4SinkBlock(new MP4SinkSettings(@"output.mp4")); pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio)); pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video)); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### MPEG-PS MPEG-PS (MPEG Program Stream) is a container format for multiplexing digital audio, video, and other data. It is designed for reasonably reliable media, such as DVDs, CD-ROMs, and other disc media. Use the `MPEGPSSinkSettings` class to set the parameters. #### Block info Name: MPEGPSSinkBlock. 
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input audio | audio/x-raw | one or more | | | audio/mpeg | | | | audio/x-ac3 | | | | audio/x-alaw | | | | audio/x-mulaw | | | Input video | video/x-raw | one or more | | | image/jpeg | | | | video/x-msmpeg | | | | video/mpeg | | | | video/x-h263 | | | | video/x-h264 | | #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->MP2EncoderBlock; UniversalSourceBlock-->MPEG2EncoderBlock; MP2EncoderBlock-->MPEGPSSinkBlock; MPEG2EncoderBlock-->MPEGPSSinkBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var audioEncoderBlock = new MP2EncoderBlock(new MP2EncoderSettings() { Bitrate = 192 }); pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input); var videoEncoderBlock = new MPEG2EncoderBlock(new MPEG2EncoderSettings()); pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input); var sinkBlock = new MPEGPSSinkBlock(new MPEGPSSinkSettings(@"output.mpg")); pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio)); pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video)); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### MPEG-TS MPEG-TS (MPEG Transport Stream) is a standard digital container format for transmission and storage of audio, video, and Program and System Information Protocol (PSIP) data. It is used in broadcast systems such as DVB, ATSC and IPTV. Use the `MPEGTSSinkSettings` class to set the parameters. #### Block info Name: MPEGTSSinkBlock. 
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input audio | audio/x-raw | one or more | | | audio/mpeg | | | | audio/x-ac3 | | | | audio/x-alaw | | | | audio/x-mulaw | | | | audio/AAC | | | Input video | video/x-raw | one or more | | | image/jpeg | | | | video/x-msmpeg | | | | video/mpeg | | | | video/x-h263 | | | | video/x-h264 | | | | video/x-h265 | | #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->AACEncoderBlock; UniversalSourceBlock-->H264EncoderBlock; AACEncoderBlock-->MPEGTSSinkBlock; H264EncoderBlock-->MPEGTSSinkBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var audioEncoderBlock = new AACEncoderBlock(new AACEncoderSettings() { Bitrate = 192 }); pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input); var videoEncoderBlock = new H264EncoderBlock(new OpenH264EncoderSettings()); pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input); var sinkBlock = new MPEGTSSinkBlock(new MPEGTSSinkSettings(@"output.ts")); pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio)); pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video)); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### MXF MXF (Material Exchange Format) is a container format for professional digital video and audio media, developed to address issues such as file exchange, interoperability, and to improve project workflow between production houses and content/equipment providers. Use the `MXFSinkSettings` class to set the parameters. #### Block info Name: MXFSinkBlock. 
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input audio | audio/x-raw | one or more | | | audio/mpeg | | | | audio/x-ac3 | | | | audio/x-alaw | | | | audio/x-mulaw | | | | audio/AAC | | | Input video | video/x-raw | one or more | | | image/jpeg | | | | video/x-divx | | | | video/x-msmpeg | | | | video/mpeg | | | | video/x-h263 | | | | video/x-h264 | | | | video/x-h265 | | | | video/x-dv | | | | image/png | | #### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->PCMEncoderBlock; UniversalSourceBlock-->DIVXEncoderBlock; PCMEncoderBlock-->MXFSinkBlock; DIVXEncoderBlock-->MXFSinkBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var audioBlock = new PCMEncoderBlock(new PCMEncoderSettings()); pipeline.Connect(fileSource.AudioOutput, audioBlock.Input); var videoEncoderBlock = new DIVXEncoderBlock(new DIVXEncoderSettings()); pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input); var sinkBlock = new MXFSinkBlock(new MXFSinkSettings(@"output.mxf")); pipeline.Connect(audioBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio)); pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video)); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### OGG OGG is a free, open container format designed for efficient streaming and manipulation of high quality digital multimedia. It is developed by the Xiph.Org Foundation and supports audio codecs like Vorbis, Opus, and FLAC, and video codecs like Theora. Use the `OGGSinkSettings` class to set the parameters. #### Block info Name: OGGSinkBlock. 
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | audio/x-raw | one or more |
| | audio/x-vorbis | |
| | audio/x-flac | |
| | audio/x-speex | |
| | audio/x-celt | |
| | audio/x-opus | |
| Input video | video/x-raw | one or more |
| | video/x-theora | |
| | video/x-dirac | |

#### The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->VorbisEncoderBlock;
    UniversalSourceBlock-->TheoraEncoderBlock;
    VorbisEncoderBlock-->OGGSinkBlock;
    TheoraEncoderBlock-->OGGSinkBlock;
```

#### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var audioEncoderBlock = new VorbisEncoderBlock(new VorbisEncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input);

var videoEncoderBlock = new TheoraEncoderBlock(new TheoraEncoderSettings());
pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input);

var sinkBlock = new OGGSinkBlock(new OGGSinkSettings(@"output.ogg"));
pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));
pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();
```

#### Platforms

Windows, macOS, Linux, iOS, Android.

### WAV

WAV (Waveform Audio File Format) is an audio file format standard developed by IBM and Microsoft for storing audio bitstreams on PCs. It is the main format used on Windows systems for raw and typically uncompressed audio.

Use the `WAVSinkSettings` class to set the parameters.

#### Block info

Name: WAVSinkBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | audio/x-raw | one |
| | audio/x-alaw | |
| | audio/x-mulaw | |

#### The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->PCMEncoderBlock;
    PCMEncoderBlock-->WAVSinkBlock;
```

#### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp3";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var audioBlock = new PCMEncoderBlock(new PCMEncoderSettings());
pipeline.Connect(fileSource.AudioOutput, audioBlock.Input);

var sinkBlock = new WAVSinkBlock(new WAVSinkSettings(@"output.wav"));
pipeline.Connect(audioBlock.Output, sinkBlock.Input);

await pipeline.StartAsync();
```

#### Platforms

Windows, macOS, Linux, iOS, Android.

### WebM

WebM is an open, royalty-free media file format designed for the web. WebM defines the file container structure as well as the video and audio formats. WebM files consist of video streams compressed with the VP8 or VP9 video codecs and audio streams compressed with the Vorbis or Opus audio codecs.

Use the `WebMSinkSettings` class to set the parameters.

#### Block info

Name: WebMSinkBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | audio/x-raw | one or more |
| | audio/x-vorbis | |
| | audio/x-opus | |
| Input video | video/x-raw | one or more |
| | video/x-vp8 | |
| | video/x-vp9 | |

#### The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->VorbisEncoderBlock;
    UniversalSourceBlock-->VP9EncoderBlock;
    VorbisEncoderBlock-->WebMSinkBlock;
    VP9EncoderBlock-->WebMSinkBlock;
```

#### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var audioEncoderBlock = new VorbisEncoderBlock(new VorbisEncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input);

var videoEncoderBlock = new VP9EncoderBlock(new VP9EncoderSettings());
pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input);

var sinkBlock = new WebMSinkBlock(new WebMSinkSettings(@"output.webm"));
pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));
pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();
```

#### Platforms

Windows, macOS, Linux, iOS, Android.

## Network Streaming Sinks

### RTMP

RTMP (Real-Time Messaging Protocol) is a protocol developed by Adobe for streaming audio, video, and data over the Internet, optimized for high-performance transmission. It enables efficient, low-latency communication and is commonly used for live broadcasts such as sports events and concerts.

Use the `RTMPSinkSettings` class to set the parameters.

#### Block info

Name: RTMPSinkBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | audio/mpeg [1,2,4] | one |
| | audio/x-adpcm | |
| | PCM [U8, S16LE] | |
| | audio/x-speex | |
| | audio/x-mulaw | |
| | audio/x-alaw | |
| | audio/x-nellymoser | |
| Input video | video/x-h264 | one |

#### The sample pipeline

```mermaid
graph LR;
    VirtualVideoSourceBlock-->H264EncoderBlock;
    VirtualAudioSourceBlock-->AACEncoderBlock;
    H264EncoderBlock-->RTMPSinkBlock;
    AACEncoderBlock-->RTMPSinkBlock;
```

#### Sample code

```csharp
// Pipeline
var pipeline = new MediaBlocksPipeline();

// Video and audio sources
var virtualVideoSource = new VirtualVideoSourceSettings
{
    Width = 1280,
    Height = 720,
    FrameRate = VideoFrameRate.FPS_25,
};

var videoSource = new VirtualVideoSourceBlock(virtualVideoSource);

var virtualAudioSource = new VirtualAudioSourceSettings
{
    Channels = 2,
    SampleRate = 44100,
};

var audioSource = new VirtualAudioSourceBlock(virtualAudioSource);

// H264/AAC encoders
var h264Encoder = new H264EncoderBlock(new OpenH264EncoderSettings());
var aacEncoder = new AACEncoderBlock();

pipeline.Connect(videoSource.Output, h264Encoder.Input);
pipeline.Connect(audioSource.Output, aacEncoder.Input);

// RTMP sink
var sink = new RTMPSinkBlock(new RTMPSinkSettings());
pipeline.Connect(h264Encoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(aacEncoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Audio));

// Start
await pipeline.StartAsync();
```

#### Platforms

Windows, macOS, Linux, iOS, Android.

### Facebook Live

Facebook Live is a feature that allows live streaming of video on Facebook. The livestream can be published to personal profiles, pages, or groups.

Use the `FacebookLiveSinkSettings` class to set the parameters.

#### Block info

Name: FacebookLiveSinkBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | audio/mpeg [1,2,4] | one |
| | audio/x-adpcm | |
| | PCM [U8, S16LE] | |
| | audio/x-speex | |
| | audio/x-mulaw | |
| | audio/x-alaw | |
| | audio/x-nellymoser | |
| Input video | video/x-h264 | one |

#### The sample pipeline

```mermaid
graph LR;
    VirtualVideoSourceBlock-->H264EncoderBlock;
    VirtualAudioSourceBlock-->AACEncoderBlock;
    H264EncoderBlock-->FacebookLiveSinkBlock;
    AACEncoderBlock-->FacebookLiveSinkBlock;
```

#### Sample code

```csharp
// Pipeline
var pipeline = new MediaBlocksPipeline();

// Video and audio sources
var virtualVideoSource = new VirtualVideoSourceSettings
{
    Width = 1280,
    Height = 720,
    FrameRate = VideoFrameRate.FPS_25,
};

var videoSource = new VirtualVideoSourceBlock(virtualVideoSource);

var virtualAudioSource = new VirtualAudioSourceSettings
{
    Channels = 2,
    SampleRate = 44100,
};

var audioSource = new VirtualAudioSourceBlock(virtualAudioSource);

// H264/AAC encoders
var h264Encoder = new H264EncoderBlock(new OpenH264EncoderSettings());
var aacEncoder = new AACEncoderBlock();

pipeline.Connect(videoSource.Output, h264Encoder.Input);
pipeline.Connect(audioSource.Output, aacEncoder.Input);

// Facebook Live sink
var sink = new FacebookLiveSinkBlock(new FacebookLiveSinkSettings(
    "https://facebook.com/rtmp/...",
    "your_stream_key"));
pipeline.Connect(h264Encoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(aacEncoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Audio));

// Start
await pipeline.StartAsync();
```

#### Platforms

Windows, macOS, Linux, iOS, Android.

### HLS

HLS (HTTP Live Streaming) is an HTTP-based adaptive streaming communications protocol developed by Apple. It enables adaptive bitrate streaming by breaking the stream into a sequence of small HTTP-based file segments, typically using MPEG-TS fragments as the container.

Use the `HLSSinkSettings` class to set the parameters.
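Before the full adaptive-bitrate walkthrough below, a minimal single-variant pipeline illustrates the basic segmenting setup. This is a sketch that reuses the block and settings names from the other samples on this page; exact property defaults may differ in your SDK version:

```csharp
var pipeline = new MediaBlocksPipeline();

var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri("test.mp4")));

// One audio and one video encoder (single quality level)
var aac = new AACEncoderBlock(new AACEncoderSettings() { Bitrate = 192 });
var h264 = new H264EncoderBlock(new OpenH264EncoderSettings());
pipeline.Connect(fileSource.AudioOutput, aac.Input);
pipeline.Connect(fileSource.VideoOutput, h264.Input);

// Write 6-second segments and a playlist into ./output/
var hls = new HLSSinkBlock(new HLSSinkSettings("./output/")
{
    PlaylistName = "playlist.m3u8",
    SegmentDuration = 6
});
pipeline.Connect(aac.Output, hls.CreateNewInput(MediaBlockPadMediaType.Audio));
pipeline.Connect(h264.Output, hls.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();
```

For true adaptive streaming you feed several video encoders at different bitrates into the same sink, as shown in the full sample below.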
#### Block info

Name: HLSSinkBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | audio/mpeg | one or more |
| | audio/x-ac3 | |
| | audio/x-alaw | |
| | audio/x-mulaw | |
| | audio/AAC | |
| Input video | video/x-raw | one or more |
| | image/jpeg | |
| | video/x-msmpeg | |
| | video/mpeg | |
| | video/x-h263 | |
| | video/x-h264 | |
| | video/x-h265 | |

#### The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->AACEncoderBlock;
    UniversalSourceBlock-->H264EncoderBlock1;
    UniversalSourceBlock-->H264EncoderBlock2;
    UniversalSourceBlock-->H264EncoderBlock3;
    AACEncoderBlock-->HLSSinkBlock;
    H264EncoderBlock1-->HLSSinkBlock;
    H264EncoderBlock2-->HLSSinkBlock;
    H264EncoderBlock3-->HLSSinkBlock;
```

#### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var audioEncoderBlock = new AACEncoderBlock(new AACEncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input);

// 3 video encoders with different bitrates for adaptive streaming
var videoEncoderBlock1 = new H264EncoderBlock(new OpenH264EncoderSettings { Bitrate = 3000, Width = 1920, Height = 1080 });
var videoEncoderBlock2 = new H264EncoderBlock(new OpenH264EncoderSettings { Bitrate = 1500, Width = 1280, Height = 720 });
var videoEncoderBlock3 = new H264EncoderBlock(new OpenH264EncoderSettings { Bitrate = 800, Width = 854, Height = 480 });

pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock1.Input);
pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock2.Input);
pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock3.Input);

// Configure HLS sink
var hlsSettings = new HLSSinkSettings("./output/")
{
    PlaylistName = "playlist.m3u8",
    SegmentDuration = 6,
    PlaylistType = HLSPlaylistType.Event,
    HTTPServerEnabled = true,
    HTTPServerPort = 8080
};

var sinkBlock = new HLSSinkBlock(hlsSettings);

// Connect audio
pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));

// Connect video variants
pipeline.Connect(videoEncoderBlock1.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video, "1080p"));
pipeline.Connect(videoEncoderBlock2.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video, "720p"));
pipeline.Connect(videoEncoderBlock3.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video, "480p"));

await pipeline.StartAsync();
```

#### Platforms

Windows, macOS, Linux, iOS, Android.

### MJPEG over HTTP

HTTP MJPEG (Motion JPEG) Live is a video streaming format where each video frame is compressed separately as a JPEG image and transmitted over HTTP. It is widely used in IP cameras and webcams due to its simplicity, although it is less efficient than modern codecs.

Use the `HTTPMJPEGLiveSinkSettings` class to set the parameters.

#### Block info

Name: HTTPMJPEGLiveSinkBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input video | video/x-raw | one |
| | image/jpeg | |

#### The sample pipeline

```mermaid
graph LR;
    VirtualVideoSourceBlock-->MJPEGEncoderBlock;
    MJPEGEncoderBlock-->HTTPMJPEGLiveSinkBlock;
```

#### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

// Create virtual video source
var virtualVideoSource = new VirtualVideoSourceSettings
{
    Width = 1280,
    Height = 720,
    FrameRate = VideoFrameRate.FPS_30,
};

var videoSource = new VirtualVideoSourceBlock(virtualVideoSource);

// MJPEG encoder
var mjpegEncoder = new MJPEGEncoderBlock(new MJPEGEncoderSettings { Quality = 80 });
pipeline.Connect(videoSource.Output, mjpegEncoder.Input);

// HTTP MJPEG server
var sink = new HTTPMJPEGLiveSinkBlock(new HTTPMJPEGLiveSinkSettings { Port = 8080, Path = "/stream" });
pipeline.Connect(mjpegEncoder.Output, sink.Input);

// Start
await pipeline.StartAsync();

Console.WriteLine("MJPEG stream available at http://localhost:8080/stream");
Console.WriteLine("Press any key to stop...");
Console.ReadKey();
```

#### Platforms

Windows, macOS, Linux, iOS, Android.

### NDI

NDI (Network Device Interface) is a royalty-free video transport standard developed by NewTek that enables video-compatible products to communicate, deliver, and receive broadcast-quality, low-latency video over standard Ethernet networks.

Use the `NDISinkSettings` class to set the parameters.

#### Block info

Name: NDISinkBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | audio/x-raw | one |
| Input video | video/x-raw | one |

#### The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->NDISinkBlock;
```

#### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var sinkBlock = new NDISinkBlock(new NDISinkSettings("My NDI Stream"));
pipeline.Connect(fileSource.AudioOutput, sinkBlock.AudioInput);
pipeline.Connect(fileSource.VideoOutput, sinkBlock.VideoInput);

await pipeline.StartAsync();
```

#### Platforms

Windows, macOS, Linux.

### SRT

SRT (Secure Reliable Transport) is an open-source video transport protocol that enables the delivery of high-quality, secure, low-latency video across unpredictable networks like the public internet. It was developed by Haivision.

Use the `SRTSinkSettings` class to set the parameters.

#### Block info

Name: SRTSinkBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input | Any stream format | 1 |

#### The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->MP4MuxerBlock;
    MP4MuxerBlock-->SRTSinkBlock;
```

#### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

// Create a multiplexer block to combine audio and video
var muxer = new MP4MuxerBlock();
pipeline.Connect(fileSource.AudioOutput, muxer.CreateNewInput(MediaBlockPadMediaType.Audio));
pipeline.Connect(fileSource.VideoOutput, muxer.CreateNewInput(MediaBlockPadMediaType.Video));

// Create SRT sink in caller mode (connecting to a listener)
var srtSettings = new SRTSinkSettings
{
    Host = "srt-server.example.com",
    Port = 1234,
    Mode = SRTMode.Caller,
    Latency = 200, // milliseconds
    Passphrase = "optional-encryption-passphrase"
};

var srtSink = new SRTSinkBlock(srtSettings);
pipeline.Connect(muxer.Output, srtSink.Input);

await pipeline.StartAsync();
```

#### Platforms

Windows, macOS, Linux, iOS, Android.

### SRT MPEG-TS

SRT MPEG-TS is a combination of the SRT transport protocol with the MPEG-TS container format. This allows secure, reliable transport of MPEG-TS streams over public networks, which is useful for broadcast and professional video workflows.

Use the `SRTMPEGTSSinkSettings` class to set the parameters.

#### Block info

Name: SRTMPEGTSSinkBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | audio/x-raw | one or more |
| | audio/mpeg | |
| | audio/x-ac3 | |
| | audio/x-alaw | |
| | audio/x-mulaw | |
| | audio/AAC | |
| Input video | video/x-raw | one or more |
| | image/jpeg | |
| | video/x-msmpeg | |
| | video/mpeg | |
| | video/x-h263 | |
| | video/x-h264 | |
| | video/x-h265 | |

#### The sample pipeline

```mermaid
graph LR;
    UniversalSourceBlock-->AACEncoderBlock;
    UniversalSourceBlock-->H264EncoderBlock;
    AACEncoderBlock-->SRTMPEGTSSinkBlock;
    H264EncoderBlock-->SRTMPEGTSSinkBlock;
```

#### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var audioEncoderBlock = new AACEncoderBlock(new AACEncoderSettings() { Bitrate = 192 });
pipeline.Connect(fileSource.AudioOutput, audioEncoderBlock.Input);

var videoEncoderBlock = new H264EncoderBlock(new OpenH264EncoderSettings());
pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input);

// Configure SRT MPEG-TS sink
var srtMpegtsSinkSettings = new SRTMPEGTSSinkSettings
{
    Host = "srt-server.example.com",
    Port = 1234,
    Mode = SRTMode.Caller,
    Latency = 200,
    Passphrase = "optional-encryption-passphrase"
};

var sinkBlock = new SRTMPEGTSSinkBlock(srtMpegtsSinkSettings);
pipeline.Connect(audioEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Audio));
pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();
```

#### Platforms

Windows, macOS, Linux, iOS, Android.

### YouTube Live

YouTube Live is a live streaming service provided by YouTube. It allows creators to broadcast live videos to their audience through the YouTube platform.

Use the `YouTubeSinkSettings` class to set the parameters.

#### Block info

Name: YouTubeSinkBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | audio/mpeg [1,2,4] | one |
| | audio/x-adpcm | |
| | PCM [U8, S16LE] | |
| | audio/x-speex | |
| | audio/x-mulaw | |
| | audio/x-alaw | |
| | audio/x-nellymoser | |
| Input video | video/x-h264 | one |

#### The sample pipeline

```mermaid
graph LR;
    VirtualVideoSourceBlock-->H264EncoderBlock;
    VirtualAudioSourceBlock-->AACEncoderBlock;
    H264EncoderBlock-->YouTubeSinkBlock;
    AACEncoderBlock-->YouTubeSinkBlock;
```

#### Sample code

```csharp
// Pipeline
var pipeline = new MediaBlocksPipeline();

// Video and audio sources
var virtualVideoSource = new VirtualVideoSourceSettings
{
    Width = 1920,
    Height = 1080,
    FrameRate = VideoFrameRate.FPS_30,
};

var videoSource = new VirtualVideoSourceBlock(virtualVideoSource);

var virtualAudioSource = new VirtualAudioSourceSettings
{
    Channels = 2,
    SampleRate = 48000,
};

var audioSource = new VirtualAudioSourceBlock(virtualAudioSource);

// H264/AAC encoders
var h264Settings = new OpenH264EncoderSettings
{
    Bitrate = 4000,      // 4 Mbps for 1080p
    KeyframeInterval = 2 // Keyframe every 2 seconds
};

var h264Encoder = new H264EncoderBlock(h264Settings);

var aacSettings = new AACEncoderSettings
{
    Bitrate = 192 // 192 kbps for audio
};

var aacEncoder = new AACEncoderBlock(aacSettings);

pipeline.Connect(videoSource.Output, h264Encoder.Input);
pipeline.Connect(audioSource.Output, aacEncoder.Input);

// YouTube Live sink
var sink = new YouTubeSinkBlock(new YouTubeSinkSettings(
    "rtmp://a.rtmp.youtube.com/live2/",
    "your_youtube_stream_key"));
pipeline.Connect(h264Encoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Video));
pipeline.Connect(aacEncoder.Output, sink.CreateNewInput(MediaBlockPadMediaType.Audio));

// Start
await pipeline.StartAsync();
```

#### Platforms

Windows, macOS, Linux, iOS, Android.

### Shoutcast

`Shoutcast` is a service for streaming media over the internet to media players, using its own cross-platform proprietary software.
It allows digital audio content, primarily in MP3 or High-Efficiency Advanced Audio Coding (HE-AAC) format, to be broadcast. The most common use of Shoutcast is for creating or listening to Internet audio broadcasts.

Use the `ShoutcastSinkSettings` class to set the parameters.

#### Block info

Name: ShoutcastSinkBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input audio | audio/mpeg | one |
| | audio/aac | |
| | audio/x-aac | |

#### The sample pipeline

```mermaid
graph LR;
    subgraph MainPipeline
        direction LR
        A[Audio Source e.g. UniversalSourceBlock or VirtualAudioSourceBlock] --> B{Optional Audio Encoder e.g. MP3EncoderBlock};
        B --> C[ShoutcastSinkBlock];
    end
    subgraph AlternativeIfSourceEncoded
        A2[Encoded Audio Source] --> C2[ShoutcastSinkBlock];
    end
```

#### Sample code

```csharp
// Pipeline
var pipeline = new MediaBlocksPipeline();

// Audio source (e.g., from a file with MP3/AAC or raw audio)
var universalSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri("input.mp3")));
// Or use VirtualAudioSourceBlock for live raw audio input:
// var audioSource = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings { Channels = 2, SampleRate = 44100 });

// Optional: audio encoder (if the source is raw audio or needs re-encoding for Shoutcast)
// Example: MP3EncoderBlock if the Shoutcast server expects MP3
var mp3Encoder = new MP3EncoderBlock(new MP3EncoderSettings() { Bitrate = 128000 }); // Bitrate in bps
pipeline.Connect(universalSource.AudioOutput, mp3Encoder.Input);
// If using VirtualAudioSourceBlock: pipeline.Connect(audioSource.Output, mp3Encoder.Input);

// Shoutcast sink
// Configure the Shoutcast/Icecast server connection details
var shoutcastSettings = new ShoutcastSinkSettings
{
    IP = "your-shoutcast-server-ip", // Server hostname or IP address
    Port = 8000,                     // Server port
    Mount = "/mountpoint",           // Mount point (e.g., "/stream", "/live.mp3")
    Password = "your-password",      // Source password for the server

    Protocol = ShoutProtocol.ICY,    // ShoutProtocol.ICY for Shoutcast v1/v2 (e.g., icy://)
                                     // ShoutProtocol.HTTP for Icecast 2.x (e.g., http://)
                                     // ShoutProtocol.XAudiocast for older Shoutcast/XAudioCast

    // Metadata for the stream
    StreamName = "My Radio Stream",
    Genre = "Various",
    Description = "My awesome internet radio station",
    URL = "http://my-radio-website.com", // Homepage URL for your stream (shows up in directory metadata)
    Public = true,                       // Set to true to list on public directories (if server supports)
    Username = "source"                  // Username for authentication (often "source"; check server config)

    // Other stream parameters like audio bitrate, samplerate, channels are typically determined
    // by the properties of the encoded input audio stream fed to the ShoutcastSinkBlock.
};

var shoutcastSink = new ShoutcastSinkBlock(shoutcastSettings);

// Connect the encoder's output (or the source's audio output if already encoded and compatible) to the Shoutcast sink
pipeline.Connect(mp3Encoder.Output, shoutcastSink.Input);
// If the source is already encoded and compatible (e.g. MP3 file to MP3 Shoutcast):
// pipeline.Connect(universalSource.AudioOutput, shoutcastSink.Input);

// Start the pipeline
await pipeline.StartAsync();

// For display purposes, you can construct a string representing the connection:
string protocolScheme = shoutcastSettings.Protocol switch
{
    ShoutProtocol.ICY => "icy",
    ShoutProtocol.HTTP => "http",
    ShoutProtocol.XAudiocast => "xaudiocast", // Note: the actual scheme might be http for XAudiocast
    _ => "unknown"
};

Console.WriteLine($"Streaming to Shoutcast server: {protocolScheme}://{shoutcastSettings.IP}:{shoutcastSettings.Port}{shoutcastSettings.Mount}");
Console.WriteLine($"Stream metadata URL (for directories): {shoutcastSettings.URL}");

Console.WriteLine("Press any key to stop the stream...");
Console.ReadKey();

// Stop the pipeline (important for graceful disconnection and resource cleanup)
await pipeline.StopAsync();
```

#### Platforms

Windows, macOS, Linux, iOS, Android.

---END OF PAGE---

# Local File: .\dotnet\mediablocks\Sources\index.md

---
title: .Net Media Source Blocks Guide
description: Explore a complete guide to .Net Media SDK source blocks. Learn about hardware, file, network, and virtual sources for your media processing pipelines.
sidebar_label: Sources
---

# Source Blocks - VisioForge Media Blocks SDK .Net

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

Source blocks provide data to the pipeline and are typically the first blocks in any media processing chain. VisioForge Media Blocks SDK .Net provides a comprehensive collection of source blocks for various inputs, including hardware devices, files, networks, and virtual sources.

## Hardware Source Blocks

### System Video Source

SystemVideoSourceBlock is used to access webcams and other video capture devices.

#### Block info

Name: SystemVideoSourceBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Output video | uncompressed video | 1 |

#### Enumerate available devices

Use the `DeviceEnumerator.Shared.VideoSourcesAsync()` method to get a list of available devices and their specifications: available resolutions, frame rates, and video formats. This method returns a list of `VideoCaptureDeviceInfo` objects. Each `VideoCaptureDeviceInfo` object provides detailed information about a capture device.

#### The sample pipeline

```mermaid
graph LR;
    SystemVideoSourceBlock-->VideoRendererBlock;
```

#### Sample code

```csharp
// create pipeline
var pipeline = new MediaBlocksPipeline();

// create video source
VideoCaptureDeviceSourceSettings videoSourceSettings = null;

// select the first device
var device = (await DeviceEnumerator.Shared.VideoSourcesAsync())[0];
if (device != null)
{
    // select the first format (maybe not the best, but it is just a sample)
    var formatItem = device.VideoFormats[0];
    if (formatItem != null)
    {
        videoSourceSettings = new VideoCaptureDeviceSourceSettings(device)
        {
            Format = formatItem.ToFormat()
        };

        // select the first frame rate
        videoSourceSettings.Format.FrameRate = formatItem.FrameRateList[0];
    }
}

// create video source block using the selected device and format
var videoSource = new SystemVideoSourceBlock(videoSourceSettings);

// create video renderer block
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);

// connect blocks
pipeline.Connect(videoSource.Output, videoRenderer.Input);

// start pipeline
await pipeline.StartAsync();
```

#### Sample applications

- [Simple Video Capture Demo (WPF)](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Simple%20Capture%20Demo)

#### Remarks

You can specify an API to use during the device enumeration (refer to the `VideoCaptureDeviceAPI` enum description under `SystemVideoSourceBlock` for typical values).
Android and iOS platforms have only one API, while Windows and Linux have multiple APIs.

#### Platforms

Windows, macOS, Linux, iOS, Android.

### System Audio Source

SystemAudioSourceBlock is used to access microphones and other audio capture devices.

#### Block info

Name: SystemAudioSourceBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Output audio | uncompressed audio | 1 |

#### Enumerate available devices

Use the `DeviceEnumerator.Shared.AudioSourcesAsync()` method to get a list of available devices and their specifications. You can then select a device and one of its formats to create the source settings.

#### The sample pipeline

```mermaid
graph LR;
    SystemAudioSourceBlock-->AudioRendererBlock;
```

#### Sample code

```csharp
// create pipeline
var pipeline = new MediaBlocksPipeline();

// create audio source block
IAudioCaptureDeviceSourceSettings audioSourceSettings = null;

// select first device
var device = (await DeviceEnumerator.Shared.AudioSourcesAsync())[0];
if (device != null)
{
    // select first format
    var formatItem = device.Formats[0];
    if (formatItem != null)
    {
        audioSourceSettings = device.CreateSourceSettings(formatItem.ToFormat());
    }
}

// create audio source block using selected device and format
var audioSource = new SystemAudioSourceBlock(audioSourceSettings);

// create audio renderer block
var audioRenderer = new AudioRendererBlock();

// connect blocks
pipeline.Connect(audioSource.Output, audioRenderer.Input);

// start pipeline
await pipeline.StartAsync();
```

#### Capture audio from speakers (loopback)

Currently, loopback audio capture is supported only on Windows. Use the `LoopbackAudioCaptureDeviceSourceSettings` class to create the source settings for loopback audio capture. WASAPI2 is used as the default API for loopback audio capture. You can specify the API to use during device enumeration.
```csharp
// create pipeline
var pipeline = new MediaBlocksPipeline();

// create audio source block
var deviceItem = (await DeviceEnumerator.Shared.AudioOutputsAsync(AudioOutputDeviceAPI.WASAPI2))[0];
if (deviceItem == null)
{
    return;
}

var audioSourceSettings = new LoopbackAudioCaptureDeviceSourceSettings(deviceItem);
var audioSource = new SystemAudioSourceBlock(audioSourceSettings);

// create audio renderer block
var audioRenderer = new AudioRendererBlock();

// connect blocks
pipeline.Connect(audioSource.Output, audioRenderer.Input);

// start pipeline
await pipeline.StartAsync();
```

#### Sample applications

- [Audio Capture Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Audio%20Capture%20Demo)
- [Simple Capture Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Simple%20Capture%20Demo)

#### Remarks

You can specify an API to use during the device enumeration. Android and iOS platforms have only one API, while Windows and Linux have multiple APIs.

#### Platforms

Windows, macOS, Linux, iOS, Android.

### Basler Source Block

The Basler source block supports Basler USB3 Vision and GigE cameras. The Pylon SDK or Runtime should be installed to use the camera source.

#### Block info

Name: BaslerSourceBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Output video | Uncompressed | 1 |

#### The sample pipeline

```mermaid
graph LR;
    BaslerSourceBlock-->VideoRendererBlock;
```

#### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

// get Basler source info by enumerating sources
var sources = await DeviceEnumerator.Shared.BaslerSourcesAsync();
var sourceInfo = sources[0];

// create Basler source
var source = new BaslerSourceBlock(new BaslerSourceSettings(sourceInfo));

// create video renderer for VideoView
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);

// connect
pipeline.Connect(source.Output, videoRenderer.Input);

// start
await pipeline.StartAsync();
```

#### Sample applications

- [Basler Source Demo (WPF)](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Basler%20Source%20Demo)

#### Platforms

Windows, Linux.

### Spinnaker/FLIR Source Block

The Spinnaker/FLIR source supports connection to FLIR cameras using the Spinnaker SDK.

To use the `SpinnakerSourceBlock`, you first need to enumerate available Spinnaker cameras and then configure the source using `SpinnakerSourceSettings`.

#### Enumerate Devices & `SpinnakerCameraInfo`

Use `DeviceEnumerator.Shared.SpinnakerSourcesAsync()` to get a list of `SpinnakerCameraInfo` objects. Each `SpinnakerCameraInfo` provides details about a detected camera:

- `Name` (string): Unique identifier or name of the camera. Often a serial number or model-serial combination.
- `NetworkInterfaceName` (string): Name of the network interface if it's a GigE camera.
- `Vendor` (string): Camera vendor name.
- `Model` (string): Camera model name.
- `SerialNumber` (string): Camera's serial number.
- `FirmwareVersion` (string): Camera's firmware version.
- `SensorSize` (`Size`): Reports the sensor dimensions (Width, Height). You might need to call a method on `SpinnakerCameraInfo` like `ReadInfo()` (if available, or implied by enumeration) to populate this.
- `WidthMax` (int): Maximum sensor width.
- `HeightMax` (int): Maximum sensor height.

You select a `SpinnakerCameraInfo` object from the list to initialize `SpinnakerSourceSettings`.

#### Settings

The `SpinnakerSourceBlock` is configured using `SpinnakerSourceSettings`. Key properties:

- `Name` (string): The name of the camera (from `SpinnakerCameraInfo.Name`) to use.
- `Region` (`Rect`): Defines the Region of Interest (ROI) to capture from the camera sensor. Set X, Y, Width, Height.
- `FrameRate` (`VideoFrameRate`): The desired frame rate.
- `PixelFormat` (`SpinnakerPixelFormat` enum): The desired pixel format (e.g., `RGB`, `Mono8`, `BayerRG8`). Default `RGB`.
- `OffsetX` (int): X offset for the ROI on the sensor (default 0). Often implicitly part of `Region.X`.
- `OffsetY` (int): Y offset for the ROI on the sensor (default 0). Often implicitly part of `Region.Y`.
- `ExposureMinimum` (int): Minimum exposure time for the auto-exposure algorithm (microseconds, e.g., 10-29999999). Default 0 (auto/camera default).
- `ExposureMaximum` (int): Maximum exposure time for the auto-exposure algorithm (microseconds). Default 0 (auto/camera default).
- `ShutterType` (`SpinnakerSourceShutterType` enum): Type of shutter (e.g., `Rolling`, `Global`). Default `Rolling`.

Constructor: `SpinnakerSourceSettings(string deviceName, Rect region, VideoFrameRate frameRate, SpinnakerPixelFormat pixelFormat = SpinnakerPixelFormat.RGB)`

#### Block info

Name: SpinnakerSourceBlock.
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Output video | various | one or more | #### The sample pipeline `SpinnakerSourceBlock:Output` → `VideoRendererBlock` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var sources = await DeviceEnumerator.Shared.SpinnakerSourcesAsync(); var sourceSettings = new SpinnakerSourceSettings(sources[0].Name, new VisioForge.Core.Types.Rect(0, 0, 1280, 720), new VideoFrameRate(10)); var source = new SpinnakerSourceBlock(sourceSettings); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(source.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` #### Requirements - Spinnaker SDK installed. #### Platforms Windows ### Allied Vision Source Block The Allied Vision Source Block enables integration with Allied Vision cameras using the Vimba SDK. It allows capturing video streams from these industrial cameras. #### Block info Name: AlliedVisionSourceBlock. | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; AlliedVisionSourceBlock-->VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Enumerate Allied Vision cameras var alliedVisionCameras = await DeviceEnumerator.Shared.AlliedVisionSourcesAsync(); if (alliedVisionCameras.Count == 0) { Console.WriteLine("No Allied Vision cameras found."); return; } var cameraInfo = alliedVisionCameras[0]; // Select the first camera // Create Allied Vision source settings // Width, height, x, y are optional and depend on whether you want to set a specific ROI // If null, it might use default/full sensor resolution. Camera.ReadInfo() should be called. 
cameraInfo.ReadInfo(); // Ensure camera info like Width/Height is read var alliedVisionSettings = new AlliedVisionSourceSettings( cameraInfo, width: cameraInfo.Width, // Or a specific ROI width height: cameraInfo.Height // Or a specific ROI height ); // Optionally configure other settings alliedVisionSettings.ExposureAuto = VmbSrcExposureAutoModes.Continuous; alliedVisionSettings.Gain = 10; // Example gain value var alliedVisionSource = new AlliedVisionSourceBlock(alliedVisionSettings); // Create video renderer var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 is your display control // Connect blocks pipeline.Connect(alliedVisionSource.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` #### Requirements - Allied Vision Vimba SDK must be installed. #### Sample applications - Refer to samples demonstrating industrial camera integration if available. #### Platforms Windows, macOS, Linux. ### Blackmagic Decklink Source Block For information about Decklink sources, see [Decklink](../Decklink/index.md). ## File Source Blocks ### Universal Source Block A universal source that decodes video and audio files/network streams and provides uncompressed data to the connected blocks. Block supports MP4, WebM, AVI, TS, MKV, MP3, AAC, M4A, and many other formats. If FFMPEG redist is available, all decoders available in FFMPEG will also be supported. #### Settings The `UniversalSourceBlock` is configured through `UniversalSourceSettings`. It's recommended to create settings using the static factory method `await UniversalSourceSettings.CreateAsync(...)`. Key properties and parameters for `UniversalSourceSettings`: - **URI/Filename**: - `UniversalSourceSettings.CreateAsync(string filename, bool renderVideo = true, bool renderAudio = true, bool renderSubtitle = false)`: Creates settings from a local file path. 
- `UniversalSourceSettings.CreateAsync(System.Uri uri, bool renderVideo = true, bool renderAudio = true, bool renderSubtitle = false)`: Creates settings from a `System.Uri` (can be a file URI or a network URI such as HTTP or RTSP, though dedicated blocks are often preferred for network streams). On iOS, a `Foundation.NSUrl` is used.
  - The `renderVideo`, `renderAudio`, and `renderSubtitle` booleans control which streams are processed. The `CreateAsync` method may update these based on actual stream availability in the media file/stream if `ignoreMediaInfoReader` is `false` (default).
- `StartPosition` (`TimeSpan?`): Sets the starting position for playback.
- `StopPosition` (`TimeSpan?`): Sets the stopping position for playback.
- `VideoCustomFrameRate` (`VideoFrameRate?`): If set, video frames will be dropped or duplicated to match this custom frame rate.
- `UseAdvancedEngine` (bool): If `true` (the default, except on Android where it's `false`), uses an advanced engine with stream selection support.
- `DisableHWDecoders` (bool): If `true` (default `false`, except on Android where it's `true`), hardware-accelerated decoders are disabled, forcing software decoding.
- `MPEGTSProgramNumber` (int): For MPEG-TS streams, specifies the program number to select (default -1, meaning automatic selection or first program).
- `ReadInfoAsync()`: Asynchronously reads media file information (`MediaFileInfo`). This is called internally by `CreateAsync` unless `ignoreMediaInfoReader` is `true`.
- `GetInfo()`: Gets the cached `MediaFileInfo`.

The `UniversalSourceBlock` itself is then instantiated with these settings: `new UniversalSourceBlock(settings)`. The `Filename` property on a `UniversalSourceBlock` instance (as seen in older examples) is a shortcut that internally creates basic `UniversalSourceSettings`. Using `UniversalSourceSettings.CreateAsync` provides more control.

#### Block info

Name: UniversalSourceBlock.
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Output audio | depends on decoder | one or more |
| Output video | depends on decoder | one or more |
| Output subtitle | depends on decoder | one or more |

#### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->VideoRendererBlock;
UniversalSourceBlock-->AudioRendererBlock;
```

#### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

// use the recommended factory method to create the settings
var settings = await UniversalSourceSettings.CreateAsync("test.mp4");
var fileSource = new UniversalSourceBlock(settings);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(fileSource.VideoOutput, videoRenderer.Input);

var audioRenderer = new AudioRendererBlock();
pipeline.Connect(fileSource.AudioOutput, audioRenderer.Input);

await pipeline.StartAsync();
```

#### Sample applications

- [Simple Player Demo (WPF)](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Simple%20Player%20Demo%20WPF)

#### Platforms

Windows, macOS, Linux, iOS, Android.

### Subtitle Source Block

The Subtitle Source Block loads subtitles from a file and outputs them as a subtitle stream, which can then be overlaid on video or rendered separately.

#### Block info

Name: `SubtitleSourceBlock`.

| Pin direction | Media type | Pins count |
|-----------------|:--------------------:|:-----------:|
| Output subtitle | Subtitle data | 1 |

#### Settings

The `SubtitleSourceBlock` is configured using `SubtitleSourceSettings`. Key properties include:

- `Filename` (string): The path to the subtitle file (e.g., .srt, .ass).
#### The sample pipeline ```mermaid graph LR; UniversalSourceBlock --> SubtitleOverlayBlock; SubtitleSourceBlock --> SubtitleOverlayBlock; SubtitleOverlayBlock --> VideoRendererBlock; UniversalSourceBlock --> AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Create subtitle source settings var subtitleSettings = new SubtitleSourceSettings("path/to/your/subtitles.srt"); var subtitleSource = new SubtitleSourceBlock(subtitleSettings); // Example: Overlaying subtitles on a video from UniversalSourceBlock var fileSource = await UniversalSourceSettings.CreateAsync("path/to/your/video.mp4"); var universalSource = new UniversalSourceBlock(fileSource); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); var audioRenderer = new AudioRendererBlock(); // This is a conceptual overlay. Actual implementation might need a specific subtitle overlay block. // For simplicity, let's assume a downstream block can consume a subtitle stream, // or you connect it to a block that renders subtitles on the video. // Example with a hypothetical SubtitleOverlayBlock: // var subtitleOverlay = new SubtitleOverlayBlock(); // Assuming such a block exists // pipeline.Connect(universalSource.VideoOutput, subtitleOverlay.VideoInput); // pipeline.Connect(subtitleSource.Output, subtitleOverlay.SubtitleInput); // pipeline.Connect(subtitleOverlay.Output, videoRenderer.Input); // pipeline.Connect(universalSource.AudioOutput, audioRenderer.Input); // For a simple player without explicit overlay shown here: pipeline.Connect(universalSource.VideoOutput, videoRenderer.Input); pipeline.Connect(universalSource.AudioOutput, audioRenderer.Input); // How subtitles from subtitleSource.Output are used would depend on the rest of the pipeline design. // This block primarily provides the subtitle stream. Console.WriteLine("Subtitle source created. 
Connect its output to a compatible block like a subtitle overlay or renderer."); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android (Depends on subtitle parsing capabilities). ### Stream Source Block The Stream Source Block allows reading media data from a `System.IO.Stream`. This is useful for playing media from memory, embedded resources, or custom stream providers without needing a temporary file. The format of the data within the stream must be parsable by the underlying media framework (GStreamer). #### Block info Name: `StreamSourceBlock`. (Pin information is dynamic, similar to `UniversalSourceBlock`, based on stream content. Typically, it would have an output that connects to a demuxer/decoder like `DecodeBinBlock`, or provide decoded audio/video pins if it includes demuxing/decoding capabilities.) | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Output data | Varies (raw stream)| 1 | _Alternatively, if it decodes:_ | Output video | Depends on stream | 0 or 1 | | Output audio | Depends on stream | 0 or 1+ | #### Settings The `StreamSourceBlock` is typically instantiated directly with a `System.IO.Stream`. The `StreamSourceSettings` class serves as a wrapper to provide this stream. - `Stream` (`System.IO.Stream`): The input stream containing the media data. The stream must be readable and, if seeking is required by the pipeline, seekable. 
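Because the block accepts any readable `System.IO.Stream`, media embedded in the application assembly can be played without writing a temporary file. A minimal sketch — the resource name `MyApp.Resources.intro.mp4` is a hypothetical placeholder, not a name from this SDK:

```csharp
using System.IO;
using System.Reflection;

// "MyApp.Resources.intro.mp4" is a placeholder; use your actual manifest resource name.
// Manifest resource streams are readable and seekable, which suits pipelines that seek.
Stream resourceStream = Assembly.GetExecutingAssembly()
    .GetManifestResourceStream("MyApp.Resources.intro.mp4");

var streamSource = new StreamSourceBlock(resourceStream);

// Connect streamSource.Output to a DecodeBinBlock and renderers as in the sample below,
// and dispose of the stream only after the pipeline has been stopped and disposed.
```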
#### The sample pipeline If `StreamSourceBlock` outputs raw data that needs decoding: ```mermaid graph LR; StreamSourceBlock -- Stream Data --> DecodeBinBlock; DecodeBinBlock -- Video Output --> VideoRendererBlock; DecodeBinBlock -- Audio Output --> AudioRendererBlock; ``` If `StreamSourceBlock` handles decoding internally (less common for a generic stream source): ```mermaid graph LR; StreamSourceBlock -- Video Output --> VideoRendererBlock; StreamSourceBlock -- Audio Output --> AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Example: Load a video file into a MemoryStream byte[] fileBytes = File.ReadAllBytes("path/to/your/video.mp4"); var memoryStream = new MemoryStream(fileBytes); // StreamSourceSettings is a container for the stream. var streamSettings = new StreamSourceSettings(memoryStream); // The CreateBlock method of StreamSourceSettings would typically return new StreamSourceBlock(streamSettings.Stream) var streamSource = streamSettings.CreateBlock() as StreamSourceBlock; // Or, more directly: var streamSource = new StreamSourceBlock(memoryStream); // Create video and audio renderers var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 var audioRenderer = new AudioRendererBlock(); // Connect outputs. Commonly, a StreamSourceBlock provides raw data to a DecodeBinBlock. var decodeBin = new DecodeBinBlock(); pipeline.Connect(streamSource.Output, decodeBin.Input); // Assuming a single 'Output' pin on StreamSourceBlock pipeline.Connect(decodeBin.VideoOutput, videoRenderer.Input); pipeline.Connect(decodeBin.AudioOutput, audioRenderer.Input); await pipeline.StartAsync(); // Important: Ensure the stream remains open and valid for the duration of playback. // Dispose of the stream when the pipeline is stopped or disposed. // Consider this in relation to pipeline.DisposeAsync() or similar cleanup. 
// memoryStream.Dispose(); // Typically after pipeline.StopAsync() and pipeline.DisposeAsync() ``` #### Remarks The `StreamSourceBlock` itself will attempt to read from the provided stream. The success of playback depends on the format of the data in the stream and the availability of appropriate demuxers and decoders in the subsequent parts of the pipeline (often managed via `DecodeBinBlock`). #### Platforms Windows, macOS, Linux, iOS, Android. ### CDG Source Block The CDG Source Block is designed to play CD+G (Compact Disc + Graphics) files, commonly used for karaoke. It decodes both the audio track and the low-resolution graphics stream. #### Block info Name: CDGSourceBlock. | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Output audio | Uncompressed audio | 1 | | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; CDGSourceBlock -- Audio --> AudioRendererBlock; CDGSourceBlock -- Video --> VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Create CDG source settings var cdgSettings = new CDGSourceSettings( "path/to/your/file.cdg", // Path to the CDG graphics file "path/to/your/file.mp3" // Path to the corresponding audio file (MP3, WAV, etc.) ); // If audioFilename is null or empty, audio will be ignored. 
var cdgSource = new CDGSourceBlock(cdgSettings); // Create video renderer var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 is your display control pipeline.Connect(cdgSource.VideoOutput, videoRenderer.Input); // Create audio renderer (if audio is to be played) if (!string.IsNullOrEmpty(cdgSettings.AudioFilename) && cdgSource.AudioOutput != null) { var audioRenderer = new AudioRendererBlock(); pipeline.Connect(cdgSource.AudioOutput, audioRenderer.Input); } // Start pipeline await pipeline.StartAsync(); ``` #### Remarks Requires both a `.cdg` file for graphics and a separate audio file (e.g., MP3, WAV) for the music. #### Platforms Windows, macOS, Linux, iOS, Android. ## Network Source Blocks ### VNC Source Block The VNC Source Block allows capturing video from a VNC (Virtual Network Computing) or RFB (Remote Framebuffer) server. This is useful for streaming the desktop of a remote machine. #### Block info Name: `VNCSourceBlock`. | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Output video | Uncompressed video | 1 | #### Settings The `VNCSourceBlock` is configured using `VNCSourceSettings`. Key properties include: - `Host` (string): The hostname or IP address of the VNC server. - `Port` (int): The port number of the VNC server. - `Password` (string): The password for VNC server authentication, if required. - `Uri` (string): Alternatively, a full RFB URI (e.g., "rfb://host:port"). - `Width` (int): Desired output width. The block may connect to a VNC server that provides specific dimensions. - `Height` (int): Desired output height. - `Shared` (bool): Whether to share the desktop with other clients (default `true`). - `ViewOnly` (bool): If `true`, no input (mouse/keyboard) is sent to the VNC server (default `false`). - `Incremental` (bool): Whether to use incremental updates (default `true`). - `UseCopyrect` (bool): Whether to use copyrect encoding (default `false`). 
- `RFBVersion` (string): RFB protocol version (default "3.3").
- `OffsetX` (int): X offset for screen scraping.
- `OffsetY` (int): Y offset for screen scraping.

#### The sample pipeline

```mermaid
graph LR;
VNCSourceBlock-->VideoRendererBlock;
```

#### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

// Configure VNC source settings
var vncSettings = new VNCSourceSettings
{
    Host = "your-vnc-server-ip", // or use Uri
    Port = 5900, // Standard VNC port
    Password = "your-password", // if any
    // Width = 1920, // Optional: desired width
    // Height = 1080, // Optional: desired height
};

var vncSource = new VNCSourceBlock(vncSettings);

// Create video renderer
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 is your display control

// Connect blocks
pipeline.Connect(vncSource.Output, videoRenderer.Input);

// Start pipeline
await pipeline.StartAsync();
```

#### Platforms

Windows, macOS, Linux (depends on underlying GStreamer VNC plugin availability).

### RTSP Source Block

The RTSP source supports connection to IP cameras and other devices supporting the RTSP protocol.

Supported video codecs: H264, HEVC, MJPEG. Supported audio codecs: AAC, MP3, PCM, G726, G711, and some others if the FFMPEG redist is installed.

#### Block info

Name: RTSPSourceBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Output audio | depends on decoder | one or more |
| Output video | depends on decoder | one or more |
| Output subtitle | depends on decoder | one or more |

#### Settings

The `RTSPSourceBlock` is configured using `RTSPSourceSettings`. Key properties include:

- `Uri`: The RTSP URL of the stream.
- `Login`: Username for RTSP authentication, if required.
- `Password`: Password for RTSP authentication, if required.
- `AudioEnabled`: A boolean indicating whether to attempt to process the audio stream.
- `Latency`: Specifies the buffering duration for the incoming stream (default is 1000 ms).
- `AllowedProtocols`: Defines the transport protocols to be used for receiving the stream. It's a flags enum `RTSPSourceProtocol` with values: - `UDP`: Stream data over UDP. - `UDP_Multicast`: Stream data over UDP multicast. - `TCP` (Recommended): Stream data over TCP. - `HTTP`: Stream data tunneled over HTTP. - `EnableTLS`: Encrypt TCP and HTTP with TLS (use `rtsps://` or `httpsps://` in URI). - `DoRTCP`: Enables RTCP (RTP Control Protocol) for stream statistics and control (default is usually true). - `RTPBlockSize`: Specifies the size of RTP blocks. - `UDPBufferSize`: Buffer size for UDP transport. - `CustomVideoDecoder`: Allows specifying a custom GStreamer video decoder element name if the default is not suitable. - `UseGPUDecoder`: If set to `true`, the SDK will attempt to use a hardware-accelerated GPU decoder if available. - `CompatibilityMode`: If `true`, the SDK will not try to read camera information before attempting to play, which can be useful for problematic streams. - `EnableRAWVideoAudioEvents`: If `true`, enables events for raw (undecoded) video and audio sample data. It's recommended to initialize `RTSPSourceSettings` using the static factory method `RTSPSourceSettings.CreateAsync(Uri uri, string login, string password, bool audioEnabled, bool readInfo = true)`. This method can also handle ONVIF discovery if the URI points to an ONVIF device service. Setting `readInfo` to `false` enables `CompatibilityMode`. 
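As a hedged sketch of how several of these options combine — say, for a camera that streams unreliably over UDP — the values below are illustrative assumptions, not recommendations:

```csharp
// readInfo: false skips probing the camera before playback (enables CompatibilityMode).
var settings = await RTSPSourceSettings.CreateAsync(
    new Uri("rtsp://192.168.1.64:554/stream"), // illustrative camera URL
    "login", "pwd",
    audioEnabled: false,
    readInfo: false);

settings.AllowedProtocols = RTSPSourceProtocol.TCP;  // TCP is the recommended transport
settings.Latency = TimeSpan.FromMilliseconds(2000);  // larger buffer for jittery networks
settings.UseGPUDecoder = true;                       // try hardware-accelerated decoding

var source = new RTSPSourceBlock(settings);
```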
#### The sample pipeline ```mermaid graph LR; RTSPSourceBlock-->VideoRendererBlock; RTSPSourceBlock-->AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // It's recommended to use CreateAsync to initialize settings var rtspSettings = await RTSPSourceSettings.CreateAsync( new Uri("rtsp://login:pwd@192.168.1.64:554/Streaming/Channels/101?transportmode=unicast&profile=Profile_1"), "login", "pwd", audioEnabled: true); // Optionally, configure more settings // rtspSettings.Latency = TimeSpan.FromMilliseconds(500); // rtspSettings.AllowedProtocols = RTSPSourceProtocol.TCP; // Prefer TCP var rtspSource = new RTSPSourceBlock(rtspSettings); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(rtspSource.VideoOutput, videoRenderer.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(rtspSource.AudioOutput, audioRenderer.Input); await pipeline.StartAsync(); ``` #### Sample applications - [RTSP Preview Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/RTSP%20Preview%20Demo) - [RTSP MultiViewSync Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/RTSP%20MultiViewSync%20Demo) #### Platforms Windows, macOS, Linux, iOS, Android. ### HTTP Source Block The HTTP source block allows data to be retrieved using HTTP/HTTPS protocols. It can be used to read data from MJPEG IP cameras, MP4 network files, or other sources. #### Block info Name: HTTPSourceBlock. | Pin direction | Media type | Pins count | |---------------|:------------:|:-----------:| | Output | Data | 1 | #### The sample pipeline The sample pipeline reads data from an MJPEG camera and displays it using VideoView. 
```mermaid graph LR; HTTPSourceBlock-->JPEGDecoderBlock; JPEGDecoderBlock-->VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var settings = new HTTPSourceSettings(new Uri("http://mjpegcamera:8080")) { UserID = "username", UserPassword = "password" }; var source = new HTTPSourceBlock(settings); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); var jpegDecoder = new JPEGDecoderBlock(); pipeline.Connect(source.Output, jpegDecoder.Input); pipeline.Connect(jpegDecoder.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux. ### HTTP MJPEG Source Block The HTTP MJPEG Source Block is specifically designed to connect to and decode MJPEG (Motion JPEG) video streams over HTTP/HTTPS. This is common for many IP cameras. #### Block info Name: HTTPMJPEGSourceBlock. | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; HTTPMJPEGSourceBlock-->VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Create settings for the HTTP MJPEG source var mjpegSettings = await HTTPMJPEGSourceSettings.CreateAsync( new Uri("http://your-mjpeg-camera-url/stream"), // Replace with your camera's MJPEG stream URL "username", // Optional: username for camera authentication "password" // Optional: password for camera authentication ); if (mjpegSettings == null) { Console.WriteLine("Failed to initialize HTTP MJPEG settings."); return; } mjpegSettings.CustomVideoFrameRate = new VideoFrameRate(25); // Optional: Set if camera doesn't report frame rate mjpegSettings.Latency = TimeSpan.FromMilliseconds(200); // Optional: Adjust latency var httpMjpegSource = new HTTPMJPEGSourceBlock(mjpegSettings); // Create video renderer var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 is your display control // 
Connect blocks
pipeline.Connect(httpMjpegSource.Output, videoRenderer.Input);

// Start pipeline
await pipeline.StartAsync();
```

#### Sample applications

- Similar to the HTTP MJPEG Source Demo mentioned under the generic HTTP Source Block.

#### Platforms

Windows, macOS, Linux.

### NDI Source Block

The NDI source block supports connection to NDI software sources and devices supporting the NDI protocol.

#### Block info

Name: NDISourceBlock.

| Pin direction | Media type | Pins count |
|-----------------|:--------------------:|:-----------:|
| Output audio | Uncompressed | 1 |
| Output video | Uncompressed | 1 |

#### The sample pipeline

```mermaid
graph LR;
NDISourceBlock-->VideoRendererBlock;
NDISourceBlock-->AudioRendererBlock;
```

#### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

// get NDI source info by enumerating sources
var ndiSources = await DeviceEnumerator.Shared.NDISourcesAsync();
var ndiSourceInfo = ndiSources[0];

// create NDI source settings (CreateAsync must be awaited)
var ndiSettings = await NDISourceSettings.CreateAsync(ndiSourceInfo);

var ndiSource = new NDISourceBlock(ndiSettings);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(ndiSource.VideoOutput, videoRenderer.Input);

var audioRenderer = new AudioRendererBlock();
pipeline.Connect(ndiSource.AudioOutput, audioRenderer.Input);

await pipeline.StartAsync();
```

#### Sample applications

- [NDI Source Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/NDI%20Source%20Demo)

#### Platforms

Windows, macOS, Linux.

### GenICam Source Block

The GenICam source supports connection to GigE and USB3 Vision cameras that support the GenICam protocol.

#### Block info

Name: GenICamSourceBlock.
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Output video | various | one or more | #### The sample pipeline ```mermaid graph LR; GenICamSourceBlock-->VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var sourceSettings = new GenICamSourceSettings(cbCamera.Text, new VisioForge.Core.Types.Rect(0, 0, 512, 512), 15, GenICamPixelFormat.Mono8); var source = new GenICamSourceBlock(sourceSettings); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(source.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` #### Sample applications - [GenICam Source Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/GenICam%20Source%20Demo) #### Prerequisites ##### macOS Install the `Aravis` package using Homebrew: ```bash brew install aravis ``` ##### Linux Install the `Aravis` package using the package manager: ```bash sudo apt-get install libaravis-0.8-dev ``` ##### Windows Install the `VisioForge.CrossPlatform.GenICam.Windows.x64` package to your project using NuGet. #### Platforms Windows, macOS, Linux ### SRT Source Block (with decoding) The `Secure Reliable Transport (SRT)` is an open-source video streaming protocol designed for secure and low-latency delivery over unpredictable networks, like the public internet. Developed by Haivision, SRT optimizes streaming performance by dynamically adapting to varying bandwidths and minimizing the effects of packet loss. It incorporates AES encryption for secure content transmission. Primarily used in broadcasting and online streaming, SRT is crucial for delivering high-quality video feeds in real-time applications, enhancing viewer experiences even in challenging network conditions. It supports point-to-point and multicast streaming, making it versatile for diverse setups. The SRT source block provides decoded video and audio streams from an SRT source. 
#### Block info Name: SRTSourceBlock. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Output video | Uncompressed | 0+ | | Output audio | Uncompressed | 0+ | #### Settings The `SRTSourceBlock` is configured using `SRTSourceSettings`. This class provides comprehensive options for SRT connections: - `Uri` (string): The SRT URI (e.g., "srt://127.0.0.1:8888" or "srt://example.com:9000?mode=listener"). Default is "srt://127.0.0.1:8888". - `Mode` (`SRTConnectionMode` enum): Specifies the SRT connection mode. Default is `Caller`. See `SRTConnectionMode` enum details below. - `Passphrase` (string): The password for encrypted transmission. - `PbKeyLen` (`SRTKeyLength` enum): The crypto key length for AES encryption. Default is `NoKey`. See `SRTKeyLength` enum details below. - `Latency` (`TimeSpan`): The maximum accepted transmission latency (receiver side for caller/listener, or for both in rendezvous). Default is 125 milliseconds. - `StreamId` (string): The stream ID for SRT access control. - `LocalAddress` (string): The local address to bind to when in `Listener` or `Rendezvous` mode. Default `null` (any). - `LocalPort` (uint): The local port to bind to when in `Listener` or `Rendezvous` mode. Default 7001. - `Authentication` (bool): Whether to authenticate the connection. Default `true`. - `AutoReconnect` (bool): Whether the source should attempt to reconnect if the connection fails. Default `true`. - `KeepListening` (bool): If `false` (default), the element will signal end-of-stream when the remote client disconnects (in listener mode). If `true`, it keeps waiting for reconnection. - `PollTimeout` (`TimeSpan`): Polling timeout used when an SRT poll is started. Default 1000 milliseconds. - `WaitForConnection` (bool): If `true` (default), blocks the stream until a client connects (in listener mode). The `SRTSourceSettings` can be initialized using `await SRTSourceSettings.CreateAsync(string uri, bool ignoreMediaInfoReader = false)`. 
Setting `ignoreMediaInfoReader` to `true` can be useful if media info reading fails for a live stream. ##### `SRTConnectionMode` Enum Defines the operational mode for an SRT connection: - `None` (0): No connection mode specified (should not typically be used directly). - `Caller` (1): The source initiates the connection to a listener. - `Listener` (2): The source waits for an incoming connection from a caller. - `Rendezvous` (3): Both ends initiate connection to each other simultaneously, useful for traversing firewalls. ##### `SRTKeyLength` Enum Defines the key length for SRT's AES encryption: - `NoKey` (0) / `Length0` (0): No encryption is used. - `Length16` (16): 16-byte (128-bit) AES encryption key. - `Length24` (24): 24-byte (192-bit) AES encryption key. - `Length32` (32): 32-byte (256-bit) AES encryption key. #### The sample pipeline ```mermaid graph LR; SRTSourceBlock-->VideoRendererBlock; SRTSourceBlock-->AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var source = new SRTSourceBlock(new SRTSourceSettings() { Uri = edURL.Text }); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(source.VideoOutput, videoRenderer.Input); pipeline.Connect(source.AudioOutput, audioRenderer.Input); await pipeline.StartAsync(); ``` #### Sample applications - [SRT Source Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/SRT%20Source%20Demo) #### Platforms Windows, macOS, Linux, iOS, Android. ### SRT RAW Source Block `The Secure Reliable Transport (SRT)` is a streaming protocol that optimizes video data delivery over unpredictable networks, like the Internet. It is open-source and designed to handle high-performance video and audio streaming. SRT provides security through end-to-end encryption, reliability by recovering lost packets, and low latency, which is suitable for live broadcasts. 
It adapts to varying network conditions by dynamically managing bandwidth, ensuring high-quality streams even under suboptimal conditions. Widely used in broadcasting and streaming applications, SRT supports interoperability and is ideal for remote production and content distribution. The SRT source supports connection to SRT sources and provides a data stream. You can connect this block to `DecodeBinBlock` to decode the stream. #### Block info Name: SRTRAWSourceBlock. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Output data | Any | one | #### Settings The `SRTRAWSourceBlock` is configured using `SRTSourceSettings`. Refer to the detailed description of `SRTSourceSettings` and its related enums (`SRTConnectionMode`, `SRTKeyLength`) under the `SRT Source Block (with decoding)` section for all available properties and their explanations. #### The sample pipeline ```mermaid graph LR; SRTRAWSourceBlock-->DecodeBinBlock; DecodeBinBlock-->VideoRendererBlock; DecodeBinBlock-->AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var source = new SRTRAWSourceBlock(new SRTSourceSettings() { Uri = edURL.Text }); var decodeBin = new DecodeBinBlock(); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(source.Output, decodeBin.Input); pipeline.Connect(decodeBin.VideoOutput, videoRenderer.Input); pipeline.Connect(decodeBin.AudioOutput, audioRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ## Other Source Blocks ### Screen Source Block Screen source supports recording video from the screen. You can select the display (if more than one), the part of the screen to be recorded, and optional mouse cursor recording. #### Settings The `ScreenSourceBlock` uses platform-specific settings classes. The choice of settings class determines the underlying screen capture technology. 
The `ScreenCaptureSourceType` enum indicates the available technologies:

##### Windows

- `ScreenCaptureDX9SourceSettings` - Use `DirectX 9` for screen recording. (`ScreenCaptureSourceType.DX9`)
- `ScreenCaptureD3D11SourceSettings` - Use `Direct3D 11` Desktop Duplication for screen recording. Allows specific window capture. (`ScreenCaptureSourceType.D3D11DesktopDuplication`)
- `ScreenCaptureGDISourceSettings` - Use `GDI` for screen recording. (`ScreenCaptureSourceType.GDI`)

##### macOS

`ScreenCaptureMacOSSourceSettings` - Use `AVFoundation` for screen recording. (`ScreenCaptureSourceType.AVFoundation`)

##### Linux

`ScreenCaptureXDisplaySourceSettings` - Use `X11` (XDisplay) for screen recording. (`ScreenCaptureSourceType.XDisplay`)

##### iOS

`IOSScreenSourceSettings` - Use `AVFoundation` for current window/app recording. (`ScreenCaptureSourceType.IOSScreen`)

#### Block info

Name: ScreenSourceBlock.

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Output video | uncompressed video | 1 |

#### The sample pipeline

```mermaid
graph LR;
ScreenSourceBlock-->H264EncoderBlock;
H264EncoderBlock-->MP4SinkBlock;
```

#### Sample code

```csharp
// create pipeline
var pipeline = new MediaBlocksPipeline();

// create source settings
var screenSourceSettings = new ScreenCaptureDX9SourceSettings() { FrameRate = 15 };

// create source block
var screenSourceBlock = new ScreenSourceBlock(screenSourceSettings);

// create video encoder block and connect it to the source block
var h264EncoderBlock = new H264EncoderBlock(new OpenH264EncoderSettings());
pipeline.Connect(screenSourceBlock.Output, h264EncoderBlock.Input);

// create MP4 sink block and connect it to the encoder block
var mp4SinkBlock = new MP4SinkBlock(new MP4SinkSettings(@"output.mp4"));
pipeline.Connect(h264EncoderBlock.Output, mp4SinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

// run pipeline
await pipeline.StartAsync();
```

#### [Windows] Window capture

You can capture a specific window by using the `ScreenCaptureD3D11SourceSettings` class.

```csharp
// create Direct3D11 source settings
var source = new ScreenCaptureD3D11SourceSettings();

// set frame rate
source.FrameRate = new VideoFrameRate(30);

// get handle of the window
var wih = new System.Windows.Interop.WindowInteropHelper(this);
source.WindowHandle = wih.Handle;

// create source block using the configured Direct3D11 settings
var screenSourceBlock = new ScreenSourceBlock(source);

// other code is the same as above
```

#### Sample applications

- [Screen Capture Demo (WPF)](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Screen%20Capture)
- [Screen Capture Demo (MAUI)](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/MAUI/ScreenCaptureMB)
- [Screen Capture Demo (iOS)](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/iOS/ScreenCapture)

#### Platforms

Windows, macOS, Linux, iOS.

### Virtual Video Source Block

VirtualVideoSourceBlock is used to produce test video data in a wide variety of video formats. The type of test data is controlled by the settings.

#### Settings

The `VirtualVideoSourceBlock` is configured using `VirtualVideoSourceSettings`. Key properties:

- `Pattern` (`VirtualVideoSourcePattern` enum): Specifies the type of test pattern to generate. See the `VirtualVideoSourcePattern` enum below for available patterns. Default is `SMPTE`.
- `Width` (int): Width of the output video (default 1280).
- `Height` (int): Height of the output video (default 720).
- `FrameRate` (`VideoFrameRate`): Frame rate of the output video (default 30fps).
- `Format` (`VideoFormatX` enum): Pixel format of the video (default `RGB`).
- `ForegroundColor` (`SKColor`): For patterns that use a foreground color (e.g., `SolidColor`), this property defines it (default `SKColors.White`).

Constructors:

- `VirtualVideoSourceSettings()`: Default constructor.
- `VirtualVideoSourceSettings(int width, int height, VideoFrameRate frameRate)`: Initializes with specified dimensions and frame rate. ##### `VirtualVideoSourcePattern` Enum Defines the test pattern generated by `VirtualVideoSourceBlock`: - `SMPTE` (0): SMPTE 100% color bars. - `Snow` (1): Random (television snow). - `Black` (2): 100% Black. - `White` (3): 100% White. - `Red` (4), `Green` (5), `Blue` (6): Solid colors. - `Checkers1` (7) to `Checkers8` (10): Checkerboard patterns with 1, 2, 4, or 8 pixel squares. - `Circular` (11): Circular pattern. - `Blink` (12): Blinking pattern. - `SMPTE75` (13): SMPTE 75% color bars. - `ZonePlate` (14): Zone plate. - `Gamut` (15): Gamut checkers. - `ChromaZonePlate` (16): Chroma zone plate. - `SolidColor` (17): A solid color, defined by `ForegroundColor`. - `Ball` (18): Moving ball. - `SMPTE100` (19): Alias for SMPTE 100% color bars. - `Bar` (20): Bar pattern. - `Pinwheel` (21): Pinwheel pattern. - `Spokes` (22): Spokes pattern. - `Gradient` (23): Gradient pattern. - `Colors` (24): Various colors pattern. - `SMPTERP219` (25): SMPTE test pattern, RP 219 conformant. #### Block info Name: VirtualVideoSourceBlock. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Output video | uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; VirtualVideoSourceBlock-->VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var audioSourceBlock = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); var videoSourceBlock = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings()); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(videoSourceBlock.Output, videoRenderer.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(audioSourceBlock.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. 
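#### Custom pattern example

Building on the properties and constructors listed above, the dimensioned constructor can be combined with a non-default pattern. This is a minimal sketch; it assumes `VideoView1` is your display control, as in the sample above.

```csharp
var pipeline = new MediaBlocksPipeline();

// configure a 640x360, 25 fps moving-ball test pattern
var settings = new VirtualVideoSourceSettings(640, 360, new VideoFrameRate(25))
{
    Pattern = VirtualVideoSourcePattern.Ball
};

var videoSource = new VirtualVideoSourceBlock(settings);
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(videoSource.Output, videoRenderer.Input);

await pipeline.StartAsync();
```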
### Virtual Audio Source Block VirtualAudioSourceBlock is used to produce test audio data in a wide variety of audio formats. The type of test data is controlled by the settings. #### Settings The `VirtualAudioSourceBlock` is configured using `VirtualAudioSourceSettings`. Key properties: - `Wave` (`VirtualAudioSourceSettingsWave` enum): Specifies the type of audio waveform to generate. See `VirtualAudioSourceSettingsWave` enum below. Default `Sine`. - `Format` (`AudioFormatX` enum): Audio sample format (default `S16LE`). - `SampleRate` (int): Sample rate in Hz (default 48000). - `Channels` (int): Number of audio channels (default 2). - `Volume` (double): Volume of the test signal (0.0 to 1.0, default 0.8). - `Frequency` (double): Frequency of the test signal in Hz (e.g., for Sine wave, default 440). - `IsLive` (bool): Indicates if the source is live (default `true`). - `ApplyTickRamp` (bool): Apply ramp to tick samples (default `false`). - `CanActivatePull` (bool): Can activate in pull mode (default `false`). - `CanActivatePush` (bool): Can activate in push mode (default `true`). - `MarkerTickPeriod` (uint): Make every Nth tick a marker tick (for `Ticks` wave, 0 = no marker, default 0). - `MarkerTickVolume` (double): Volume of marker ticks (default 1.0). - `SamplesPerBuffer` (int): Number of samples in each outgoing buffer (default 1024). - `SinePeriodsPerTick` (uint): Number of sine wave periods in one tick (for `Ticks` wave, default 10). - `TickInterval` (`TimeSpan`): Distance between start of current and start of next tick (default 1 second). - `TimestampOffset` (`TimeSpan`): An offset added to timestamps (default `TimeSpan.Zero`). Constructor: - `VirtualAudioSourceSettings(VirtualAudioSourceSettingsWave wave = VirtualAudioSourceSettingsWave.Ticks, int sampleRate = 48000, int channels = 2, AudioFormatX format = AudioFormatX.S16LE)` ##### `VirtualAudioSourceSettingsWave` Enum Defines the waveform for `VirtualAudioSourceBlock`: - `Sine` (0): Sine wave. 
- `Square` (1): Square wave. - `Saw` (2): Sawtooth wave. - `Triangle` (3): Triangle wave. - `Silence` (4): Silence. - `WhiteNoise` (5): White uniform noise. - `PinkNoise` (6): Pink noise. - `SineTable` (7): Sine table. - `Ticks` (8): Periodic Ticks. - `GaussianNoise` (9): White Gaussian noise. - `RedNoise` (10): Red (Brownian) noise. - `BlueNoise` (11): Blue noise. - `VioletNoise` (12): Violet noise. #### Block info Name: VirtualAudioSourceBlock. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Output audio | uncompressed audio | 1 | #### The sample pipeline ```mermaid graph LR; VirtualAudioSourceBlock-->AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var audioSourceBlock = new VirtualAudioSourceBlock(new VirtualAudioSourceSettings()); var videoSourceBlock = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings()); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(videoSourceBlock.Output, videoRenderer.Input); var audioRenderer = new AudioRendererBlock(); pipeline.Connect(audioSourceBlock.Output, audioRenderer.Input); await pipeline.StartAsync(); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### Demuxer Source Block The Demuxer Source Block is used to demultiplex local media files into their constituent elementary streams (video, audio, subtitles). It allows for selective rendering of these streams. #### Block info Name: DemuxerSourceBlock. 
| Pin direction | Media type | Pins count | |-----------------|:--------------------:|:-----------:| | Output video | Depends on file | 0 or 1 | | Output audio | Depends on file | 0 or 1+ | | Output subtitle | Depends on file | 0 or 1+ | #### The sample pipeline ```mermaid graph LR; DemuxerSourceBlock -- Video Stream --> VideoRendererBlock; DemuxerSourceBlock -- Audio Stream --> AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Create settings, ensure to await CreateAsync var demuxerSettings = await DemuxerSourceSettings.CreateAsync( "path/to/your/video.mp4", renderVideo: true, renderAudio: true, renderSubtitle: false); if (demuxerSettings == null) { Console.WriteLine("Failed to initialize demuxer settings. Ensure the file exists and is readable."); return; } var demuxerSource = new DemuxerSourceBlock(demuxerSettings); // Setup video rendering if video is available and rendered if (demuxerSettings.RenderVideo && demuxerSource.VideoOutput != null) { var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 is your display control pipeline.Connect(demuxerSource.VideoOutput, videoRenderer.Input); } // Setup audio rendering if audio is available and rendered if (demuxerSettings.RenderAudio && demuxerSource.AudioOutput != null) { var audioRenderer = new AudioRendererBlock(); pipeline.Connect(demuxerSource.AudioOutput, audioRenderer.Input); } // Start pipeline await pipeline.StartAsync(); ``` #### Sample applications - No specific sample application link, but can be used in player-like scenarios. #### Platforms Windows, macOS, Linux, iOS, Android. ### Image Video Source Block The Image Video Source Block generates a video stream from a static image file (e.g., JPG, PNG). It repeatedly outputs the image as video frames according to the specified frame rate. #### Block info Name: ImageVideoSourceBlock. 
| Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; ImageVideoSourceBlock-->VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Create image video source settings var imageSourceSettings = new ImageVideoSourceSettings("path/to/your/image.jpg"); // Replace with your image path imageSourceSettings.FrameRate = new VideoFrameRate(10); // Output 10 frames per second imageSourceSettings.IsLive = true; // Treat as a live source (optional) // imageSourceSettings.NumBuffers = 100; // Optional: output only 100 frames then stop var imageSource = new ImageVideoSourceBlock(imageSourceSettings); // Create video renderer var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 is your display control // Connect blocks pipeline.Connect(imageSource.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` #### Remarks This block uses SkiaSharp for image decoding, so ensure necessary dependencies are met if not using a standard VisioForge package that includes it. #### Platforms Windows, macOS, Linux, iOS, Android. ## Push Source Blocks Push Source blocks allow you to feed media data (video, audio, JPEG images, or generic data) directly into the Media Blocks pipeline from your application code. This is useful when your media originates from a custom source, such as a proprietary capture device, a network stream not supported by built-in blocks, or procedurally generated content. The behavior of push sources is generally controlled by common settings available through the `IPushSourceSettings` interface, implemented by specific push source settings classes: - `IsLive` (bool): Indicates if the source is live. Defaults vary by type (e.g., `true` for audio/video). 
- `DoTimestamp` (bool): If `true`, the block will attempt to generate timestamps for the pushed data. - `StreamType` (`PushSourceStreamType` enum: `Stream` or `SeekableStream`): Defines the stream characteristics. - `PushFormat` (`PushSourceFormat` enum: `Bytes`, `Time`, `Default`, `Automatic`): Controls how data is pushed (e.g., based on byte count or time). - `BlockPushData` (bool): If `true`, the push operation will block until the data is consumed by the pipeline. The specific type of push source is determined by the `PushSourceType` enum: `Video`, `Audio`, `Data`, `JPEG`. ### Push Video Source Block Allows pushing raw video frames into the pipeline. #### Block info Name: `PushSourceBlock` (configured for video). | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Output video | Uncompressed video | 1 | #### Settings Configured using `PushVideoSourceSettings`: - `Width` (int): Width of the video frames. - `Height` (int): Height of the video frames. - `FrameRate` (`VideoFrameRate`): Frame rate of the video. - `Format` (`VideoFormatX` enum): Pixel format of the video frames (e.g., `RGB`, `NV12`, `I420`). - Inherits common push settings like `IsLive` (defaults to `true`), `DoTimestamp`, `StreamType`, `PushFormat`, `BlockPushData`. 
Constructor: `PushVideoSourceSettings(int width, int height, VideoFrameRate frameRate, VideoFormatX format = VideoFormatX.RGB)` #### The sample pipeline ```mermaid graph LR; PushVideoSourceBlock-->VideoEncoderBlock-->MP4SinkBlock; PushVideoSourceBlock-->VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Configure push video source var videoPushSettings = new PushVideoSourceSettings( width: 640, height: 480, frameRate: new VideoFrameRate(30), format: VideoFormatX.RGB); // videoPushSettings.IsLive = true; // Default var videoPushSource = new PushSourceBlock(videoPushSettings); // Example: Render the pushed video var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(videoPushSource.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); // In a separate thread or task, push video frames: // byte[] frameData = ... ; // Your raw RGB frame data (640 * 480 * 3 bytes) // videoPushSource.PushFrame(frameData); // Call PushFrame repeatedly for each new video frame. ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### Push Audio Source Block Allows pushing raw audio samples into the pipeline. #### Block info Name: `PushSourceBlock` (configured for audio). | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Output audio | Uncompressed audio | 1 | #### Settings Configured using `PushAudioSourceSettings`: - `SampleRate` (int): Sample rate of the audio (e.g., 44100, 48000). - `Channels` (int): Number of audio channels (e.g., 1 for mono, 2 for stereo). - `Format` (`AudioFormatX` enum): Format of the audio samples (e.g., `S16LE` for 16-bit signed little-endian PCM). - Inherits common push settings like `IsLive` (defaults to `true`), `DoTimestamp`, `StreamType`, `PushFormat`, `BlockPushData`. 
Constructor: `PushAudioSourceSettings(bool isLive = true, int sampleRate = 48000, int channels = 2, AudioFormatX format = AudioFormatX.S16LE)` #### The sample pipeline ```mermaid graph LR; PushAudioSourceBlock-->AudioEncoderBlock-->MP4SinkBlock; PushAudioSourceBlock-->AudioRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Configure push audio source var audioPushSettings = new PushAudioSourceSettings( isLive: true, sampleRate: 44100, channels: 2, format: AudioFormatX.S16LE); var audioPushSource = new PushSourceBlock(audioPushSettings); // Example: Render the pushed audio var audioRenderer = new AudioRendererBlock(); pipeline.Connect(audioPushSource.Output, audioRenderer.Input); // Start pipeline await pipeline.StartAsync(); // In a separate thread or task, push audio samples: // byte[] audioData = ... ; // Your raw PCM S16LE audio data // audioPushSource.PushFrame(audioData); // Call PushFrame repeatedly for new audio data. ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### Push Data Source Block Allows pushing generic byte data into the pipeline. The interpretation of this data depends on the `Caps` (capabilities) specified. #### Block info Name: `PushSourceBlock` (configured for data). | Pin direction | Media type | Pins count | |---------------|:----------:|:----------:| | Output data | Custom | 1 | #### Settings Configured using `PushDataSourceSettings`: - `Caps` (`Gst.Caps`): GStreamer capabilities string describing the data format (e.g., "video/x-h264, stream-format=byte-stream"). This is crucial for downstream blocks to understand the data. - `PadMediaType` (`MediaBlockPadMediaType` enum): Specifies the type of the output pad (e.g., `Video`, `Audio`, `Data`, `Auto`). - Inherits common push settings like `IsLive`, `DoTimestamp`, `StreamType`, `PushFormat`, `BlockPushData`. 
#### The sample pipeline ```mermaid graph LR; PushDataSourceBlock-->ParserOrDecoder-->Renderer; ``` Example: Pushing H.264 Annex B byte stream ```mermaid graph LR; PushDataSourceBlock-->H264ParserBlock-->H264DecoderBlock-->VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Configure push data source for H.264 byte stream var dataPushSettings = new PushDataSourceSettings(); dataPushSettings.Caps = new Gst.Caps("video/x-h264, stream-format=(string)byte-stream"); dataPushSettings.PadMediaType = MediaBlockPadMediaType.Video; // dataPushSettings.IsLive = true; // Set if live var dataPushSource = new PushSourceBlock(dataPushSettings); // Example: Decode and render H.264 stream var h264Parser = new H264ParserBlock(); var h264Decoder = new H264DecoderBlock(); // Or OpenH264DecoderBlock, etc. var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(dataPushSource.Output, h264Parser.Input); pipeline.Connect(h264Parser.Output, h264Decoder.Input); pipeline.Connect(h264Decoder.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); // In a separate thread or task, push H.264 NALUs: // byte[] naluData = ... ; // Your H.264 NALU data // dataPushSource.PushFrame(naluData); ``` #### Platforms Windows, macOS, Linux, iOS, Android. ### Push JPEG Source Block Allows pushing individual JPEG images, which are then output as a video stream. #### Block info Name: `PushSourceBlock` (configured for JPEG). | Pin direction | Media type | Pins count | |---------------|:--------------------:|:----------:| | Output video | Uncompressed video | 1 | #### Settings Configured using `PushJPEGSourceSettings`: - `Width` (int): Width of the decoded JPEG images. - `Height` (int): Height of the decoded JPEG images. - `FrameRate` (`VideoFrameRate`): The frame rate at which the JPEG images will be presented as a video stream. 
- Inherits common push settings like `IsLive` (defaults to `true`), `DoTimestamp`, `StreamType`, `PushFormat`, `BlockPushData`. Constructor: `PushJPEGSourceSettings(int width, int height, VideoFrameRate frameRate)` #### The sample pipeline ```mermaid graph LR; PushJPEGSourceBlock-->VideoRendererBlock; PushJPEGSourceBlock-->VideoEncoderBlock-->MP4SinkBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Configure push JPEG source var jpegPushSettings = new PushJPEGSourceSettings( width: 1280, height: 720, frameRate: new VideoFrameRate(10)); // Present JPEGs as a 10 FPS video var jpegPushSource = new PushSourceBlock(jpegPushSettings); // Example: Render the video stream from JPEGs var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(jpegPushSource.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); // In a separate thread or task, push JPEG image data: // byte[] jpegImageData = File.ReadAllBytes("image.jpg"); // jpegPushSource.PushFrame(jpegImageData); // Call PushFrame for each new JPEG image. ``` #### Platforms Windows, macOS, Linux, iOS, Android. ## Apple Platform Source Blocks ### iOS Video Source Block iOSVideoSourceBlock provides video capture from the device camera on iOS platforms. It is available only on iOS (not macOS Catalyst). #### Block info Name: IOSVideoSourceBlock. | Pin direction | Media type | Pins count | |---------------|:------------------:|:----------:| | Output video | Uncompressed video | 1 | #### Enumerate available devices Use `DeviceEnumerator.Shared.VideoSourcesAsync()` to get a list of available video devices on iOS. Each device is represented by a `VideoCaptureDeviceInfo` object. 
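As a quick sketch, you can inspect what the enumerator returns before wiring up the pipeline. The `VideoFormats` and `FrameRateList` members follow the sample code below; the `DisplayName` property is an assumption and may differ in your SDK version.

```csharp
// Sketch: list detected cameras and their supported formats.
var devices = await DeviceEnumerator.Shared.VideoSourcesAsync();
foreach (var device in devices)
{
    Console.WriteLine(device.DisplayName); // assumed property name

    foreach (var format in device.VideoFormats)
    {
        Console.WriteLine($"  {format} @ {string.Join(", ", format.FrameRateList)}");
    }
}
```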
#### The sample pipeline ```mermaid graph LR; IOSVideoSourceBlock-->VideoRendererBlock; ``` #### Sample code ```csharp // create pipeline var pipeline = new MediaBlocksPipeline(); // select the first available video device var device = (await DeviceEnumerator.Shared.VideoSourcesAsync())[0]; VideoCaptureDeviceSourceSettings videoSourceSettings = null; if (device != null) { var formatItem = device.VideoFormats[0]; if (formatItem != null) { videoSourceSettings = new VideoCaptureDeviceSourceSettings(device) { Format = formatItem.ToFormat() }; videoSourceSettings.Format.FrameRate = formatItem.FrameRateList[0]; } } // create iOS video source block var videoSource = new IOSVideoSourceBlock(videoSourceSettings); // create video renderer block var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // connect blocks pipeline.Connect(videoSource.Output, videoRenderer.Input); // start pipeline await pipeline.StartAsync(); ``` #### Platforms iOS (not available on macOS Catalyst) --- ### macOS Audio Source Block OSXAudioSourceBlock provides audio capture from input devices on macOS platforms. #### Block info Name: OSXAudioSourceBlock. | Pin direction | Media type | Pins count | |---------------|:------------------:|:----------:| | Output audio | Uncompressed audio | 1 | #### Enumerate available devices Use `DeviceEnumerator.Shared.AudioSourcesAsync()` to get a list of available audio devices on macOS. Each device is represented by an `AudioCaptureDeviceInfo` object. #### The sample pipeline ```mermaid graph LR; OSXAudioSourceBlock-->AudioRendererBlock; ``` #### Sample code ```csharp // create pipeline var pipeline = new MediaBlocksPipeline(); // select the first available audio device var devices = await DeviceEnumerator.Shared.AudioSourcesAsync(); var device = devices.Length > 0 ? 
devices[0] : null; OSXAudioSourceSettings audioSourceSettings = null; if (device != null) { var formatItem = device.Formats[0]; if (formatItem != null) { audioSourceSettings = new OSXAudioSourceSettings(device.DeviceID, formatItem); } } // create macOS audio source block var audioSource = new OSXAudioSourceBlock(audioSourceSettings); // create audio renderer block var audioRenderer = new AudioRendererBlock(); // connect blocks pipeline.Connect(audioSource.Output, audioRenderer.Input); // start pipeline await pipeline.StartAsync(); ``` #### Platforms macOS (not available on iOS) ---END OF PAGE--- # Local File: .\dotnet\mediablocks\Special\index.md --- title: Special .Net Media Blocks & Customization description: Discover special media blocks like Null Renderer, Tee, and Super MediaBlock in the VisioForge Media Blocks SDK for .Net. Learn to customize media pipelines with advanced settings for encryption, custom GStreamer elements, and input source switching. sidebar_label: Special Blocks --- # Special blocks [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) ## Introduction Special blocks are blocks that do not fit into any other category. ## Null Renderer The null renderer block sends the data to null. This block may be required if your block has outputs you do not want to use. ### Block info Name: NullRendererBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Any | 1 ### The sample pipeline The sample pipeline is shown below. It reads a file and sends the video data to the video samples grabber, where you can grab each video frame after decoding. The Null renderer block is used to end the pipeline. 
```mermaid
graph LR;
UniversalSourceBlock-->VideoSampleGrabberBlock;
VideoSampleGrabberBlock-->NullRendererBlock;
```

### Sample code

```csharp
private async Task StartAsync()
{
    // create the pipeline
    var pipeline = new MediaBlocksPipeline();

    // create universal source block
    var filename = "test.mp4";
    var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

    // create video sample grabber block and add the event handler
    var sampleGrabber = new VideoSampleGrabberBlock();
    sampleGrabber.OnVideoFrameBuffer += sampleGrabber_OnVideoFrameBuffer;

    // create null renderer block
    var nullRenderer = new NullRendererBlock();

    // connect blocks
    pipeline.Connect(fileSource.VideoOutput, sampleGrabber.Input);
    pipeline.Connect(sampleGrabber.Output, nullRenderer.Input);

    // start the pipeline
    await pipeline.StartAsync();
}

private void sampleGrabber_OnVideoFrameBuffer(object sender, VideoFrameXBufferEventArgs e)
{
    // received new video frame
}
```

### Platforms

Windows, macOS, Linux, iOS, Android.

## Tee

The tee block splits the video or audio data stream into multiple streams that completely copy the original stream.

### Block info

Name: TeeBlock.
Pin direction | Media type | Pins count --- | :---: | :---: Input | Any | 1 Output | Same as input | 2 or more ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->TeeBlock; TeeBlock-->VideoRendererBlock; TeeBlock-->H264EncoderBlock; H264EncoderBlock-->MP4SinkBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var videoTee = new TeeBlock(2); var h264Encoder = new H264EncoderBlock(new OpenH264EncoderSettings()); var mp4Muxer = new MP4SinkBlock(new MP4SinkSettings(@"output.mp4")); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(fileSource.VideoOutput, videoTee.Input); pipeline.Connect(videoTee.Outputs[0], videoRenderer.Input); pipeline.Connect(videoTee.Outputs[1], h264Encoder.Input); pipeline.Connect(h264Encoder.Output, mp4Muxer.CreateNewInput(MediaBlockPadMediaType.Video)); await pipeline.StartAsync(); ``` #### Sample applications - [Simple Capture Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Simple%20Capture%20Demo) ### Platforms Windows, macOS, Linux, iOS, Android. ## Super MediaBlock The Super MediaBlock allows you to combine multiple blocks into a single block. ### Block info Name: SuperMediaBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | Any | 1 Output | Any | 1 ### The sample pipeline ```mermaid graph LR; VirtualVideoSourceBlock-->SuperMediaBlock; SuperMediaBlock-->NullRendererBlock; ``` Inside the SuperMediaBlock: ```mermaid graph LR; FishEyeBlock-->ColorEffectsBlock; ``` Final pipeline: ```mermaid graph LR; VirtualVideoSourceBlock-->FishEyeBlock; subgraph SuperMediaBlock FishEyeBlock-->ColorEffectsBlock; end ColorEffectsBlock-->NullRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var videoViewBlock = new VideoRendererBlock(pipeline, VideoView1); var videoSource = new VirtualVideoSourceBlock(new VirtualVideoSourceSettings()); var colorEffectsBlock = new ColorEffectsBlock(VisioForge.Core.Types.X.VideoEffects.ColorEffectsPreset.Sepia); var fishEyeBlock = new FishEyeBlock(); var superBlock = new SuperMediaBlock(); superBlock.Blocks.Add(fishEyeBlock); superBlock.Blocks.Add(colorEffectsBlock); superBlock.Configure(pipeline); pipeline.Connect(videoSource.Output, superBlock.Input); pipeline.Connect(superBlock.Output, videoViewBlock.Input); await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux, iOS, Android. ## AESCipher The `AESCipher` enum defines the types of AES ciphers that can be used. (Source: `VisioForge.Core/Types/X/Special/AESCipher.cs`) ### Enum Values - `AES_128`: AES 128-bit cipher key using CBC method. - `AES_256`: AES 256-bit cipher key using CBC method. ### Platforms Windows, macOS, Linux, iOS, Android. ## EncryptorDecryptorSettings The `EncryptorDecryptorSettings` class holds settings for encryption and decryption operations. (Source: `VisioForge.Core/Types/X/Special/EncryptorDecryptorSettings.cs`) ### Properties - `Cipher` (`AESCipher`): Gets or sets the AES cipher type. Defaults to `AES_128`. - `Key` (`string`): Gets or sets the encryption key. - `IV` (`string`): Gets or sets the initialization vector (16 bytes as hex). 
- `SerializeIV` (`bool`): Gets or sets a value indicating whether to serialize the IV.

### Constructor

- `EncryptorDecryptorSettings(string key, string iv)`: Initializes a new instance with the given key and initialization vector.

### Platforms

Windows, macOS, Linux, iOS, Android.

## CustomMediaBlockPad

The `CustomMediaBlockPad` class defines information for a pad within a `CustomMediaBlock`. (Source: `VisioForge.Core/Types/X/Special/CustomMediaBlockPad.cs`)

### Properties

- `Direction` (`MediaBlockPadDirection`): Gets or sets the pad direction (input/output).
- `MediaType` (`MediaBlockPadMediaType`): Gets or sets the media type of the pad (e.g., Audio, Video).
- `CustomCaps` (`Gst.Caps`): Gets or sets custom GStreamer capabilities for an output pad.

### Constructor

- `CustomMediaBlockPad(MediaBlockPadDirection direction, MediaBlockPadMediaType mediaType)`: Initializes a new instance with the specified direction and media type.

### Platforms

Windows, macOS, Linux, iOS, Android.

## CustomMediaBlockSettings

The `CustomMediaBlockSettings` class provides settings for configuring a custom media block, potentially wrapping GStreamer elements. (Source: `VisioForge.Core/Types/X/Special/CustomMediaBlockSettings.cs`)

### Properties

- `ElementName` (`string`): Gets the name of the GStreamer element or Media Blocks SDK element. To create a custom GStreamer Bin, include square brackets, e.g., `"[ videotestsrc ! videoconvert ]"`.
- `UsePadAddedEvent` (`bool`): Gets or sets a value indicating whether to use the `pad-added` event for dynamically created GStreamer pads.
- `ElementParams` (`Dictionary`): Gets the parameters for the element.
- `Pads` (`List<CustomMediaBlockPad>`): Gets the list of `CustomMediaBlockPad` definitions for the block.
- `ListProperties` (`bool`): Gets or sets a value indicating whether to list element properties to the Debug window after creation. Defaults to `false`.
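As an illustrative sketch of how these properties fit together, a custom block wrapping a single GStreamer element could be configured as follows. The `CustomMediaBlock` class usage, the `MediaBlockPadDirection.Input`/`Output` values, and the `videoflip` element parameter are assumptions, not confirmed API; verify them against your SDK version.

```csharp
// Hypothetical sketch: wrap the GStreamer "videoflip" element in a custom block.
var settings = new CustomMediaBlockSettings("videoflip");
settings.ElementParams.Add("method", 1); // assumed: 1 = rotate 90 degrees clockwise
settings.Pads.Add(new CustomMediaBlockPad(MediaBlockPadDirection.Input, MediaBlockPadMediaType.Video));
settings.Pads.Add(new CustomMediaBlockPad(MediaBlockPadDirection.Output, MediaBlockPadMediaType.Video));

// connect the resulting block like any other block in the pipeline
var customBlock = new CustomMediaBlock(settings);
```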
### Constructor

- `CustomMediaBlockSettings(string elementName)`: Initializes a new instance with the specified element name.

### Platforms

Windows, macOS, Linux, iOS, Android.

## InputSelectorSyncMode

The `InputSelectorSyncMode` enum defines how an input-selector (used by `SourceSwitchSettings`) synchronizes buffers when in `sync-streams` mode. (Source: `VisioForge.Core/Types/X/Special/SourceSwitchSettings.cs`)

### Enum Values

- `ActiveSegment` (0): Sync using the current active segment.
- `Clock` (1): Sync using the clock.

### Platforms

Windows, macOS, Linux, iOS, Android.

## SourceSwitchSettings

The `SourceSwitchSettings` class configures a block that can switch between multiple input sources. (Source: `VisioForge.Core/Types/X/Special/SourceSwitchSettings.cs`)

### Properties

- `PadsCount` (`int`): Gets or sets the number of input pads. Defaults to `2`.
- `DefaultActivePad` (`int`): Gets or sets the initially active sink pad.
- `CacheBuffers` (`bool`): Gets or sets whether the active pad caches buffers to avoid missing frames when reactivated. Defaults to `false`.
- `DropBackwards` (`bool`): Gets or sets whether to drop buffers that go backwards relative to the last output buffer pre-switch. Defaults to `false`.
- `SyncMode` (`InputSelectorSyncMode`): Gets or sets how the input-selector syncs buffers in `sync-streams` mode. Defaults to `InputSelectorSyncMode.ActiveSegment`.
- `SyncStreams` (`bool`): Gets or sets whether all inactive streams are synced to the running time of the active stream or to the current clock. Defaults to `true`.
- `CustomName` (`string`): Gets or sets a custom name for logging purposes. Defaults to `"SourceSwitch"`.
### Constructor - `SourceSwitchSettings(int padsCount = 2)`: Initializes a new instance, optionally specifying the number of pads. ### Platforms Windows, macOS, Linux, iOS, Android. ---END OF PAGE--- # Local File: .\dotnet\mediablocks\VideoDecoders\index.md --- title: .Net Media Video Decoder Blocks Guide description: Explore a complete guide to .Net Media SDK video decoder blocks. Learn about various video decoders for your media processing pipelines. sidebar_label: Video Decoders --- # Video Decoder Blocks - VisioForge Media Blocks SDK .Net [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) Video Decoder blocks are essential components in a media pipeline, responsible for decompressing encoded video streams into raw video frames that can be further processed or rendered. VisioForge Media Blocks SDK .Net offers a variety of video decoder blocks supporting numerous codecs and hardware acceleration technologies. ## Available Video Decoder Blocks ### H264 Decoder Block Decodes H.264 (AVC) video streams. This is one of the most widely used video compression standards. The block can utilize different underlying decoder implementations like FFMPEG, OpenH264, or hardware-accelerated decoders if available. #### Block info Name: `H264DecoderBlock`. | Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | H.264 encoded video | 1 | | Output video | Uncompressed video | 1 | #### Settings The `H264DecoderBlock` is configured using settings that implement `IH264DecoderSettings`. Available settings classes include: - `FFMPEGH264DecoderSettings` - `OpenH264DecoderSettings` - `NVH264DecoderSettings` (for NVIDIA GPU acceleration) - `VAAPIH264DecoderSettings` (for VA-API acceleration on Linux) A constructor without parameters will attempt to select an available decoder automatically. 
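To pin a specific implementation instead of relying on automatic selection, one of the settings classes above can be passed in explicitly. This sketch assumes a `H264DecoderBlock` constructor overload that accepts an `IH264DecoderSettings` instance:

```csharp
// Sketch: prefer the NVIDIA hardware decoder, fall back to OpenH264 on the CPU.
IH264DecoderSettings settings;
if (H264Decoder.IsAvailable(H264DecoderType.GPU_Nvidia_H264))
{
    settings = new NVH264DecoderSettings();
}
else
{
    settings = new OpenH264DecoderSettings();
}

var h264Decoder = new H264DecoderBlock(settings);
```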
#### The sample pipeline ```mermaid graph LR; BasicFileSourceBlock -- Raw Data --> UniversalDemuxBlock; UniversalDemuxBlock -- H.264 Video Stream --> H264DecoderBlock; H264DecoderBlock -- Decoded Video --> VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Create H264 Decoder block var h264Decoder = new H264DecoderBlock(); // Example: Create a basic file source, demuxer, and renderer var basicFileSource = new BasicFileSourceBlock("test_h264.mp4"); // You'll need MediaFileInfo, typically obtained using MediaInfoReader // Assuming MediaInfoReader.GetMediaInfoAsync is available: var mediaInfo = await MediaInfoReader.GetMediaInfoAsync("test_h264.mp4"); if (mediaInfo == null) { Console.WriteLine("Failed to get media info."); return; } var universalDemux = new UniversalDemuxBlock(mediaInfo, renderVideo: true, renderAudio: false); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 // Connect blocks pipeline.Connect(basicFileSource.Output, universalDemux.Input); pipeline.Connect(universalDemux.GetVideoOutput(), h264Decoder.Input); pipeline.Connect(h264Decoder.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` #### Availability You can check for specific decoder implementations using `H264Decoder.IsAvailable(H264DecoderType decoderType)`. `H264DecoderType` includes `FFMPEG`, `OpenH264`, `GPU_Nvidia_H264`, `VAAPI_H264`, etc. #### Platforms Windows, macOS, Linux. (Hardware-specific decoders like NVH264Decoder require specific hardware and drivers). ### JPEG Decoder Block Decodes JPEG (Motion JPEG) video streams or individual JPEG images into raw video frames. #### Block info Name: `JPEGDecoderBlock`. 
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | JPEG encoded video/images | 1 | | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; HTTPSourceBlock -- MJPEG Stream --> JPEGDecoderBlock; JPEGDecoderBlock -- Decoded Video --> VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Create JPEG Decoder block var jpegDecoder = new JPEGDecoderBlock(); // Example: Create an HTTP source for an MJPEG camera and a video renderer var httpSettings = new HTTPSourceSettings(new Uri("http://your-mjpeg-camera/stream")); var httpSource = new HTTPSourceBlock(httpSettings); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Assuming VideoView1 // Connect blocks pipeline.Connect(httpSource.Output, jpegDecoder.Input); pipeline.Connect(jpegDecoder.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` #### Availability You can check if the underlying NVIDIA JPEG decoder (if applicable) is available using `NVJPEGDecoder.IsAvailable()`. The generic JPEG decoder functionality is generally available. #### Platforms Windows, macOS, Linux. (NVIDIA specific implementation requires NVIDIA hardware). ### NVIDIA H.264 Decoder Block (NVH264DecoderBlock) Provides hardware-accelerated decoding of H.264 (AVC) video streams using NVIDIA's NVDEC technology. This offers high performance and efficiency on systems with compatible NVIDIA GPUs. #### Block info Name: `NVH264DecoderBlock`. 
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | H.264 encoded video | 1 | | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; BasicFileSourceBlock -- Raw Data --> UniversalDemuxBlock; UniversalDemuxBlock -- H.264 Video Stream --> NVH264DecoderBlock; NVH264DecoderBlock -- Decoded Video --> VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Create NVIDIA H.264 Decoder block var nvH264Decoder = new NVH264DecoderBlock(); // Example: Create a basic file source, demuxer, and renderer var basicFileSource = new BasicFileSourceBlock("test_h264.mp4"); var mediaInfo = await MediaInfoReader.GetMediaInfoAsync("test_h264.mp4"); if (mediaInfo == null) { Console.WriteLine("Failed to get media info."); return; } var universalDemux = new UniversalDemuxBlock(mediaInfo, renderVideo: true, renderAudio: false); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Connect blocks pipeline.Connect(basicFileSource.Output, universalDemux.Input); pipeline.Connect(universalDemux.GetVideoOutput(), nvH264Decoder.Input); pipeline.Connect(nvH264Decoder.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` #### Availability Check availability using `NVH264Decoder.IsAvailable()`. Requires an NVIDIA GPU that supports NVDEC and appropriate drivers. #### Platforms Windows, Linux (with NVIDIA drivers). ### NVIDIA H.265 Decoder Block (NVH265DecoderBlock) Provides hardware-accelerated decoding of H.265 (HEVC) video streams using NVIDIA's NVDEC technology. H.265 offers better compression efficiency than H.264. #### Block info Name: `NVH265DecoderBlock`. 
| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input video | H.265/HEVC encoded video | 1 |
| Output video | Uncompressed video | 1 |

#### The sample pipeline

```mermaid
graph LR;
BasicFileSourceBlock -- Raw Data --> UniversalDemuxBlock;
UniversalDemuxBlock -- H.265 Video Stream --> NVH265DecoderBlock;
NVH265DecoderBlock -- Decoded Video --> VideoRendererBlock;
```

#### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

// Create NVIDIA H.265 Decoder block
var nvH265Decoder = new NVH265DecoderBlock();

// Example: Create a basic file source, demuxer, and renderer
var basicFileSource = new BasicFileSourceBlock("test_h265.mp4");
var mediaInfo = await MediaInfoReader.GetMediaInfoAsync("test_h265.mp4");
if (mediaInfo == null)
{
    Console.WriteLine("Failed to get media info.");
    return;
}

var universalDemux = new UniversalDemuxBlock(mediaInfo, renderVideo: true, renderAudio: false);
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);

// Connect blocks
pipeline.Connect(basicFileSource.Output, universalDemux.Input);
pipeline.Connect(universalDemux.GetVideoOutput(), nvH265Decoder.Input);
pipeline.Connect(nvH265Decoder.Output, videoRenderer.Input);

// Start pipeline
await pipeline.StartAsync();
```

#### Availability

Check availability using `NVH265Decoder.IsAvailable()`. Requires an NVIDIA GPU that supports NVDEC for H.265 and appropriate drivers.

#### Platforms

Windows, Linux (with NVIDIA drivers).

### NVIDIA JPEG Decoder Block (NVJPEGDecoderBlock)

Provides hardware-accelerated decoding of JPEG images or Motion JPEG (MJPEG) streams using NVIDIA's NVJPEG library. This is particularly useful for high-resolution or high-framerate MJPEG streams.

#### Block info

Name: `NVJPEGDecoderBlock`.
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | JPEG encoded video/images | 1 | | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; BasicFileSourceBlock -- Raw MJPEG Data --> UniversalDemuxBlock; UniversalDemuxBlock -- JPEG Video Stream --> NVJPEGDecoderBlock; NVJPEGDecoderBlock -- Decoded Video --> VideoRendererBlock; ``` For live MJPEG streams, `HTTPSourceBlock --> NVJPEGDecoderBlock` is more typical. #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Create NVIDIA JPEG Decoder block var nvJpegDecoder = new NVJPEGDecoderBlock(); // Example: Create a basic file source for an MJPEG file, demuxer, and renderer // Ensure "test.mjpg" contains a Motion JPEG stream. var basicFileSource = new BasicFileSourceBlock("test.mjpg"); var mediaInfo = await MediaInfoReader.GetMediaInfoAsync("test.mjpg"); if (mediaInfo == null || mediaInfo.VideoStreams.Count == 0 || !mediaInfo.VideoStreams[0].Codec.Contains("jpeg")) { Console.WriteLine("Failed to get MJPEG media info or not an MJPEG file."); return; } var universalDemux = new UniversalDemuxBlock(mediaInfo, renderVideo: true, renderAudio: false); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Connect blocks pipeline.Connect(basicFileSource.Output, universalDemux.Input); pipeline.Connect(universalDemux.GetVideoOutput(), nvJpegDecoder.Input); pipeline.Connect(nvJpegDecoder.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` #### Availability Check availability using `NVJPEGDecoder.IsAvailable()`. Requires an NVIDIA GPU and appropriate drivers. #### Platforms Windows, Linux (with NVIDIA drivers). ### NVIDIA MPEG-1 Decoder Block (NVMPEG1DecoderBlock) Provides hardware-accelerated decoding of MPEG-1 video streams using NVIDIA's NVDEC technology. #### Block info Name: `NVMPEG1DecoderBlock`. 
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | MPEG-1 encoded video | 1 | | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; BasicFileSourceBlock -- Raw Data --> UniversalDemuxBlock; UniversalDemuxBlock -- MPEG-1 Video Stream --> NVMPEG1DecoderBlock; NVMPEG1DecoderBlock -- Decoded Video --> VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Create NVIDIA MPEG-1 Decoder block var nvMpeg1Decoder = new NVMPEG1DecoderBlock(); // Example: Create a basic file source, demuxer, and renderer var basicFileSource = new BasicFileSourceBlock("test_mpeg1.mpg"); var mediaInfo = await MediaInfoReader.GetMediaInfoAsync("test_mpeg1.mpg"); if (mediaInfo == null) { Console.WriteLine("Failed to get media info."); return; } var universalDemux = new UniversalDemuxBlock(mediaInfo, renderVideo: true, renderAudio: false); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Connect blocks pipeline.Connect(basicFileSource.Output, universalDemux.Input); pipeline.Connect(universalDemux.GetVideoOutput(), nvMpeg1Decoder.Input); pipeline.Connect(nvMpeg1Decoder.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` #### Availability Check availability using `NVMPEG1Decoder.IsAvailable()`. Requires an NVIDIA GPU that supports NVDEC for MPEG-1 and appropriate drivers. #### Platforms Windows, Linux (with NVIDIA drivers). ### NVIDIA MPEG-2 Decoder Block (NVMPEG2DecoderBlock) Provides hardware-accelerated decoding of MPEG-2 video streams using NVIDIA's NVDEC technology. Commonly used for DVD video and some digital television broadcasts. #### Block info Name: `NVMPEG2DecoderBlock`. 
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | MPEG-2 encoded video | 1 | | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; BasicFileSourceBlock -- Raw Data --> UniversalDemuxBlock; UniversalDemuxBlock -- MPEG-2 Video Stream --> NVMPEG2DecoderBlock; NVMPEG2DecoderBlock -- Decoded Video --> VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Create NVIDIA MPEG-2 Decoder block var nvMpeg2Decoder = new NVMPEG2DecoderBlock(); // Example: Create a basic file source, demuxer, and renderer var basicFileSource = new BasicFileSourceBlock("test_mpeg2.mpg"); var mediaInfo = await MediaInfoReader.GetMediaInfoAsync("test_mpeg2.mpg"); if (mediaInfo == null) { Console.WriteLine("Failed to get media info."); return; } var universalDemux = new UniversalDemuxBlock(mediaInfo, renderVideo: true, renderAudio: false); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Connect blocks pipeline.Connect(basicFileSource.Output, universalDemux.Input); pipeline.Connect(universalDemux.GetVideoOutput(), nvMpeg2Decoder.Input); pipeline.Connect(nvMpeg2Decoder.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` #### Availability Check availability using `NVMPEG2Decoder.IsAvailable()`. Requires an NVIDIA GPU that supports NVDEC for MPEG-2 and appropriate drivers. #### Platforms Windows, Linux (with NVIDIA drivers). ### NVIDIA MPEG-4 Decoder Block (NVMPEG4DecoderBlock) Provides hardware-accelerated decoding of MPEG-4 Part 2 video streams (often found in AVI files, e.g., DivX/Xvid) using NVIDIA's NVDEC technology. Note that this is different from MPEG-4 Part 10 (H.264/AVC). #### Block info Name: `NVMPEG4DecoderBlock`. 
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | MPEG-4 Part 2 encoded video | 1 | | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; BasicFileSourceBlock -- Raw Data --> UniversalDemuxBlock; UniversalDemuxBlock -- MPEG-4 Video Stream --> NVMPEG4DecoderBlock; NVMPEG4DecoderBlock -- Decoded Video --> VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Create NVIDIA MPEG-4 Decoder block var nvMpeg4Decoder = new NVMPEG4DecoderBlock(); // Example: Create a basic file source, demuxer, and renderer var basicFileSource = new BasicFileSourceBlock("test_mpeg4.avi"); var mediaInfo = await MediaInfoReader.GetMediaInfoAsync("test_mpeg4.avi"); if (mediaInfo == null) { Console.WriteLine("Failed to get media info."); return; } var universalDemux = new UniversalDemuxBlock(mediaInfo, renderVideo: true, renderAudio: false); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Connect blocks pipeline.Connect(basicFileSource.Output, universalDemux.Input); pipeline.Connect(universalDemux.GetVideoOutput(), nvMpeg4Decoder.Input); pipeline.Connect(nvMpeg4Decoder.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` #### Availability Check availability using `NVMPEG4Decoder.IsAvailable()`. Requires an NVIDIA GPU that supports NVDEC for MPEG-4 Part 2 and appropriate drivers. #### Platforms Windows, Linux (with NVIDIA drivers). ### NVIDIA VP8 Decoder Block (NVVP8DecoderBlock) Provides hardware-accelerated decoding of VP8 video streams using NVIDIA's NVDEC technology. VP8 is an open video format, often used with WebM. #### Block info Name: `NVVP8DecoderBlock`. 
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | VP8 encoded video | 1 | | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; BasicFileSourceBlock -- Raw Data --> UniversalDemuxBlock; UniversalDemuxBlock -- VP8 Video Stream --> NVVP8DecoderBlock; NVVP8DecoderBlock -- Decoded Video --> VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Create NVIDIA VP8 Decoder block var nvVp8Decoder = new NVVP8DecoderBlock(); // Example: Create a basic file source, demuxer, and renderer var basicFileSource = new BasicFileSourceBlock("test_vp8.webm"); var mediaInfo = await MediaInfoReader.GetMediaInfoAsync("test_vp8.webm"); if (mediaInfo == null) { Console.WriteLine("Failed to get media info."); return; } var universalDemux = new UniversalDemuxBlock(mediaInfo, renderVideo: true, renderAudio: false); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Connect blocks pipeline.Connect(basicFileSource.Output, universalDemux.Input); pipeline.Connect(universalDemux.GetVideoOutput(), nvVp8Decoder.Input); pipeline.Connect(nvVp8Decoder.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` #### Availability Check availability using `NVVP8Decoder.IsAvailable()`. Requires an NVIDIA GPU that supports NVDEC for VP8 and appropriate drivers. #### Platforms Windows, Linux (with NVIDIA drivers). ### NVIDIA VP9 Decoder Block (NVVP9DecoderBlock) Provides hardware-accelerated decoding of VP9 video streams using NVIDIA's NVDEC technology. VP9 is an open and royalty-free video coding format developed by Google, often used for web streaming (e.g., YouTube). #### Block info Name: `NVVP9DecoderBlock`. 
| Pin direction | Media type | Pins count | | --- | :---: | :---: | | Input video | VP9 encoded video | 1 | | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; BasicFileSourceBlock -- Raw Data --> UniversalDemuxBlock; UniversalDemuxBlock -- VP9 Video Stream --> NVVP9DecoderBlock; NVVP9DecoderBlock -- Decoded Video --> VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Create NVIDIA VP9 Decoder block var nvVp9Decoder = new NVVP9DecoderBlock(); // Example: Create a basic file source, demuxer, and renderer var basicFileSource = new BasicFileSourceBlock("test_vp9.webm"); var mediaInfo = await MediaInfoReader.GetMediaInfoAsync("test_vp9.webm"); if (mediaInfo == null) { Console.WriteLine("Failed to get media info."); return; } var universalDemux = new UniversalDemuxBlock(mediaInfo, renderVideo: true, renderAudio: false); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); // Connect blocks pipeline.Connect(basicFileSource.Output, universalDemux.Input); pipeline.Connect(universalDemux.GetVideoOutput(), nvVp9Decoder.Input); pipeline.Connect(nvVp9Decoder.Output, videoRenderer.Input); // Start pipeline await pipeline.StartAsync(); ``` #### Availability Check availability using `NVVP9Decoder.IsAvailable()`. Requires an NVIDIA GPU that supports NVDEC for VP9 and appropriate drivers. #### Platforms Windows, Linux (with NVIDIA drivers). ### VAAPI H.264 Decoder Block (VAAPIH264DecoderBlock) Provides hardware-accelerated decoding of H.264 (AVC) video streams using VA-API (Video Acceleration API). Available on Linux systems with compatible hardware and drivers. 
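Because the VA-API path depends on the host GPU and drivers, it is worth probing availability at startup and falling back to the generic decoder. A sketch, assuming the two blocks share a common base type (written here as `MediaBlock`):

```csharp
// Sketch: use the VA-API H.264 decoder when present, otherwise the generic block.
MediaBlock h264Decoder;
if (VAAPIH264DecoderBlock.IsAvailable())
{
    h264Decoder = new VAAPIH264DecoderBlock();
}
else
{
    h264Decoder = new H264DecoderBlock(); // automatic decoder selection
}
```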
#### Block info | Pin direction | Media type | Pins count | |---------------|---------------------|------------| | Input video | H.264 encoded video | 1 | | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; BasicFileSourceBlock -- Raw Data --> UniversalDemuxBlock; UniversalDemuxBlock -- H.264 Video Stream --> VAAPIH264DecoderBlock; VAAPIH264DecoderBlock -- Decoded Video --> VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var vaapiH264Decoder = new VAAPIH264DecoderBlock(); var basicFileSource = new BasicFileSourceBlock("test_h264.mp4"); var mediaInfo = await MediaInfoReader.GetMediaInfoAsync("test_h264.mp4"); if (mediaInfo == null) { Console.WriteLine("Failed to get media info."); return; } var universalDemux = new UniversalDemuxBlock(mediaInfo, renderVideo: true, renderAudio: false); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(basicFileSource.Output, universalDemux.Input); pipeline.Connect(universalDemux.GetVideoOutput(), vaapiH264Decoder.Input); pipeline.Connect(vaapiH264Decoder.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` #### Availability Check with `VAAPIH264DecoderBlock.IsAvailable()`. Requires VA-API support and correct SDK redist. #### Platforms Linux (with VA-API drivers). --- ### VAAPI HEVC Decoder Block (VAAPIHEVCDecoderBlock) Provides hardware-accelerated decoding of H.265/HEVC video streams using VA-API. Available on Linux systems with compatible hardware and drivers. 
#### Block info | Pin direction | Media type | Pins count | |---------------|----------------------|------------| | Input video | H.265/HEVC encoded | 1 | | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; BasicFileSourceBlock -- Raw Data --> UniversalDemuxBlock; UniversalDemuxBlock -- H.265 Video Stream --> VAAPIHEVCDecoderBlock; VAAPIHEVCDecoderBlock -- Decoded Video --> VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var vaapiHevcDecoder = new VAAPIHEVCDecoderBlock(); var basicFileSource = new BasicFileSourceBlock("test_hevc.mp4"); var mediaInfo = await MediaInfoReader.GetMediaInfoAsync("test_hevc.mp4"); if (mediaInfo == null) { Console.WriteLine("Failed to get media info."); return; } var universalDemux = new UniversalDemuxBlock(mediaInfo, renderVideo: true, renderAudio: false); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(basicFileSource.Output, universalDemux.Input); pipeline.Connect(universalDemux.GetVideoOutput(), vaapiHevcDecoder.Input); pipeline.Connect(vaapiHevcDecoder.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` #### Availability Check with `VAAPIHEVCDecoderBlock.IsAvailable()`. Requires VA-API support and correct SDK redist. #### Platforms Linux (with VA-API drivers). --- ### VAAPI JPEG Decoder Block (VAAPIJPEGDecoderBlock) Provides hardware-accelerated decoding of JPEG/MJPEG video streams using VA-API. Available on Linux systems with compatible hardware and drivers. 
#### Block info | Pin direction | Media type | Pins count | |---------------|--------------------------|------------| | Input video | JPEG encoded video/images | 1 | | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; HTTPSourceBlock -- MJPEG Stream --> VAAPIJPEGDecoderBlock; VAAPIJPEGDecoderBlock -- Decoded Video --> VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var vaapiJpegDecoder = new VAAPIJPEGDecoderBlock(); var httpSettings = new HTTPSourceSettings(new Uri("http://your-mjpeg-camera/stream")); var httpSource = new HTTPSourceBlock(httpSettings); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(httpSource.Output, vaapiJpegDecoder.Input); pipeline.Connect(vaapiJpegDecoder.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` #### Availability Check with `VAAPIJPEGDecoderBlock.IsAvailable()`. Requires VA-API support and correct SDK redist. #### Platforms Linux (with VA-API drivers). --- ### VAAPI VC1 Decoder Block (VAAPIVC1DecoderBlock) Provides hardware-accelerated decoding of VC-1 video streams using VA-API. Available on Linux systems with compatible hardware and drivers. 
#### Block info | Pin direction | Media type | Pins count | |---------------|---------------------|------------| | Input video | VC-1 encoded video | 1 | | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; BasicFileSourceBlock -- Raw Data --> UniversalDemuxBlock; UniversalDemuxBlock -- VC-1 Video Stream --> VAAPIVC1DecoderBlock; VAAPIVC1DecoderBlock -- Decoded Video --> VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var vaapiVc1Decoder = new VAAPIVC1DecoderBlock(); var basicFileSource = new BasicFileSourceBlock("test_vc1.wmv"); var mediaInfo = await MediaInfoReader.GetMediaInfoAsync("test_vc1.wmv"); if (mediaInfo == null) { Console.WriteLine("Failed to get media info."); return; } var universalDemux = new UniversalDemuxBlock(mediaInfo, renderVideo: true, renderAudio: false); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(basicFileSource.Output, universalDemux.Input); pipeline.Connect(universalDemux.GetVideoOutput(), vaapiVc1Decoder.Input); pipeline.Connect(vaapiVc1Decoder.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` #### Availability Check with `VAAPIVC1DecoderBlock.IsAvailable()`. Requires VA-API support and correct SDK redist. #### Platforms Linux (with VA-API drivers). --- ## Direct3D 11/DXVA Video Decoder Blocks Direct3D 11/DXVA (D3D11) decoder blocks provide hardware-accelerated video decoding on Windows systems with compatible GPUs and drivers. These blocks are useful for high-performance video playback and processing pipelines on Windows. ### D3D11 AV1 Decoder Block Decodes AV1 video streams using Direct3D 11/DXVA hardware acceleration. #### Block info Name: `D3D11AV1DecoderBlock`. 
| Pin direction | Media type | Pins count | |---------------|---------------------|------------| | Input video | AV1 encoded video | 1 | | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; BasicFileSourceBlock -- Raw Data --> UniversalDemuxBlock; UniversalDemuxBlock -- AV1 Video Stream --> D3D11AV1DecoderBlock; D3D11AV1DecoderBlock -- Decoded Video --> VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Create D3D11 AV1 Decoder block var d3d11Av1Decoder = new D3D11AV1DecoderBlock(); var basicFileSource = new BasicFileSourceBlock("test_av1.mkv"); var mediaInfo = await MediaInfoReader.GetMediaInfoAsync("test_av1.mkv"); if (mediaInfo == null) { Console.WriteLine("Failed to get media info."); return; } var universalDemux = new UniversalDemuxBlock(mediaInfo, renderVideo: true, renderAudio: false); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(basicFileSource.Output, universalDemux.Input); pipeline.Connect(universalDemux.GetVideoOutput(), d3d11Av1Decoder.Input); pipeline.Connect(d3d11Av1Decoder.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` #### Availability Check availability using `D3D11AV1DecoderBlock.IsAvailable()`. Requires Windows with D3D11/DXVA support and correct SDK redist. #### Platforms Windows (D3D11/DXVA required). --- ### D3D11 H.264 Decoder Block Decodes H.264 (AVC) video streams using Direct3D 11/DXVA hardware acceleration. #### Block info Name: `D3D11H264DecoderBlock`. 
| Pin direction | Media type | Pins count | |---------------|---------------------|------------| | Input video | H.264 encoded video | 1 | | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; BasicFileSourceBlock -- Raw Data --> UniversalDemuxBlock; UniversalDemuxBlock -- H.264 Video Stream --> D3D11H264DecoderBlock; D3D11H264DecoderBlock -- Decoded Video --> VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Create D3D11 H.264 Decoder block var d3d11H264Decoder = new D3D11H264DecoderBlock(); var basicFileSource = new BasicFileSourceBlock("test_h264.mp4"); var mediaInfo = await MediaInfoReader.GetMediaInfoAsync("test_h264.mp4"); if (mediaInfo == null) { Console.WriteLine("Failed to get media info."); return; } var universalDemux = new UniversalDemuxBlock(mediaInfo, renderVideo: true, renderAudio: false); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(basicFileSource.Output, universalDemux.Input); pipeline.Connect(universalDemux.GetVideoOutput(), d3d11H264Decoder.Input); pipeline.Connect(d3d11H264Decoder.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` #### Availability Check availability using `D3D11H264DecoderBlock.IsAvailable()`. Requires Windows with D3D11/DXVA support and correct SDK redist. #### Platforms Windows (D3D11/DXVA required). --- ### D3D11 H.265 Decoder Block Decodes H.265 (HEVC) video streams using Direct3D 11/DXVA hardware acceleration. #### Block info Name: `D3D11H265DecoderBlock`. 
| Pin direction | Media type | Pins count | |---------------|----------------------|------------| | Input video | H.265/HEVC encoded | 1 | | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; BasicFileSourceBlock -- Raw Data --> UniversalDemuxBlock; UniversalDemuxBlock -- H.265 Video Stream --> D3D11H265DecoderBlock; D3D11H265DecoderBlock -- Decoded Video --> VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Create D3D11 H.265 Decoder block var d3d11H265Decoder = new D3D11H265DecoderBlock(); var basicFileSource = new BasicFileSourceBlock("test_h265.mp4"); var mediaInfo = await MediaInfoReader.GetMediaInfoAsync("test_h265.mp4"); if (mediaInfo == null) { Console.WriteLine("Failed to get media info."); return; } var universalDemux = new UniversalDemuxBlock(mediaInfo, renderVideo: true, renderAudio: false); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(basicFileSource.Output, universalDemux.Input); pipeline.Connect(universalDemux.GetVideoOutput(), d3d11H265Decoder.Input); pipeline.Connect(d3d11H265Decoder.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` #### Availability Check availability using `D3D11H265DecoderBlock.IsAvailable()`. Requires Windows with D3D11/DXVA support and correct SDK redist. #### Platforms Windows (D3D11/DXVA required). --- ### D3D11 MPEG-2 Decoder Block Decodes MPEG-2 video streams using Direct3D 11/DXVA hardware acceleration. #### Block info Name: `D3D11MPEG2DecoderBlock`. 
| Pin direction | Media type | Pins count | |---------------|---------------------|------------| | Input video | MPEG-2 encoded video| 1 | | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; BasicFileSourceBlock -- Raw Data --> UniversalDemuxBlock; UniversalDemuxBlock -- MPEG-2 Video Stream --> D3D11MPEG2DecoderBlock; D3D11MPEG2DecoderBlock -- Decoded Video --> VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Create D3D11 MPEG-2 Decoder block var d3d11Mpeg2Decoder = new D3D11MPEG2DecoderBlock(); var basicFileSource = new BasicFileSourceBlock("test_mpeg2.mpg"); var mediaInfo = await MediaInfoReader.GetMediaInfoAsync("test_mpeg2.mpg"); if (mediaInfo == null) { Console.WriteLine("Failed to get media info."); return; } var universalDemux = new UniversalDemuxBlock(mediaInfo, renderVideo: true, renderAudio: false); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(basicFileSource.Output, universalDemux.Input); pipeline.Connect(universalDemux.GetVideoOutput(), d3d11Mpeg2Decoder.Input); pipeline.Connect(d3d11Mpeg2Decoder.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` #### Availability Check availability using `D3D11MPEG2DecoderBlock.IsAvailable()`. Requires Windows with D3D11/DXVA support and correct SDK redist. #### Platforms Windows (D3D11/DXVA required). --- ### D3D11 VP8 Decoder Block Decodes VP8 video streams using Direct3D 11/DXVA hardware acceleration. #### Block info Name: `D3D11VP8DecoderBlock`. 
| Pin direction | Media type | Pins count | |---------------|---------------------|------------| | Input video | VP8 encoded video | 1 | | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; BasicFileSourceBlock -- Raw Data --> UniversalDemuxBlock; UniversalDemuxBlock -- VP8 Video Stream --> D3D11VP8DecoderBlock; D3D11VP8DecoderBlock -- Decoded Video --> VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Create D3D11 VP8 Decoder block var d3d11Vp8Decoder = new D3D11VP8DecoderBlock(); var basicFileSource = new BasicFileSourceBlock("test_vp8.webm"); var mediaInfo = await MediaInfoReader.GetMediaInfoAsync("test_vp8.webm"); if (mediaInfo == null) { Console.WriteLine("Failed to get media info."); return; } var universalDemux = new UniversalDemuxBlock(mediaInfo, renderVideo: true, renderAudio: false); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(basicFileSource.Output, universalDemux.Input); pipeline.Connect(universalDemux.GetVideoOutput(), d3d11Vp8Decoder.Input); pipeline.Connect(d3d11Vp8Decoder.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` #### Availability Check availability using `D3D11VP8DecoderBlock.IsAvailable()`. Requires Windows with D3D11/DXVA support and correct SDK redist. #### Platforms Windows (D3D11/DXVA required). --- ### D3D11 VP9 Decoder Block Decodes VP9 video streams using Direct3D 11/DXVA hardware acceleration. #### Block info Name: `D3D11VP9DecoderBlock`. 
| Pin direction | Media type | Pins count | |---------------|---------------------|------------| | Input video | VP9 encoded video | 1 | | Output video | Uncompressed video | 1 | #### The sample pipeline ```mermaid graph LR; BasicFileSourceBlock -- Raw Data --> UniversalDemuxBlock; UniversalDemuxBlock -- VP9 Video Stream --> D3D11VP9DecoderBlock; D3D11VP9DecoderBlock -- Decoded Video --> VideoRendererBlock; ``` #### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); // Create D3D11 VP9 Decoder block var d3d11Vp9Decoder = new D3D11VP9DecoderBlock(); var basicFileSource = new BasicFileSourceBlock("test_vp9.webm"); var mediaInfo = await MediaInfoReader.GetMediaInfoAsync("test_vp9.webm"); if (mediaInfo == null) { Console.WriteLine("Failed to get media info."); return; } var universalDemux = new UniversalDemuxBlock(mediaInfo, renderVideo: true, renderAudio: false); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(basicFileSource.Output, universalDemux.Input); pipeline.Connect(universalDemux.GetVideoOutput(), d3d11Vp9Decoder.Input); pipeline.Connect(d3d11Vp9Decoder.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` #### Availability Check availability using `D3D11VP9DecoderBlock.IsAvailable()`. Requires Windows with D3D11/DXVA support and correct SDK redist. #### Platforms Windows (D3D11/DXVA required). ---END OF PAGE--- # Local File: .\dotnet\mediablocks\VideoEncoders\index.md --- title: Mastering Video Encoders in .NET SDK description: Unlock high-performance video encoding in .NET projects. This guide covers various video encoders, codecs like AV1, H264, HEVC, and GPU acceleration techniques. sidebar_label: Video Encoders order: 18 --- # Video encoding [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) Video encoding is the process of converting raw video data into a compressed format. 
This process is essential for reducing the size of video files, making them easier to store and stream over the internet.

VisioForge Media Blocks SDK provides a wide range of video encoders that support various formats and codecs.

For some video encoders, the SDK can use GPU acceleration to speed up the encoding process. This feature is especially useful when working with high-resolution video files or when encoding multiple videos simultaneously. NVIDIA, Intel, and AMD GPUs are supported for hardware acceleration.

## AV1 encoder

`AV1 (AOMedia Video 1)`: Developed by the Alliance for Open Media, AV1 is an open, royalty-free video coding format designed for video transmissions over the Internet. It is known for its high compression efficiency and better quality at lower bit rates compared to its predecessors, making it well-suited for high-resolution video streaming applications.

Use classes that implement the `IAV1EncoderSettings` interface to set the parameters.

#### CPU Encoders

##### AOMAV1EncoderSettings

AOM AV1 encoder settings. CPU encoder.

**Platforms:** Windows, Linux, macOS.

##### RAV1EEncoderSettings

RAV1E AV1 encoder settings. CPU encoder.

- **Key Properties**:
  - `Bitrate` (integer): Target bitrate in kilobits per second.
  - `LowLatency` (boolean): Enables or disables low latency mode. Default is `false`.
  - `MaxKeyFrameInterval` (ulong): Maximum interval between keyframes. Default is `240`.
  - `MinKeyFrameInterval` (ulong): Minimum interval between keyframes. Default is `12`.
  - `MinQuantizer` (uint): Minimum quantizer value (range 0-255). Default is `0`.
  - `Quantizer` (uint): Quantizer value (range 0-255). Default is `100`.
  - `SpeedPreset` (int): Encoding speed preset (10 fastest, 0 slowest). Default is `6`.
  - `Tune` (`RAV1EEncoderTune`): Tune setting for the encoder. Default is `RAV1EEncoderTune.Psychovisual`.

**Platforms:** Windows, Linux, macOS.

###### RAV1EEncoderTune Enum

Specifies the tuning option for the RAV1E encoder.
- `PSNR` (0): Tune for best PSNR (Peak Signal-to-Noise Ratio). - `Psychovisual` (1): Tune for psychovisual quality. #### GPU Encoders ##### AMFAV1EncoderSettings AMD GPU AV1 video encoder. **Platforms:** Windows, Linux, macOS. ##### NVENCAV1EncoderSettings Nvidia GPU AV1 video encoder. **Platforms:** Windows, Linux, macOS. ##### QSVAV1EncoderSettings Intel GPU AV1 video encoder. **Platforms:** Windows, Linux, macOS. *Note: Intel QSV encoders may also utilize common enumerations like `QSVCodingOption` (`On`, `Off`, `Unknown`) for configuring specific hardware features.* ### Block info Name: AV1EncoderBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed video | 1 Output | AV1 | 1 ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->AV1EncoderBlock; AV1EncoderBlock-->MP4SinkBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var videoEncoderBlock = new AV1EncoderBlock(new QSVAV1EncoderSettings()); pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input); var mp4SinkBlock = new MP4SinkBlock(new MP4SinkSettings(@"output.mp4")); pipeline.Connect(videoEncoderBlock.Output, mp4SinkBlock.CreateNewInput(MediaBlockPadMediaType.Video)); await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux, iOS, Android. ## DV encoder `DV (Digital Video)`: A format for storing digital video introduced in the 1990s, primarily used in consumer digital camcorders. DV employs intra-frame compression to deliver high-quality video on digital tapes, making it suitable for home videos as well as semi-professional productions. ### Block info Name: DVEncoderBlock. 
Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | video/x-dv | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->DVEncoderBlock;
DVEncoderBlock-->AVISinkBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var videoEncoderBlock = new DVEncoderBlock(new DVVideoEncoderSettings());
pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input);

var sinkBlock = new AVISinkBlock(new AVISinkSettings(@"output.avi"));
pipeline.Connect(videoEncoderBlock.Output, sinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();
```

### Platforms

Windows, macOS, Linux, iOS, Android.

## H264 encoder

The H264 encoder block is used for encoding files in MP4, MKV, and some other formats, as well as for network streaming using RTSP and HLS.

Use classes that implement the `IH264EncoderSettings` interface to set the parameters.

### Settings

#### NVENCH264EncoderSettings

Nvidia GPU H264 video encoder.

**Platforms:** Windows, Linux, macOS.

#### AMFH264EncoderSettings

AMD/ATI GPU H264 video encoder.

**Platforms:** Windows, Linux, macOS.

#### QSVH264EncoderSettings

Intel GPU H264 video encoder.

**Platforms:** Windows, Linux, macOS.

#### OpenH264EncoderSettings

Software CPU H264 encoder.

**Platforms:** Windows, macOS, Linux, iOS, Android.

#### CustomH264EncoderSettings

Allows using a custom GStreamer element for H264 encoding. You can specify the GStreamer element name and configure its properties.

**Platforms:** Windows, Linux, macOS.

### Block info

Name: H264EncoderBlock.
Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed video | 1 Output | H264 | 1 ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->H264EncoderBlock; H264EncoderBlock-->MP4SinkBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var h264EncoderBlock = new H264EncoderBlock(new NVENCH264EncoderSettings()); pipeline.Connect(fileSource.VideoOutput, h264EncoderBlock.Input); var mp4SinkBlock = new MP4SinkBlock(new MP4SinkSettings(@"output.mp4")); pipeline.Connect(h264EncoderBlock.Output, mp4SinkBlock.CreateNewInput(MediaBlockPadMediaType.Video)); await pipeline.StartAsync(); ``` #### Sample applications - [Simple Capture Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Simple%20Capture%20Demo) - [Screen Capture Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Blocks%20SDK/WPF/CSharp/Screen%20Capture) ### Platforms Windows, macOS, Linux, iOS, Android. ## HEVC/H265 encoder HEVC encoder is used for encoding files in MP4, MKV, and some other formats, as well as for network streaming using RTSP and HLS. Use classes that implement the IHEVCEncoderSettings interface to set the parameters. ### Settings #### MFHEVCEncoderSettings Microsoft Media Foundation HEVC encoder. CPU encoder. **Platforms:** Windows. #### NVENCHEVCEncoderSettings Nvidia GPUs HEVC video encoder. **Platforms:** Windows, Linux, macOS. #### AMFHEVCEncoderSettings AMD/ATI GPUs HEVC video encoder. **Platforms:** Windows, Linux, macOS. #### QSVHEVCEncoderSettings Intel GPU HEVC video encoder. **Platforms:** Windows, Linux, macOS. #### CustomHEVCEncoderSettings Allows using a custom GStreamer element for HEVC encoding. You can specify the GStreamer element name and configure its properties. **Platforms:** Windows, Linux, macOS. 
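Since all of the settings classes above implement `IHEVCEncoderSettings`, the encoder variant can be chosen at runtime. A minimal sketch, assuming the caller supplies the GPU-detection flags (the preference order is illustrative, not prescribed by the SDK):

```csharp
// Minimal sketch: pick an IHEVCEncoderSettings implementation per hardware.
// The hasNvidiaGpu/hasIntelGpu flags are assumptions supplied by the caller.
IHEVCEncoderSettings SelectHevcSettings(bool hasNvidiaGpu, bool hasIntelGpu)
{
    if (hasNvidiaGpu)
    {
        return new NVENCHEVCEncoderSettings(); // NVIDIA NVENC hardware encoder
    }

    if (hasIntelGpu)
    {
        return new QSVHEVCEncoderSettings(); // Intel Quick Sync hardware encoder
    }

    // CPU fallback via Media Foundation (Windows only).
    return new MFHEVCEncoderSettings();
}
```

The returned settings object is then passed to the `HEVCEncoderBlock` constructor, exactly as in the sample code below.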
### Block info

Name: HEVCEncoderBlock.

Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | HEVC | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->HEVCEncoderBlock;
HEVCEncoderBlock-->MP4SinkBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var hevcEncoderBlock = new HEVCEncoderBlock(new NVENCHEVCEncoderSettings());
pipeline.Connect(fileSource.VideoOutput, hevcEncoderBlock.Input);

var mp4SinkBlock = new MP4SinkBlock(new MP4SinkSettings(@"output.mp4"));
pipeline.Connect(hevcEncoderBlock.Output, mp4SinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();
```

### Platforms

Windows, macOS, Linux, iOS, Android.

## MJPEG encoder

`MJPEG (Motion JPEG)`: A video compression format where each frame of video is separately compressed into a JPEG image. This technique is straightforward and results in no interframe compression, making it ideal for situations where frame-specific editing or access is required, such as in surveillance and medical imaging.

Use the MJPEG encoder settings classes described below to set the parameters.

### Settings

#### MJPEGEncoderSettings

Default MJPEG encoder. CPU encoder.

- **Key Properties**:
  - `Quality` (int): JPEG quality level (10-100). Default is `85`.
- **Encoder Type**: `MJPEGEncoderType.CPU`.

**Platforms:** Windows, Linux, macOS, iOS, Android.

#### QSVMJPEGEncoderSettings

Intel GPU MJPEG encoder.

- **Key Properties**:
  - `Quality` (uint): JPEG quality level (10-100). Default is `85`.
- **Encoder Type**: `MJPEGEncoderType.GPU_Intel_QSV_MJPEG`.

**Platforms:** Windows, Linux, macOS.

#### MJPEGEncoderType Enum

Specifies the type of MJPEG encoder.

- `CPU`: Default CPU-based encoder.
- `GPU_Intel_QSV_MJPEG`: Intel QuickSync GPU-based MJPEG encoder.
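Both settings classes expose `Quality` as the main tuning knob. A minimal sketch (the chosen value of 90 is illustrative):

```csharp
// Minimal sketch: configure JPEG quality before constructing the encoder block.
// Quality accepts 10-100 (default 85); higher values mean better image
// quality and larger encoded frames.
var cpuSettings = new MJPEGEncoderSettings
{
    Quality = 90
};

// On Intel hardware, the QSV variant offloads MJPEG encoding to the GPU.
var qsvSettings = new QSVMJPEGEncoderSettings
{
    Quality = 90
};
```

Either settings object can then be passed to the `MJPEGEncoderBlock` constructor shown in the sample code below.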
### Block info Name: MJPEGEncoderBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed video | 1 Output | MJPEG | 1 ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->MJPEGEncoderBlock; MJPEGEncoderBlock-->AVISinkBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var videoEncoderBlock = new MJPEGEncoderBlock(new MJPEGEncoderSettings()); pipeline.Connect(fileSource.VideoOutput, videoEncoderBlock.Input); var aviSinkBlock = new AVISinkBlock(new AVISinkSettings(@"output.avi")); pipeline.Connect(videoEncoderBlock.Output, aviSinkBlock.CreateNewInput(MediaBlockPadMediaType.Video)); await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux, iOS, Android. ## Theora encoder The [Theora](https://www.theora.org/) encoder is used to encode video files in WebM format. ### Settings #### TheoraEncoderSettings Provides settings for the Theora encoder. - **Key Properties**: - `Bitrate` (kbps) - `CapOverflow`, `CapUnderflow` (bit reservoir capping) - `DropFrames` (allow/disallow frame dropping) - `KeyFrameAuto` (automatic keyframe detection) - `KeyFrameForce` (interval to force keyframe every N frames) - `KeyFrameFrequency` (keyframe frequency) - `MultipassCacheFile` (string path for multipass cache) - `MultipassMode` (using `TheoraMultipassMode` enum: `SinglePass`, `FirstPass`, `SecondPass`) - `Quality` (integer value, typically 0-63 for libtheora, meaning can vary) - `RateBuffer` (size of rate control buffer in units of frames, 0 = auto) - `SpeedLevel` (amount of motion vector searching, 0-2 or higher depending on implementation) - `VP3Compatible` (boolean to enable VP3 compatibility) - **Availability**: Can be checked using `TheoraEncoderSettings.IsAvailable()`. ### Block info Name: TheoraEncoderBlock. 
Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | video/x-theora | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->TheoraEncoderBlock;
TheoraEncoderBlock-->WebMSinkBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var theoraEncoderBlock = new TheoraEncoderBlock(new TheoraEncoderSettings());
pipeline.Connect(fileSource.VideoOutput, theoraEncoderBlock.Input);

var webmSinkBlock = new WebMSinkBlock(new WebMSinkSettings(@"output.webm"));
pipeline.Connect(theoraEncoderBlock.Output, webmSinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();
```

### Platforms

Windows, macOS, Linux, iOS, Android.

## VPX encoder

The VPX encoder block is used to encode video into WebM, MKV, or OGG files. VPX is a family of video codecs covering the VP8 and VP9 formats.

The VPX encoder block utilizes settings classes that implement the `IVPXEncoderSettings` interface. Key settings classes include:

### Settings

The common base class for VP8 and VP9 CPU encoder settings is `VPXEncoderSettings`.
It provides a wide range of shared properties for tuning the encoding process, such as: - `ARNRMaxFrames`, `ARNRStrength`, `ARNRType` (AltRef noise reduction) - `BufferInitialSize`, `BufferOptimalSize`, `BufferSize` (client buffer settings) - `CPUUsed`, `CQLevel` (constrained quality) - `Deadline` (encoding deadline per frame) - `DropframeThreshold` - `RateControl` (using `VPXRateControl` enum) - `ErrorResilient` (using `VPXErrorResilientFlags` enum) - `HorizontalScalingMode`, `VerticalScalingMode` (using `VPXScalingMode` enum) - `KeyFrameMaxDistance`, `KeyFrameMode` (using `VPXKeyFrameMode` enum) - `MinQuantizer`, `MaxQuantizer` - `MultipassCacheFile`, `MultipassMode` (using `VPXMultipassMode` enum) - `NoiseSensitivity` - `TargetBitrate` (in Kbits/s) - `NumOfThreads` - `TokenPartitions` (using `VPXTokenPartitions` enum) - `Tuning` (using `VPXTuning` enum) #### VP8EncoderSettings CPU encoder for VP8. Inherits from `VPXEncoderSettings`. - **Key Properties**: Leverages properties from `VPXEncoderSettings` tailored for VP8. - **Encoder Type**: `VPXEncoderType.VP8`. - **Availability**: Can be checked using `VP8EncoderSettings.IsAvailable()`. #### VP9EncoderSettings CPU encoder for VP9. Inherits from `VPXEncoderSettings`. - **Key Properties**: In addition to `VPXEncoderSettings` properties, includes VP9-specific settings: - `AQMode` (Adaptive Quantization mode, using `VPXAdaptiveQuantizationMode` enum) - `FrameParallelDecoding` (allow parallel processing) - `RowMultithread` (multi-threaded row encoding) - `TileColumns`, `TileRows` (log2 values) - **Encoder Type**: `VPXEncoderType.VP9`. - **Availability**: Can be checked using `VP9EncoderSettings.IsAvailable()`. #### QSVVP9EncoderSettings Intel QSV (GPU accelerated) encoder for VP9. 
- **Key Properties**: - `LowLatency` - `TargetUsage` (1: Best quality, 4: Balanced, 7: Best speed) - `Bitrate` (Kbit/sec) - `GOPSize` - `ICQQuality` (Intelligent Constant Quality) - `MaxBitrate` (Kbit/sec) - `QPI`, `QPP` (constant quantizer for I and P frames) - `Profile` (0-3) - `RateControl` (using `QSVVP9EncRateControl` enum) - `RefFrames` - **Encoder Type**: `VPXEncoderType.QSV_VP9`. - **Availability**: Can be checked using `QSVVP9EncoderSettings.IsAvailable()`. #### CustomVPXEncoderSettings Allows using a custom GStreamer element for VPX encoding. - **Key Properties**: - `ElementName` (string to specify the GStreamer element name) - `Properties` (Dictionary to configure the element) - `VideoFormat` (required video format like `VideoFormatX.NV12`) - **Encoder Type**: `VPXEncoderType.CustomEncoder`. ### VPX Enumerations Several enumerations are available to configure VPX encoders: - `VPXAdaptiveQuantizationMode`: Defines adaptive quantization modes (e.g., `Off`, `Variance`, `Complexity`, `CyclicRefresh`, `Equator360`, `Perceptual`, `PSNR`, `Lookahead`). - `VPXErrorResilientFlags`: Flags for error resilience features (e.g., `None`, `Default`, `Partitions`). - `VPXKeyFrameMode`: Defines keyframe placement strategies (e.g., `Auto`, `Disabled`). - `VPXMultipassMode`: Modes for multipass encoding (e.g., `OnePass`, `FirstPass`, `LastPass`). - `VPXRateControl`: Rate control modes (e.g., `VBR`, `CBR`, `CQ`). - `VPXScalingMode`: Scaling modes (e.g., `Normal`, `_4_5`, `_3_5`, `_1_2`). - `VPXTokenPartitions`: Number of token partitions (e.g., `One`, `Two`, `Four`, `Eight`). - `VPXTuning`: Tuning options for the encoder (e.g., `PSNR`, `SSIM`). - `VPXEncoderType`: Specifies the VPX encoder variant (e.g., `VP8`, `VP9`, `QSV_VP9`, `CustomEncoder`, and platform-specific ones like `OMXExynosVP8Encoder`). - `QSVVP9EncRateControl`: Rate control modes specific to `QSVVP9EncoderSettings` (e.g., `CBR`, `VBR`, `CQP`, `ICQ`). ### Block info Name: VPXEncoderBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed video | 1 Output | VP8/VP9 | 1 ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->VPXEncoderBlock; VPXEncoderBlock-->WebMSinkBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var vp8EncoderBlock = new VPXEncoderBlock(new VP8EncoderSettings()); pipeline.Connect(fileSource.VideoOutput, vp8EncoderBlock.Input); var webmSinkBlock = new WebMSinkBlock(new WebMSinkSettings(@"output.webm")); pipeline.Connect(vp8EncoderBlock.Output, webmSinkBlock.CreateNewInput(MediaBlockPadMediaType.Video)); await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux, iOS, Android. ## MPEG2 encoder `MPEG-2`: A widely used standard for video and audio compression, commonly found in DVDs, digital television broadcasts (like DVB and ATSC), and SVCDs. It offers good quality at relatively low bitrates for standard definition content. ### Block info Name: MPEG2EncoderBlock. 
Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | video/mpeg | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->MPEG2EncoderBlock;
MPEG2EncoderBlock-->MPGSinkBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var mpeg2EncoderBlock = new MPEG2EncoderBlock(new MPEG2VideoEncoderSettings());
pipeline.Connect(fileSource.VideoOutput, mpeg2EncoderBlock.Input);

// Example: Using an MPGSinkBlock for .mpg or .ts files
var mpgSinkBlock = new MPGSinkBlock(new MPGSinkSettings(@"output.mpg"));
pipeline.Connect(mpeg2EncoderBlock.Output, mpgSinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();
```

### Platforms

Windows, macOS, Linux.

## MPEG4 encoder

`MPEG-4 Part 2 Visual` (often referred to simply as MPEG-4 video) is a video compression standard that is part of the MPEG-4 suite. It is used in various applications, including streaming video, video conferencing, and codec implementations such as DivX and Xvid.

### Block info

Name: MPEG4EncoderBlock.
Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed video | 1 Output | video/mpeg, mpegversion=4 | 1 ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->MPEG4EncoderBlock; MPEG4EncoderBlock-->MP4SinkBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp4"; // Input file var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var mpeg4EncoderBlock = new MPEG4EncoderBlock(new MPEG4VideoEncoderSettings()); pipeline.Connect(fileSource.VideoOutput, mpeg4EncoderBlock.Input); // Example: Using an MP4SinkBlock for .mp4 files var mp4SinkBlock = new MP4SinkBlock(new MP4SinkSettings(@"output_mpeg4.mp4")); pipeline.Connect(mpeg4EncoderBlock.Output, mp4SinkBlock.CreateNewInput(MediaBlockPadMediaType.Video)); await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux. ## Apple ProRes encoder `Apple ProRes`: A high-quality, lossy video compression format developed by Apple Inc., widely used in professional video production and post-production workflows for its excellent balance of image quality and performance. ### Block info Name: AppleProResEncoderBlock. 
Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | ProRes | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->AppleProResEncoderBlock;
AppleProResEncoderBlock-->MOVSinkBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var proResEncoderBlock = new AppleProResEncoderBlock(new AppleProResEncoderSettings());
pipeline.Connect(fileSource.VideoOutput, proResEncoderBlock.Input);

var movSinkBlock = new MOVSinkBlock(new MOVSinkSettings(@"output.mov"));
pipeline.Connect(proResEncoderBlock.Output, movSinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();
```

### Platforms

macOS, iOS.

### Availability

You can check if the Apple ProRes encoder is available in your environment using:

```csharp
bool available = AppleProResEncoderBlock.IsAvailable();
```

## WMV encoder

### Overview

The WMV encoder block encodes video in WMV format.

### Block info

Name: WMVEncoderBlock.

Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | video/x-wmv | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->WMVEncoderBlock;
WMVEncoderBlock-->ASFSinkBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var wmvEncoderBlock = new WMVEncoderBlock(new WMVEncoderSettings());
pipeline.Connect(fileSource.VideoOutput, wmvEncoderBlock.Input);

var asfSinkBlock = new ASFSinkBlock(new ASFSinkSettings(@"output.wmv"));
pipeline.Connect(wmvEncoderBlock.Output, asfSinkBlock.CreateNewInput(MediaBlockPadMediaType.Video));

await pipeline.StartAsync();
```

### Platforms

Windows, macOS, Linux.
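Several of the settings classes and blocks above expose a static `IsAvailable()` check (documented for `TheoraEncoderSettings`, `VP8EncoderSettings`, `VP9EncoderSettings`, `QSVVP9EncoderSettings`, and `AppleProResEncoderBlock`). A defensive sketch for VPX, where the fallback order and the error-handling policy are assumptions:

```csharp
// Minimal sketch: probe encoder availability before constructing the block.
// Prefers VP9 and falls back to VP8; the order is a choice, not an SDK rule.
IVPXEncoderSettings vpxSettings;
if (VP9EncoderSettings.IsAvailable())
{
    vpxSettings = new VP9EncoderSettings();
}
else if (VP8EncoderSettings.IsAvailable())
{
    vpxSettings = new VP8EncoderSettings();
}
else
{
    throw new NotSupportedException("No VPX encoder is available on this system.");
}

var encoderBlock = new VPXEncoderBlock(vpxSettings);
```

Probing availability up front produces a clear error message instead of a pipeline failure at start time.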
## General Video Settings Considerations

While specific encoder settings classes provide detailed control, a few concepts recur across most encoders in this SDK: target bitrate, rate-control mode (CBR, VBR, or constant quality), keyframe interval, and speed/quality presets. Consult each settings class for the exact property names and value ranges.

---END OF PAGE---

# Local File: .\dotnet\mediablocks\VideoProcessing\index.md

---
title: Video Processing & Effects Blocks for .Net
description: Discover a wide array of video processing and visual effects blocks available in the Media Blocks SDK for .Net. Learn how to implement color adjustments, deinterlacing, image/text overlays, geometric transformations, and many other real-time video enhancements in your .Net applications.
sidebar_label: Video Processing and Effects
---

# Video processing blocks

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

## Table of Contents

- [Color effects](#color-effects)
- [Deinterlace](#deinterlace)
- [Fish eye](#fish-eye)
- [Flip/Rotate](#fliprotate)
- [Gamma](#gamma)
- [Gaussian blur](#gaussian-blur)
- [Image overlay](#image-overlay)
- [Mirror](#mirror)
- [Perspective](#perspective)
- [Pinch](#pinch)
- [Resize](#resize)
- [Rotate](#rotate)
- [Video sample grabber](#video-sample-grabber)
- [Sphere](#sphere)
- [Square](#square)
- [Stretch](#stretch)
- [Text overlay](#text-overlay)
- [Tunnel](#tunnel)
- [Twirl](#twirl)
- [Video balance](#video-balance)
- [Video mixer](#video-mixer)
- [Water ripple](#water-ripple)
- [D3D11 Video Converter](#d3d11-video-converter)
- [Video Effects (Windows)](#video-effects-windows)
- [D3D11 Video Compositor](#d3d11-video-compositor)
- [VR360 Processor](#vr360-processor)

## Color effects

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The block performs basic video frame color processing: fake heat camera toning, sepia toning, invert and slightly shade to blue, cross processing toning, and yellow foreground/blue
background color filter. ### Block info Name: ColorEffectsBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed video | 1 Output | Uncompressed video | 1 ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->ColorEffectsBlock; ColorEffectsBlock-->VideoRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); // Sepia var colorEffects = new ColorEffectsBlock(ColorEffectsPreset.Sepia); pipeline.Connect(fileSource.VideoOutput, colorEffects.Input); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(colorEffects.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux, iOS, Android. ## Deinterlace [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) The block deinterlaces interlaced video frames into progressive video frames. Several methods of processing are available. Use the DeinterlaceSettings class to configure the block. ### Block info Name: DeinterlaceBlock. 
Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed video | 1 Output | Uncompressed video | 1 ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->DeinterlaceBlock; DeinterlaceBlock-->VideoRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var deinterlace = new DeinterlaceBlock(new DeinterlaceSettings()); pipeline.Connect(fileSource.VideoOutput, deinterlace.Input); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(deinterlace.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux, iOS, Android. ## Fish eye [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) The fisheye block simulates a fisheye lens by zooming on the center of the image and compressing the edges. ### Block info Name: FishEyeBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed video | 1 Output | Uncompressed video | 1 ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->FishEyeBlock; FishEyeBlock-->VideoRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var fishEye = new FishEyeBlock(); pipeline.Connect(fileSource.VideoOutput, fishEye.Input); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(fishEye.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux, iOS, Android. ## Flip/Rotate [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) The block flips and rotates the video stream. 
Use the VideoFlipRotateMethod enumeration to configure. ### Block info Name: FlipRotateBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed video | 1 Output | Uncompressed video | 1 ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->FlipRotateBlock; FlipRotateBlock-->VideoRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); // 90 degree rotation var flipRotate = new FlipRotateBlock(VideoFlipRotateMethod.Method90R); pipeline.Connect(fileSource.VideoOutput, flipRotate.Input); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(flipRotate.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux, iOS, Android. ## Gamma [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) The block performs gamma correction on a video stream. ### Block info Name: GammaBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed video | 1 Output | Uncompressed video | 1 ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->GammaBlock; GammaBlock-->VideoRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var gamma = new GammaBlock(2.0); pipeline.Connect(fileSource.VideoOutput, gamma.Input); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(gamma.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux, iOS, Android. 
## Gaussian blur [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) The block blurs the video stream using the Gaussian function. ### Block info Name: GaussianBlurBlock. Pin direction | Media type | Pins count --- | :---: | :---: Input | Uncompressed video | 1 Output | Uncompressed video | 1 ### The sample pipeline ```mermaid graph LR; UniversalSourceBlock-->GaussianBlurBlock; GaussianBlurBlock-->VideoRendererBlock; ``` ### Sample code ```csharp var pipeline = new MediaBlocksPipeline(); var filename = "test.mp4"; var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename))); var gaussianBlur = new GaussianBlurBlock(); pipeline.Connect(fileSource.VideoOutput, gaussianBlur.Input); var videoRenderer = new VideoRendererBlock(pipeline, VideoView1); pipeline.Connect(gaussianBlur.Output, videoRenderer.Input); await pipeline.StartAsync(); ``` ### Platforms Windows, macOS, Linux, iOS, Android. ## Image overlay [!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net) The block overlays an image loaded from a file onto a video stream. You can set an image position and optional alpha value. 32-bit images with alpha-channel are supported. ### Block info Name: ImageOverlayBlock. 
Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | Uncompressed video | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->ImageOverlayBlock;
ImageOverlayBlock-->VideoRendererBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var imageOverlay = new ImageOverlayBlock(@"logo.png");
pipeline.Connect(fileSource.VideoOutput, imageOverlay.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(imageOverlay.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

### Platforms

Windows, macOS, Linux, iOS, Android.

## Mirror

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The mirror block splits the image into two halves and reflects one over the other.

### Block info

Name: MirrorBlock.

Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | Uncompressed video | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->MirrorBlock;
MirrorBlock-->VideoRendererBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var mirrorBlock = new MirrorBlock(MirrorMode.Top);
pipeline.Connect(fileSource.VideoOutput, mirrorBlock.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(mirrorBlock.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

### Platforms

Windows, macOS, Linux, iOS, Android.

## Perspective

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The perspective block applies a 2D perspective transform.
### Block info

Name: PerspectiveBlock.

Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | Uncompressed video | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->PerspectiveBlock;
PerspectiveBlock-->VideoRendererBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

// The constructor takes the nine values of a 3x3 perspective matrix, row by row;
// the identity matrix below leaves the image unchanged
var persBlock = new PerspectiveBlock(new int[] { 1, 0, 0, 0, 1, 0, 0, 0, 1 });
pipeline.Connect(fileSource.VideoOutput, persBlock.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(persBlock.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

### Platforms

Windows, macOS, Linux, iOS, Android.

## Pinch

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The block performs the pinch geometric transform of the image.

### Block info

Name: PinchBlock.

Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | Uncompressed video | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->PinchBlock;
PinchBlock-->VideoRendererBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var pinchBlock = new PinchBlock();
pipeline.Connect(fileSource.VideoOutput, pinchBlock.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(pinchBlock.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

### Platforms

Windows, macOS, Linux, iOS, Android.

## Rotate

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The block rotates the image by a specified angle.
### Block info

Name: RotateBlock.

Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | Uncompressed video | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->RotateBlock;
RotateBlock-->VideoRendererBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var rotateBlock = new RotateBlock(0.7);
pipeline.Connect(fileSource.VideoOutput, rotateBlock.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(rotateBlock.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

### Platforms

Windows, macOS, Linux, iOS, Android.

## Resize

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The block resizes the video stream. You can configure the resize method, the letterbox flag, and many other options. Use the `ResizeVideoEffect` class to configure.

### Block info

Name: VideoResizeBlock.

Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | Uncompressed video | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->VideoResizeBlock;
VideoResizeBlock-->VideoRendererBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var videoResize = new VideoResizeBlock(new ResizeVideoEffect(1280, 720) { Letterbox = false });
pipeline.Connect(fileSource.VideoOutput, videoResize.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(videoResize.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

### Platforms

Windows, macOS, Linux, iOS, Android.
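When letterboxing is enabled, a resizer preserves the source aspect ratio and pads the remainder with bars. The arithmetic behind that placement can be sketched independently of the SDK (plain C#; the helper name is illustrative, not part of the API):

```csharp
using System;

static class LetterboxMath
{
    // Returns the scaled frame size and the symmetric horizontal/vertical padding
    // needed to fit a source frame inside a destination frame without distortion.
    public static (int Width, int Height, int PadX, int PadY) Fit(int srcW, int srcH, int dstW, int dstH)
    {
        // Uniform scale factor that fits the source inside the destination
        double scale = Math.Min((double)dstW / srcW, (double)dstH / srcH);

        int w = (int)Math.Round(srcW * scale);
        int h = (int)Math.Round(srcH * scale);

        return (w, h, (dstW - w) / 2, (dstH - h) / 2);
    }
}

// Example: a 640x480 source fitted into 1280x720 scales to 960x720,
// leaving 160-pixel bars on the left and right.
```

With `Letterbox = false`, as in the sample above, the frame is simply stretched to the target size and the aspect ratio may change.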
## Video sample grabber

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The video sample grabber raises an event for each video frame. You can save or process the received video frame.

### Block info

Name: VideoSampleGrabberBlock.

Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | Uncompressed video | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->VideoSampleGrabberBlock;
VideoSampleGrabberBlock-->VideoRendererBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var videoSG = new VideoSampleGrabberBlock();
videoSG.OnVideoFrameBuffer += VideoSG_OnVideoFrameBuffer;
pipeline.Connect(fileSource.VideoOutput, videoSG.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(videoSG.Output, videoRenderer.Input);

await pipeline.StartAsync();

// The event handler is a class member:
private void VideoSG_OnVideoFrameBuffer(object sender, VideoFrameBufferEventArgs e)
{
    // save or process the video frame
}
```

### Platforms

Windows, macOS, Linux, iOS, Android.

## Sphere

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The sphere block applies a sphere geometric transform to the video.

### Block info

Name: SphereBlock.
Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | Uncompressed video | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->SphereBlock;
SphereBlock-->VideoRendererBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var sphereBlock = new SphereBlock();
pipeline.Connect(fileSource.VideoOutput, sphereBlock.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(sphereBlock.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

### Platforms

Windows, macOS, Linux, iOS, Android.

## Square

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The square block distorts the center part of the video into a square.

### Block info

Name: SquareBlock.

Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | Uncompressed video | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->SquareBlock;
SquareBlock-->VideoRendererBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var squareBlock = new SquareBlock(new SquareVideoEffect());
pipeline.Connect(fileSource.VideoOutput, squareBlock.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(squareBlock.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

### Platforms

Windows, macOS, Linux, iOS, Android.

## Stretch

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The stretch block stretches the video in a circle around the center point.
### Block info

Name: StretchBlock.

Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | Uncompressed video | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->StretchBlock;
StretchBlock-->VideoRendererBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var stretchBlock = new StretchBlock();
pipeline.Connect(fileSource.VideoOutput, stretchBlock.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(stretchBlock.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

### Platforms

Windows, macOS, Linux, iOS, Android.

## Text overlay

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The block adds a text overlay on top of the video stream.

### Block info

Name: TextOverlayBlock.

Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | Uncompressed video | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->TextOverlayBlock;
TextOverlayBlock-->VideoRendererBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var textOverlay = new TextOverlayBlock(new TextOverlaySettings("Hello world!"));
pipeline.Connect(fileSource.VideoOutput, textOverlay.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(textOverlay.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

### Platforms

Windows, macOS, Linux, iOS, Android.
## Tunnel

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The block applies a light tunnel effect to a video stream.

### Block info

Name: TunnelBlock.

Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | Uncompressed video | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->TunnelBlock;
TunnelBlock-->VideoRendererBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var tunnelBlock = new TunnelBlock();
pipeline.Connect(fileSource.VideoOutput, tunnelBlock.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(tunnelBlock.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

### Platforms

Windows, macOS, Linux, iOS, Android.

## Twirl

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The twirl block twists the video frame from the center out.

### Block info

Name: TwirlBlock.

Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | Uncompressed video | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->TwirlBlock;
TwirlBlock-->VideoRendererBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var twirlBlock = new TwirlBlock();
pipeline.Connect(fileSource.VideoOutput, twirlBlock.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(twirlBlock.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

### Platforms

Windows, macOS, Linux, iOS, Android.
## Video balance

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The block processes the video stream and allows you to change brightness, contrast, hue, and saturation. Use the `VideoBalanceVideoEffect` class to configure the block settings.

### Block info

Name: VideoBalanceBlock.

Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | Uncompressed video | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->VideoBalanceBlock;
VideoBalanceBlock-->VideoRendererBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var videoBalance = new VideoBalanceBlock(new VideoBalanceVideoEffect() { Brightness = 0.25 });
pipeline.Connect(fileSource.VideoOutput, videoBalance.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(videoBalance.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

### Platforms

Windows, macOS, Linux, iOS, Android.

## Video mixer

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The video mixer block has several inputs and one output. The block draws the inputs in the selected order at the selected positions. You can also set the desired level of transparency for each stream.

### Block info

Name: VideoMixerBlock.
Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1 or more
Output | Uncompressed video | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock1-->VideoMixerBlock;
UniversalSourceBlock2-->VideoMixerBlock;
VideoMixerBlock-->VideoRendererBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

// Define source files
var filename1 = "test.mp4"; // Replace with your first video file
var fileSource1 = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename1)));

var filename2 = "test2.mp4"; // Replace with your second video file
var fileSource2 = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename2)));

// Configure VideoMixerSettings with output resolution and frame rate
// For example, 1280x720 resolution at 30 frames per second
var outputWidth = 1280;
var outputHeight = 720;
var outputFrameRate = new VideoFrameRate(30);
var mixerSettings = new VideoMixerSettings(outputWidth, outputHeight, outputFrameRate);

// Add streams to the mixer
// Stream 1: Main video, occupies the full output frame, Z-order 0 (bottom layer)
mixerSettings.AddStream(new VideoMixerStream(new Rect(0, 0, outputWidth, outputHeight), 0));

// Stream 2: Overlay video, smaller rectangle, positioned at (50,50), Z-order 1 (on top)
// Rectangle: left=50, top=50, width=320, height=180
mixerSettings.AddStream(new VideoMixerStream(new Rect(50, 50, 320, 180), 1));

// Create the VideoMixerBlock
var videoMixer = new VideoMixerBlock(mixerSettings);

// Connect source outputs to VideoMixerBlock inputs
pipeline.Connect(fileSource1.VideoOutput, videoMixer.Inputs[0]);
pipeline.Connect(fileSource2.VideoOutput, videoMixer.Inputs[1]);

// Create a VideoRendererBlock to display the mixed video
// VideoView1 is a placeholder for your UI element (e.g., a WPF control)
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(videoMixer.Output, videoRenderer.Input);
// Start the pipeline
await pipeline.StartAsync();
```

### Platforms

Windows, macOS, Linux, iOS, Android.

### Video Mixer Types and Configuration

The Media Blocks SDK offers several types of video mixers, allowing you to choose the best fit for your application's performance needs and target platform capabilities. These include CPU-based, Direct3D 11, and OpenGL mixers.

All mixer settings classes inherit from `VideoMixerBaseSettings`, which defines common properties like output resolution (`Width`, `Height`), `FrameRate`, and the list of `Streams` to be mixed.

#### 1. CPU-based Video Mixer (VideoMixerSettings)

This is the default video mixer and relies on CPU processing for mixing video streams. It is platform-agnostic and a good general-purpose option.

To use the CPU-based mixer, you instantiate `VideoMixerSettings`:

```csharp
// Output resolution 1920x1080 at 30 FPS
var outputWidth = 1920;
var outputHeight = 1080;
var outputFrameRate = new VideoFrameRate(30);

var mixerSettings = new VideoMixerSettings(outputWidth, outputHeight, outputFrameRate);

// Add streams (see example in the main Video Mixer section)
// mixerSettings.AddStream(new VideoMixerStream(new Rect(0, 0, outputWidth, outputHeight), 0));
// ...

var videoMixer = new VideoMixerBlock(mixerSettings);
```

#### 2. Direct3D 11 Video Compositor (D3D11VideoCompositorSettings)

For Windows applications, the `D3D11VideoCompositorSettings` provides hardware-accelerated video mixing using Direct3D 11. This can offer significant performance improvements, especially with high-resolution video or a large number of streams.
```csharp
// Output resolution 1920x1080 at 30 FPS
var outputWidth = 1920;
var outputHeight = 1080;
var outputFrameRate = new VideoFrameRate(30);

// Optionally, specify the graphics adapter index (-1 for default)
var adapterIndex = -1;

var d3dMixerSettings = new D3D11VideoCompositorSettings(outputWidth, outputHeight, outputFrameRate)
{
    AdapterIndex = adapterIndex
};

// Streams are added similarly to VideoMixerSettings
// d3dMixerSettings.AddStream(new VideoMixerStream(new Rect(0, 0, outputWidth, outputHeight), 0));

// For more advanced control, you can use D3D11VideoCompositorStream to specify blend states
// d3dMixerSettings.AddStream(new D3D11VideoCompositorStream(new Rect(50, 50, 320, 180), 1)
// {
//     BlendSourceRGB = D3D11CompositorBlend.SourceAlpha,
//     BlendDestRGB = D3D11CompositorBlend.InverseSourceAlpha
// });
// ...

var videoMixer = new VideoMixerBlock(d3dMixerSettings);
```

The `D3D11VideoCompositorStream` class, which inherits from `VideoMixerStream`, allows for fine-grained control over D3D11 blend states if needed.

#### 3. OpenGL Video Mixer (GLVideoMixerSettings)

The `GLVideoMixerSettings` enables hardware-accelerated video mixing using OpenGL. This is a cross-platform solution for leveraging GPU capabilities on Windows, macOS, and Linux.
```csharp
// Output resolution 1920x1080 at 30 FPS
var outputWidth = 1920;
var outputHeight = 1080;
var outputFrameRate = new VideoFrameRate(30);

var glMixerSettings = new GLVideoMixerSettings(outputWidth, outputHeight, outputFrameRate);

// Streams are added similarly to VideoMixerSettings
// glMixerSettings.AddStream(new VideoMixerStream(new Rect(0, 0, outputWidth, outputHeight), 0));

// For more advanced control, you can use GLVideoMixerStream to specify blend functions and equations
// glMixerSettings.AddStream(new GLVideoMixerStream(new Rect(50, 50, 320, 180), 1)
// {
//     BlendFunctionSourceRGB = GLVideoMixerBlendFunction.SourceAlpha,
//     BlendFunctionDesctinationRGB = GLVideoMixerBlendFunction.OneMinusSourceAlpha,
//     BlendEquationRGB = GLVideoMixerBlendEquation.Add
// });
// ...

var videoMixer = new VideoMixerBlock(glMixerSettings);
```

The `GLVideoMixerStream` class, inheriting from `VideoMixerStream`, provides properties to control OpenGL-specific blending parameters.

Choosing the appropriate mixer depends on your application's requirements. For simple mixing or maximum compatibility, the CPU-based mixer is suitable. For performance-critical applications on Windows, D3D11 is recommended. For cross-platform GPU acceleration, OpenGL is the preferred choice.

## Water ripple

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The water ripple block creates a water ripple effect on the video stream. Use the `WaterRippleVideoEffect` class to configure.

### Block info

Name: WaterRippleBlock.
Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | Uncompressed video | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->WaterRippleBlock;
WaterRippleBlock-->VideoRendererBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var wrBlock = new WaterRippleBlock(new WaterRippleVideoEffect());
pipeline.Connect(fileSource.VideoOutput, wrBlock.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(wrBlock.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

### Platforms

Windows, macOS, Linux, iOS, Android.

## D3D11 Video Converter

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The D3D11 Video Converter block performs hardware-accelerated video format conversion using Direct3D 11. This is useful for efficient color space or format changes on Windows platforms.

### Block info

Name: D3D11VideoConverterBlock.

Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | Uncompressed video | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->D3D11VideoConverterBlock;
D3D11VideoConverterBlock-->VideoRendererBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var d3d11Converter = new D3D11VideoConverterBlock();
pipeline.Connect(fileSource.VideoOutput, d3d11Converter.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(d3d11Converter.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

### Platforms

Windows (Direct3D 11 required).
## Video Effects (Windows)

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The Video Effects (Windows) block allows you to add, update, and manage multiple video effects in real time. This block is specific to Windows and leverages the Media Foundation pipeline for effects processing.

### Block info

Name: VideoEffectsWinBlock.

Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | Uncompressed video | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->VideoEffectsWinBlock;
VideoEffectsWinBlock-->VideoRendererBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var videoEffects = new VideoEffectsWinBlock();

// Example: add a brightness effect
videoEffects.Video_Effects_Add(new VideoEffectBrightness(true, 0.2));

pipeline.Connect(fileSource.VideoOutput, videoEffects.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(videoEffects.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

### Platforms

Windows.

## D3D11 Video Compositor

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The D3D11 Video Compositor block provides hardware-accelerated video mixing and compositing using Direct3D 11. It is designed for high-performance multi-stream video composition on Windows.

### Block info

Name: D3D11VideoCompositorBlock.
Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1 or more
Output | Uncompressed video | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock1-->D3D11VideoCompositorBlock;
UniversalSourceBlock2-->D3D11VideoCompositorBlock;
D3D11VideoCompositorBlock-->VideoRendererBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename1 = "test.mp4";
var fileSource1 = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename1)));

var filename2 = "test2.mp4";
var fileSource2 = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename2)));

var outputWidth = 1280;
var outputHeight = 720;
var outputFrameRate = new VideoFrameRate(30);

var settings = new D3D11VideoCompositorSettings(outputWidth, outputHeight, outputFrameRate);
settings.AddStream(new D3D11VideoCompositorStream(new Rect(0, 0, outputWidth, outputHeight), 0));
settings.AddStream(new D3D11VideoCompositorStream(new Rect(50, 50, 320, 180), 1));

var d3d11Compositor = new D3D11VideoCompositorBlock(settings);
pipeline.Connect(fileSource1.VideoOutput, d3d11Compositor.Inputs[0]);
pipeline.Connect(fileSource2.VideoOutput, d3d11Compositor.Inputs[1]);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(d3d11Compositor.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

### Platforms

Windows (Direct3D 11 required).

## VR360 Processor

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

The VR360 Processor block applies 360-degree equirectangular video effects, suitable for VR content. It uses Direct3D 11 for GPU-accelerated processing and allows real-time adjustment of yaw, pitch, roll, and field of view.

### Block info

Name: VR360ProcessorBlock.
Pin direction | Media type | Pins count
--- | :---: | :---:
Input | Uncompressed video | 1
Output | Uncompressed video | 1

### The sample pipeline

```mermaid
graph LR;
UniversalSourceBlock-->VR360ProcessorBlock;
VR360ProcessorBlock-->VideoRendererBlock;
```

### Sample code

```csharp
var pipeline = new MediaBlocksPipeline();

var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

var vr360Settings = new D3D11VR360RendererSettings
{
    Yaw = 0,
    Pitch = 0,
    Roll = 0,
    FOV = 90
};

var vr360Processor = new VR360ProcessorBlock(vr360Settings);
pipeline.Connect(fileSource.VideoOutput, vr360Processor.Input);

var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);
pipeline.Connect(vr360Processor.Output, videoRenderer.Input);

await pipeline.StartAsync();
```

### Platforms

Windows (Direct3D 11 required).

---END OF PAGE---

# Local File: .\dotnet\mediablocks\VideoRendering\index.md

---
title: Media Streaming Video Renderer Block SDK
description: Display video streams on multiple platforms (Windows, macOS, Linux, iOS, Android) with DirectX, OpenGL, and Metal support using our Video Renderer Block SDK.
sidebar_label: Video Renderer
---

# Video Renderer Block

[!badge size="xl" target="blank" variant="info" text="Media Blocks SDK .Net"](https://www.visioforge.com/media-blocks-sdk-net)

## Overview

The Video Renderer block is an essential component designed for developers who need to display video streams in their applications. This powerful tool enables you to render video content on specific areas of windows or screens across various platforms and UI frameworks.

The block utilizes a platform-specific visual control called `VideoView` which leverages DirectX technology on Windows systems and typically implements OpenGL rendering on other platforms. The SDK fully supports cross-platform development with compatibility for both Avalonia and MAUI UI frameworks.
One of the key advantages of this block is its flexibility - developers can implement multiple video views and renderers to display the same video stream in different locations simultaneously, whether in separate sections of a window or across multiple windows.

## Rendering Technologies

### DirectX Integration

On Windows platforms, the Video Renderer Block seamlessly integrates with DirectX for high-performance hardware-accelerated rendering. This integration provides several benefits:

- **Hardware acceleration**: Utilizes the GPU for efficient video processing and rendering
- **Low-latency playback**: Minimizes delay between frame processing and display
- **Direct3D surface sharing**: Enables efficient memory management and reduced copying of video data
- **Multiple display support**: Handles rendering across various display configurations
- **Support for High DPI**: Ensures crisp rendering on high-resolution displays

The renderer automatically selects the appropriate DirectX version based on your system capabilities, supporting DirectX 11 and DirectX 12 where available.

### OpenGL Implementation

For cross-platform compatibility, the Video Renderer uses OpenGL on Linux and older macOS systems:

- **Consistent rendering API**: Provides a unified approach across different operating systems
- **Shader-based processing**: Enables advanced video effects and color transformations
- **Texture mapping optimization**: Efficiently handles video frame presentation
- **Framebuffer objects support**: Allows for off-screen rendering and complex composition
- **Hardware-accelerated scaling**: Delivers high-quality resizing with minimal performance impact

OpenGL ES variants are utilized on mobile platforms to ensure optimal performance while maintaining compatibility with the core rendering pipeline.
### Metal Framework Support

On newer Apple platforms (macOS, iOS, iPadOS), the Video Renderer can leverage Metal - Apple's modern graphics and compute API:

- **Native Apple integration**: Optimized specifically for Apple hardware
- **Reduced CPU overhead**: Minimizes processing bottlenecks compared to OpenGL
- **Enhanced parallel execution**: Better utilizes multi-core processors
- **Improved memory bandwidth**: More efficient video frame handling
- **Integration with Apple's video toolchain**: Seamless interoperability with AV Foundation and Core Video

The renderer automatically selects Metal when available on Apple platforms, falling back to OpenGL when necessary on older versions.

## Technical Specifications

### Block Information

Name: VideoRendererBlock

| Pin direction | Media type | Pins count |
| --- | :---: | :---: |
| Input video | uncompressed video | one or more |

## Implementation Guide

### Setting Up Your Video View

The Video View component serves as the visual element where your video content will be displayed. It needs to be properly integrated into your application's UI layout.

### Creating a Basic Pipeline

Below is a visual representation of a simple pipeline implementation:

```mermaid
graph LR;
UniversalSourceBlock-->VideoRendererBlock;
```

This diagram illustrates how a source block connects directly to the video renderer to create a functional video playback system.
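As an illustration, in a WPF application the `VideoView` control can be declared in XAML before being passed to `VideoRendererBlock`. The window class name, XML namespace, and assembly below are assumptions and should be matched to the UI package installed in your project:

```xml
<Window x:Class="PlayerDemo.MainWindow"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        xmlns:vf="clr-namespace:VisioForge.Core.UI.WPF;assembly=VisioForge.Core.UI.WPF">
    <Grid>
        <!-- The element named VideoView1 is the surface handed to VideoRendererBlock -->
        <vf:VideoView x:Name="VideoView1" />
    </Grid>
</Window>
```

Avalonia and MAUI projects follow the same pattern with their respective `VideoView` controls.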
### Code Implementation Example

The following sample demonstrates how to implement a basic video rendering pipeline:

```csharp
// Create a pipeline
var pipeline = new MediaBlocksPipeline();

// Create a source block
var filename = "test.mp4";
var fileSource = new UniversalSourceBlock(await UniversalSourceSettings.CreateAsync(new Uri(filename)));

// Create a video renderer block
var videoRenderer = new VideoRendererBlock(pipeline, VideoView1);

// Connect the blocks
pipeline.Connect(fileSource.VideoOutput, videoRenderer.Input);

// Start the pipeline
await pipeline.StartAsync();
```

## Platform Compatibility

The Video Renderer block offers wide compatibility across multiple operating systems and devices:

- Windows
- macOS
- Linux
- iOS
- Android

This makes it an ideal solution for developers building cross-platform applications that require consistent video rendering capabilities.

---END OF PAGE---

# Local File: .\dotnet\mediaplayer\deployment.md

---
title: Media Player SDK .Net Deployment Guide
description: Step-by-step deployment instructions for Media Player SDK .Net applications. Learn how to deploy using NuGet packages, silent installers, and manual configuration. Includes runtime dependencies, DirectShow filters, and environment setup for Windows and cross-platform development.
sidebar_label: Deployment Guide
---

# Media Player SDK .Net Deployment Guide

[!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net)

This comprehensive guide covers all deployment scenarios for the Media Player SDK .Net, ensuring your applications work correctly across different environments. Whether you're developing cross-platform applications or Windows-specific solutions, this guide provides the necessary steps for successful deployment.
## Engine Types Overview The Media Player SDK .Net offers two primary engine types, each designed for specific deployment scenarios: ### MediaPlayerCoreX Engine (Cross-Platform) MediaPlayerCoreX is our cross-platform solution that works across multiple operating systems. For detailed deployment instructions specific to this engine, refer to the main [Cross-Platform Deployment Guide](../deployment-x/index.md). ### MediaPlayerCore Engine (Windows-Only) The MediaPlayerCore engine is optimized specifically for Windows environments. When deploying applications that use this engine on computers without the SDK pre-installed, you must include the necessary SDK components with your application. > **Important**: For AnyCPU applications, you should deploy both x86 and x64 redistributables to ensure compatibility across different system architectures. ## Deployment Options There are three primary methods for deploying the Media Player SDK .Net components: 1. Using NuGet packages (recommended for most scenarios) 2. Using automatic silent installers (requires administrative privileges) 3. Manual installation (for complete control over the deployment process) ## NuGet Package Deployment NuGet packages provide the simplest deployment method, automatically handling the inclusion of necessary files in your application folder during the build process. 
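The packages listed below are added as ordinary NuGet references. A minimal project-file sketch (the x64 variants are shown; the `Version="*"` wildcard is a placeholder — pin the current versions from nuget.org in production):

```xml
<ItemGroup>
  <!-- Core packages (always required) -->
  <PackageReference Include="VisioForge.DotNet.Core.Redist.Base.x64" Version="*" />
  <PackageReference Include="VisioForge.DotNet.Core.Redist.MediaPlayer.x64" Version="*" />
  <!-- Optional: FFMPEG source mode for file playback -->
  <PackageReference Include="VisioForge.DotNet.Core.Redist.FFMPEG.x64" Version="*" />
</ItemGroup>
```

For AnyCPU applications, reference both the x86 and x64 variants so the correct native components are available on either architecture.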
### Required NuGet Packages #### Core Packages (Always Required) * **SDK Base Package**: * [x86 Version](http://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.Base.x86/) * [x64 Version](http://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.Base.x64/) * **Media Player SDK Package**: * [x86 Version](http://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.MediaPlayer.x86/) * [x64 Version](http://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.MediaPlayer.x64/) #### Feature-Specific Packages (Add as Needed) ##### Media Format Support * **FFMPEG Package** (for file playback using FFMPEG source mode): * [x86 Version](http://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.FFMPEG.x86/) * [x64 Version](http://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.FFMPEG.x64/) * **MP4 Output Package**: * [x86 Version](http://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.MP4.x86/) * [x64 Version](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.MP4.x64/) * **WebM Output Package**: * [x86 Version](https://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.WebM.x86/) ##### Source Support * **VLC Source Package** (for file/IP camera sources): * [x86 Version](http://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.VLC.x86/) * [x64 Version](http://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.VLC.x64/) ##### Audio Format Support * **XIPH Formats Package** (Ogg, Vorbis, FLAC output/source): * [x86 Version](http://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.XIPH.x86/) * [x64 Version](http://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.XIPH.x64/) ##### Filter Support * **LAV Filters Package**: * [x86 Version](http://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.LAV.x86/) * [x64 Version](http://www.nuget.org/packages/VisioForge.DotNet.Core.Redist.LAV.x64/) ## Automatic Silent Installers For scenarios where you prefer installer-based deployment, the SDK offers automatic silent installers that require 
administrative privileges. ### Available Installers #### Core Components * **Base Package** (always required): * [x86 Installer](http://files.visioforge.com/redists_net/redist_dotnet_base_x86.exe) * [x64 Installer](http://files.visioforge.com/redists_net/redist_dotnet_base_x64.exe) #### Media Format Support * **FFMPEG Package** (for file/IP camera sources): * [x86 Installer](http://files.visioforge.com/redists_net/redist_dotnet_ffmpeg_x86.exe) * [x64 Installer](http://files.visioforge.com/redists_net/redist_dotnet_ffmpeg_x64.exe) #### Source Support * **VLC Source Package** (for file/IP camera sources): * [x86 Installer](http://files.visioforge.com/redists_net/redist_dotnet_vlc_x86.exe) * [x64 Installer](http://files.visioforge.com/redists_net/redist_dotnet_vlc_x64.exe) #### Audio Format Support * **XIPH Formats Package** (Ogg, Vorbis, FLAC output/source): * [x86 Installer](http://files.visioforge.com/redists_net/redist_dotnet_xiph_x86.exe) * [x64 Installer](http://files.visioforge.com/redists_net/redist_dotnet_xiph_x64.exe) #### Filter Support * **LAV Filters Package**: * [x86 Installer](http://files.visioforge.com/redists_net/redist_dotnet_lav_x86.exe) * [x64 Installer](http://files.visioforge.com/redists_net/redist_dotnet_lav_x64.exe) > **Note**: To uninstall any installed package, run the executable with administrative privileges using the parameters: `/x //` ## Manual Installation For advanced deployment scenarios requiring precise control over component installation, follow these steps: ### Step 1: Runtime Dependencies * **With Administrative Privileges**: Install the VC++ 2022 (v143) runtime (x86/x64) and OpenMP runtime DLLs using redistributable executables or MSM modules. * **Without Administrative Privileges**: Copy the VC++ 2022 (v143) runtime (x86/x64) and OpenMP runtime DLLs directly to your application folder. 
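When installing the runtimes with Microsoft's redistributable executables, the standard silent switches can be used. A minimal sketch (run from an elevated command prompt; `vc_redist.x86.exe`/`vc_redist.x64.exe` are Microsoft's official VC++ redistributable installers):

```batch
:: Install the VC++ 2022 (v143) runtimes silently (requires admin rights)
vc_redist.x86.exe /install /quiet /norestart
vc_redist.x64.exe /install /quiet /norestart
```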
### Step 2: Core Components * Copy the VisioForge_MFP/VisioForge_MFPX (or x64 versions) DLLs from the Redist\Filters directory to your application folder. ### Step 3: .NET Assemblies * Either copy the .NET assemblies to your application folder or install them to the Global Assembly Cache (GAC). ### Step 4: DirectShow Filters * Copy and COM-register SDK DirectShow filters using [regsvr32.exe](https://support.microsoft.com/en-us/help/249873/how-to-use-the-regsvr32-tool-and-troubleshoot-regsvr32-error-messages) or another suitable method. ### Step 5: Environment Configuration * Add the folder containing the filters to the system PATH environment variable if your application executable is located in a different directory. ## DirectShow Filter Configuration The SDK uses various DirectShow filters for specific functionality. Below is a comprehensive list organized by feature category: ### Basic Feature Filters * VisioForge_Video_Effects_Pro.ax * VisioForge_MP3_Splitter.ax * VisioForge_H264_Decoder.ax * VisioForge_Audio_Mixer.ax ### Audio Effect Filters * VisioForge_Audio_Effects_4.ax (legacy audio effects) ### Streaming Support Filters #### RTSP Streaming * VisioForge_RTSP_Sink.ax * MP4 filters (legacy/modern, excluding muxer) #### SSF Streaming * VisioForge_SSF_Muxer.ax * MP4 filters (legacy/modern, excluding muxer) ### Source Filters #### VLC Source * VisioForge_VLC_Source.ax * Complete Redist\VLC folder with COM registration * VLC_PLUGIN_PATH environment variable pointing to VLC\plugins folder #### FFMPEG Source * VisioForge_FFMPEG_Source.ax * Complete Redist\FFMPEG folder, added to the Windows PATH variable #### Memory Source * VisioForge_AsyncEx.ax #### WebM Decoding * VisioForge_WebM_Ogg_Source.ax * VisioForge_WebM_Source.ax * VisioForge_WebM_Split.ax * VisioForge_WebM_Vorbis_Decoder.ax * VisioForge_WebM_VP8_Decoder.ax * VisioForge_WebM_VP9_Decoder.ax #### Network Streaming Sources * VisioForge_RTSP_Source.ax * VisioForge_RTSP_Source_Live555.ax * FFMPEG, VLC or LAV 
filters #### Audio Format Sources * VisioForge_Xiph_FLAC_Source.ax (FLAC source) * VisioForge_Xiph_Ogg_Demux2.ax (Ogg Vorbis source) * VisioForge_Xiph_Vorbis_Decoder.ax (Ogg Vorbis source) ### Special Feature Filters #### Video Encryption * VisioForge_Encryptor_v8.ax * VisioForge_Encryptor_v9.ax #### GPU Acceleration * VisioForge_DXP.dll / VisioForge_DXP64.dll (DirectX 11 GPU video effects) #### LAV Source * Complete contents of redist\LAV\x86(x64), with all .ax files registered ### Filter Registration Tip To simplify the COM registration process for all DirectShow filters in a directory, place the "reg_special.exe" file from the SDK redist into the filters folder and run it with administrative privileges. --- For more code samples and examples, visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples). ---END OF PAGE--- # Local File: .\dotnet\mediaplayer\index.md --- title: Media Player SDK .Net (MediaPlayerCore) description: SDK usage tutorials for VisioForge Media Player SDK .Net sidebar_label: Media Player SDK .Net order: 13 --- # Media Player SDK .Net [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) Media Player SDK .Net is a video player SDK with a wide range of features. SDK can use several decoding engines to play video and audio files, such as FFMPEG, VLC, and DirectShow. Most of the video and audio formats are supported by the FFMPEG engine. You can play files, network streams, 360-degree videos, and, optionally, DVD and Blu-Ray disks. ## Features - Video and audio playback - Video effects - Audio effects - Text overlays - Image overlays - SVG overlays - Brightness, contrast, saturation, hue, and other video adjustments - Sepia, pixelate, grayscale, and other video filters You can check the full list of features on the [product page](https://www.visioforge.com/media-player-sdk-net). 
## Sample applications

You can use WPF code in WinForms applications and vice versa. Most of the code is the same for all UI frameworks, including Avalonia and MAUI. The main difference is the VideoView control available for each UI framework.

### MediaPlayerCoreX engine (cross-platform)

#### Avalonia

- [Simple Media Player](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK%20X/Avalonia/Simple%20Media%20Player) shows basic playback functionality in Avalonia

#### Android

- [Simple Media Player](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK%20X/Android/MediaPlayer) shows basic playback functionality in Android

### MediaPlayerCore engine (Windows only)

#### WPF

- [Simple Player](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/WPF/CSharp/Simple%20Player%20Demo) shows basic playback functionality
- [Main Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/WPF/CSharp/Main%20Demo) shows all features of the SDK
- [Nvidia Maxine Player](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/WPF/CSharp/Nvidia%20Maxine%20Player) uses the Nvidia Maxine engine
- [Skinned Player](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/WPF/CSharp/Skinned%20Player) shows how to use custom skins
- [madVR Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/WPF/CSharp/madVR%20Demo) uses the madVR video renderer

#### WinForms

- [Audio Player](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/WinForms/CSharp/Audio%20Player) shows how to play audio files
- [DVD Player](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/WinForms/CSharp/DVD%20Player) shows how to play DVDs
- [Encrypted Memory Playback 

Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/WinForms/CSharp/Encrypted%20Memory%20Playback%20Demo) shows how to play an encrypted file from memory
- [Karaoke Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/WinForms/CSharp/Karaoke%20Demo) shows how to play audio karaoke files
- [Main Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/WinForms/CSharp/Main%20Demo) shows all features of the SDK
- [Memory Stream](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/WinForms/CSharp/Memory%20Stream) shows how to play files from memory
- [Multiple Video Streams](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/WinForms/CSharp/Multiple%20Video%20Streams) shows how to play files with multiple video streams
- [Seamless Playback](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/WinForms/CSharp/Seamless%20Playback) shows how to play files without delays
- [Simple Video Player](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/WinForms/CSharp/Simple%20Video%20Player) shows basic playback functionality
- [Two Windows](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/WinForms/CSharp/Two%20Windows) shows how to play files in two windows
- [VR 360 Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/WinForms/CSharp/VR%20360%20Demo) shows how to play 360-degree videos
- [Video Mixing Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/WinForms/CSharp/Video%20Mixing%20Demo) shows how to mix video files
- [YouTube Player](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/WinForms/CSharp/YouTube%20Player%20Demo) shows how to play YouTube videos (with open license)
- [madVR 
Demo](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/WinForms/CSharp/madVR%20Demo) uses madVR to render video

#### WinUI

- [Simple Media Player](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/WinUI/CSharp/Simple%20Media%20Player%20WinUI) shows basic playback functionality

#### Code snippets

- [Memory Playback](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/_CodeSnippets/memory-playback) shows how to play files from memory
- [Read File Info](https://github.com/visioforge/.Net-SDK-s-samples/tree/master/Media%20Player%20SDK/_CodeSnippets/read-file-info) shows how to read file information

## Documentation

- [Code samples](code-samples/index.md)
- [Deployment](deployment.md)
- [API](https://api.visioforge.com/dotnet/api/index.html)

## Links

- [Changelog](../changelog.md)
- [End User License Agreement](../../eula.md)

---END OF PAGE---

# Local File: .\dotnet\mediaplayer\code-samples\get-frame-from-video-file.md

---
title: Extracting Video Frames in .NET - Complete Guide
description: Learn how to extract and capture specific frames from video files using .NET libraries. This tutorial covers multiple approaches with code examples for both Windows-specific and cross-platform solutions for developers working with video processing.
sidebar_label: Extract Video Frames from Files
order: 1
---

# Extracting Video Frames from Video Files in .NET

[!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net)

Video frame extraction is a common requirement in many multimedia applications. Whether you're building a video editing tool, creating thumbnails, or performing video analysis, extracting specific frames from video files is an essential capability. This guide explains different approaches to capturing frames from video files in .NET applications.

## Why Extract Video Frames? 
There are numerous use cases for video frame extraction: - Creating thumbnail images for video galleries - Extracting key frames for video analysis - Generating preview images at specific timestamps - Building video editing tools with frame-by-frame precision - Creating timelapse sequences from video footage - Capturing still images from video recordings ## Understanding Video Frame Extraction Video files contain sequences of frames displayed at specific intervals to create the illusion of motion. When extracting a frame, you're essentially capturing a single image at a specific timestamp within the video. This process involves: 1. Opening the video file 2. Seeking to the specific timestamp 3. Decoding the frame data 4. Converting it to an image format ## Frame Extraction Methods in .NET There are several approaches to extract frames from video files in .NET, depending on your requirements and environment. ### Using Windows-Specific SDK Components For Windows-only applications, the classic SDK components offer straightforward methods for frame extraction: ```csharp // Using VideoEditCore for frame extraction using VisioForge.Core.VideoEdit; public void ExtractFrameWithVideoEditCore() { var videoEdit = new VideoEditCore(); var bitmap = videoEdit.Helpful_GetFrameFromFile("C:\\Videos\\sample.mp4", TimeSpan.FromSeconds(5)); bitmap.Save("C:\\Output\\frame.png"); } // Using MediaPlayerCore for frame extraction using VisioForge.Core.MediaPlayer; public void ExtractFrameWithMediaPlayerCore() { var mediaPlayer = new MediaPlayerCore(); var bitmap = mediaPlayer.Helpful_GetFrameFromFile("C:\\Videos\\sample.mp4", TimeSpan.FromSeconds(10)); bitmap.Save("C:\\Output\\frame.png"); } ``` The `Helpful_GetFrameFromFile` method simplifies the process by handling the file opening, seeking, and frame decoding operations in a single call. ### Cross-Platform Solutions with X-Engine Modern .NET applications often need to run on multiple platforms. 
The X-engine provides cross-platform capabilities for video frame extraction: #### Extracting Frames as System.Drawing.Bitmap The most common approach is to extract frames as `System.Drawing.Bitmap` objects: ```csharp using VisioForge.Core.MediaInfo; public void ExtractFrameAsBitmap() { // Extract the frame at the beginning of the video (TimeSpan.Zero) var bitmap = MediaInfoReaderX.GetFileSnapshotBitmap("C:\\Videos\\sample.mp4", TimeSpan.Zero); // Extract a frame at 30 seconds into the video var frame30sec = MediaInfoReaderX.GetFileSnapshotBitmap("C:\\Videos\\sample.mp4", TimeSpan.FromSeconds(30)); // Save the extracted frame bitmap.Save("C:\\Output\\first-frame.png"); frame30sec.Save("C:\\Output\\frame-30sec.png"); } ``` #### Extracting Frames as SkiaSharp Bitmaps For applications using SkiaSharp for graphics processing, you can extract frames directly as `SKBitmap` objects: ```csharp using VisioForge.Core.MediaInfo; using SkiaSharp; public void ExtractFrameAsSkiaBitmap() { // Extract the frame at 15 seconds into the video var skBitmap = MediaInfoReaderX.GetFileSnapshotSKBitmap("C:\\Videos\\sample.mp4", TimeSpan.FromSeconds(15)); // Work with the SKBitmap using (var image = SKImage.FromBitmap(skBitmap)) using (var data = image.Encode(SKEncodedImageFormat.Png, 100)) using (var stream = File.OpenWrite("C:\\Output\\frame-skia.png")) { data.SaveTo(stream); } } ``` #### Working with Raw RGB Data For more advanced scenarios or when you need direct pixel manipulation, you can extract frames as RGB byte arrays: ```csharp using VisioForge.Core.MediaInfo; public void ExtractFrameAsRGBArray() { // Extract the frame at 20 seconds as RGB byte array var rgbData = MediaInfoReaderX.GetFileSnapshotRGB("C:\\Videos\\sample.mp4", TimeSpan.FromSeconds(20)); // Process the RGB data as needed // The format is typically a byte array with R, G, B values for each pixel // You would also need to know the frame width and height to properly interpret the data } ``` ## Best Practices for Video 
Frame Extraction When implementing video frame extraction in your applications, consider these best practices: ### Performance Considerations - Extracting frames can be CPU-intensive, especially for high-resolution videos - Consider implementing caching mechanisms for frequently accessed frames - For batch extraction, implement parallel processing where appropriate ```csharp // Example of parallel frame extraction public void ExtractMultipleFramesInParallel(string videoPath, TimeSpan[] timestamps) { Parallel.ForEach(timestamps, timestamp => { var bitmap = MediaInfoReaderX.GetFileSnapshotBitmap(videoPath, timestamp); bitmap.Save($"C:\\Output\\frame-{timestamp.TotalSeconds}.png"); }); } ``` ### Error Handling Always implement proper error handling when working with video files: ```csharp public Bitmap SafeExtractFrame(string videoPath, TimeSpan position) { try { return MediaInfoReaderX.GetFileSnapshotBitmap(videoPath, position); } catch (FileNotFoundException) { Console.WriteLine("Video file not found"); } catch (InvalidOperationException) { Console.WriteLine("Invalid position in video"); } catch (Exception ex) { Console.WriteLine($"Error extracting frame: {ex.Message}"); } return null; } ``` ### Memory Management Proper memory management is crucial, especially when working with large video files: ```csharp public void ExtractFrameWithProperDisposal() { Bitmap bitmap = null; try { bitmap = MediaInfoReaderX.GetFileSnapshotBitmap("C:\\Videos\\sample.mp4", TimeSpan.FromSeconds(5)); // Process the bitmap... 
} finally { bitmap?.Dispose(); } } ``` ## Common Applications Frame extraction is used in various multimedia applications: - **Video Players**: Generating preview thumbnails - **Media Libraries**: Creating video thumbnails for gallery views - **Video Analysis**: Extracting frames for computer vision processing - **Content Management**: Creating preview images for video assets - **Video Editing**: Providing visual reference for timeline editing ## Conclusion Extracting frames from video files is a powerful capability for .NET developers working with multimedia content. Whether you're building Windows-specific applications or cross-platform solutions, the methods described in this guide provide efficient ways to capture and work with video frames. By understanding the different approaches and following best practices, you can implement robust frame extraction functionality in your .NET applications. --- For more code samples and examples, visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples). ---END OF PAGE--- # Local File: .\dotnet\mediaplayer\code-samples\index.md --- title: .NET Media Player SDK Code Examples & Tutorials description: Explore our extensive library of .NET Media Player SDK code examples for C# and VB.NET developers. Learn to implement video playback, frame extraction, playlists, and more in WinForms, WPF, Console, and Service applications with detailed tutorials and practical implementations. sidebar_label: Code Examples --- # .NET Media Player SDK Implementation Examples [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) ## Getting Started with Code Examples This resource contains a rich collection of implementation examples for the Media Player SDK .Net, demonstrating the powerful capabilities and diverse functionalities available to developers working with video and audio playback in .NET applications. 
### Multi-Language Support Our examples are meticulously developed in both C# and VB.Net programming languages, showcasing the flexibility and developer-friendly nature of the MediaPlayerCore engine. Each example has been thoughtfully crafted to illustrate real-world scenarios and implementation strategies, enabling developers to quickly master the core concepts needed for effective SDK integration. ### Cross-Platform Application Integration The provided code examples cover an extensive range of application frameworks, including: - **WinForms applications** for traditional desktop interfaces - **WPF applications** for modern UI experiences - **Console applications** for command-line utilities - **Windows Service applications** for background processing Whether you're building feature-rich desktop software, efficient command-line tools, or robust background services, these examples provide valuable guidance throughout your development journey. The examples serve as both learning resources and practical references for troubleshooting and performance optimization in your media applications. ## Featured Implementation Examples ### Video Processing Examples - [How to get a specific frame from a video file?](get-frame-from-video-file.md) - [How to play a fragment of the source file?](play-fragment-file.md) - [How to show the first frame?](show-first-frame.md) ### Advanced Playback Examples - [Memory playback implementation](memory-playback.md) - [Playlist API integration](playlist-api.md) - [Previous frame and reverse video playback](reverse-video-playback.md) --- ## Additional Resources For a more extensive collection of code examples and implementation scenarios, visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples). 
---END OF PAGE--- # Local File: .\dotnet\mediaplayer\code-samples\memory-playback.md --- title: Memory Playback Implementation in .NET Media Player SDK description: Learn how to implement memory-based media playback in C# applications using stream objects, byte arrays, and memory management techniques. This guide provides code examples and best practices for efficient memory handling during audio and video playback. sidebar_label: Memory Playback order: 2 --- # Memory-Based Media Playback in .NET Applications [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) ## Introduction to Memory-Based Media Playback Memory-based playback offers a powerful alternative to traditional file-based media consumption in .NET applications. By loading and processing media directly from memory, developers can achieve more responsive playback, enhanced security through reduced file access, and greater flexibility in handling different data sources. This guide explores the various approaches to implementing memory-based playback in your .NET applications, complete with code examples and best practices. ## Advantages of Memory-Based Media Playback Before diving into implementation details, let's understand why memory-based playback is valuable: - **Improved performance**: By eliminating file I/O operations during playback, your application can deliver smoother media experiences. - **Enhanced security**: Media content doesn't need to be stored as accessible files on the filesystem. - **Stream processing**: Work with data from various sources, including network streams, encrypted content, or dynamically generated media. - **Virtual file systems**: Implement custom media access patterns without filesystem dependencies. - **In-memory transformations**: Apply real-time modifications to media content before playback. 
## Implementation Approaches ### Stream-Based Playback from Existing Files The most straightforward approach to memory-based playback begins with existing media files that you load into memory streams. This technique is ideal when you want the performance benefits of memory playback while still maintaining your content in traditional file formats. ```cs // Create a FileStream from an existing media file var fileStream = new FileStream(mediaFilePath, FileMode.Open); // Convert to a managed IStream for the media player var managedStream = new ManagedIStream(fileStream); // Configure stream settings for your content bool videoPresent = true; bool audioPresent = true; // Set the memory stream as the media source MediaPlayer1.Source_MemoryStream = new MemoryStreamSource( managedStream, videoPresent, audioPresent, fileStream.Length ); // Set the player to memory playback mode MediaPlayer1.Source_Mode = MediaPlayerSourceMode.Memory_DS; // Start playback await MediaPlayer1.PlayAsync(); ``` When using this approach, remember to properly dispose of the FileStream when playback is complete to prevent resource leaks. 
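One way to keep that disposal reliable is to tie the stream's lifetime to a small ownership object. The following sketch uses a hypothetical helper class (`MemoryPlaybackSession` is not part of the SDK); the commented-out lines show where the SDK wiring from the example above would go:

```csharp
using System;
using System.IO;

// Hypothetical helper: owns the FileStream for the whole playback session
// and releases it once playback has finished.
public sealed class MemoryPlaybackSession : IDisposable
{
    private readonly FileStream _fileStream;

    public MemoryPlaybackSession(string mediaFilePath)
    {
        _fileStream = new FileStream(mediaFilePath, FileMode.Open, FileAccess.Read);

        // SDK wiring, as shown in the stream-based example:
        // var managedStream = new ManagedIStream(_fileStream);
        // MediaPlayer1.Source_MemoryStream = new MemoryStreamSource(
        //     managedStream, true, true, _fileStream.Length);
        // MediaPlayer1.Source_Mode = MediaPlayerSourceMode.Memory_DS;
    }

    public long Length => _fileStream.Length;

    // Call only after playback has stopped; the stream must stay open while playing.
    public void Dispose() => _fileStream.Dispose();
}
```

Creating the session before calling `PlayAsync` and disposing it after playback stops guarantees the underlying file handle is released exactly once.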
### Byte Array-Based Playback For scenarios where your media content already exists as a byte array in memory (perhaps downloaded from a network source or decrypted from protected storage), you can play directly from this data structure: ```cs // Assume 'mediaBytes' is a byte array containing your media content byte[] mediaBytes = GetMediaContent(); // Create a MemoryStream from the byte array using (var memoryStream = new MemoryStream(mediaBytes)) { // Convert to a managed IStream var managedStream = new ManagedIStream(memoryStream); // Configure stream settings based on your content bool videoPresent = true; // Set to false for audio-only content bool audioPresent = true; // Set to false for video-only content // Create and assign the memory stream source MediaPlayer1.Source_MemoryStream = new MemoryStreamSource( managedStream, videoPresent, audioPresent, memoryStream.Length ); // Set memory playback mode MediaPlayer1.Source_Mode = MediaPlayerSourceMode.Memory_DS; // Begin playback await MediaPlayer1.PlayAsync(); // Additional playback handling code... } ``` This technique is particularly useful when dealing with content that should never be written to disk for security reasons. ### Advanced: Custom Stream Implementations For more complex scenarios, you can implement custom stream handlers that provide media data from any source you can imagine: ```cs // Example of a custom stream provider public class CustomMediaStreamProvider : Stream { private byte[] _buffer; private long _position; // Constructor might take a custom data source public CustomMediaStreamProvider(IDataSource dataSource) { // Initialize your stream from the data source } // Implement required Stream methods public override int Read(byte[] buffer, int offset, int count) { // Custom implementation to provide data } // Other required Stream overrides // ... 
} // Usage example: var customStream = new CustomMediaStreamProvider(myDataSource); var managedStream = new ManagedIStream(customStream); MediaPlayer1.Source_MemoryStream = new MemoryStreamSource( managedStream, hasVideo, hasAudio, streamLength ); ``` ## Performance Considerations When implementing memory-based playback, keep these performance factors in mind: 1. **Memory allocation**: For large media files, ensure your application has sufficient memory available. 2. **Buffering strategy**: Consider implementing a sliding buffer for very large files rather than loading the entire content into memory. 3. **Garbage collection impact**: Large memory allocations can trigger garbage collection, potentially causing playback stuttering. 4. **Thread synchronization**: If providing stream data from another thread or async source, ensure proper synchronization to prevent playback issues. ## Error Handling Best Practices Robust error handling is critical when implementing memory-based playback: ```cs try { var fileStream = new FileStream(mediaFilePath, FileMode.Open); var managedStream = new ManagedIStream(fileStream); MediaPlayer1.Source_MemoryStream = new MemoryStreamSource( managedStream, true, true, fileStream.Length ); MediaPlayer1.Source_Mode = MediaPlayerSourceMode.Memory_DS; await MediaPlayer1.PlayAsync(); } catch (FileNotFoundException ex) { LogError("Media file not found", ex); DisplayUserFriendlyError("The requested media file could not be found."); } catch (UnauthorizedAccessException ex) { LogError("Access denied to media file", ex); DisplayUserFriendlyError("You don't have permission to access this media file."); } catch (Exception ex) { LogError("Unexpected playback error", ex); DisplayUserFriendlyError("An error occurred during media playback."); } finally { // Ensure resources are properly cleaned up CleanupResources(); } ``` ## Required Dependencies To successfully implement memory-based playback using the Media Player SDK, ensure you have these 
dependencies: - Base redistributable components - SDK redistributable components For more information on installing or deploying these dependencies to your users' systems, refer to our [deployment guide](../deployment.md). ## Advanced Scenarios ### Encrypted Media Playback For applications dealing with protected content, you can integrate decryption into your memory-based playback pipeline: ```cs // Read encrypted content byte[] encryptedContent = File.ReadAllBytes(encryptedMediaPath); // Decrypt the content byte[] decryptedContent = DecryptMedia(encryptedContent, encryptionKey); // Play from decrypted memory without writing to disk using (var memoryStream = new MemoryStream(decryptedContent)) { var managedStream = new ManagedIStream(memoryStream); // Continue with standard memory playback setup... } ``` ### Network Streaming to Memory Pull content from network sources directly into memory for playback: ```cs using (HttpClient client = new HttpClient()) { // Download media content byte[] mediaContent = await client.GetByteArrayAsync(mediaUrl); // Play from memory using (var memoryStream = new MemoryStream(mediaContent)) { // Continue with standard memory playback setup... } } ``` ## Conclusion Memory-based media playback provides a flexible and powerful approach for .NET applications requiring enhanced performance, security, or custom media handling. By understanding the implementation options and following best practices for resource management, you can deliver smooth and responsive media experiences to your users. For more sample code and advanced implementations, visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples). ---END OF PAGE--- # Local File: .\dotnet\mediaplayer\code-samples\play-fragment-file.md --- title: Play Video & Audio File Segments in C# .NET Apps description: Complete guide to implementing precise media fragment playback in your C# applications using .NET Media Player SDK. 
Learn how to create time-based segments in videos and audio files with step-by-step code examples for both Windows and cross-platform applications. sidebar_label: Playing Media File Fragments order: 3 --- # Playing Media File Fragments: Implementation Guide for .NET Developers [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) When developing media applications, one frequently requested feature is the ability to play specific segments of a video or audio file. This functionality is crucial for creating video editors, highlight reels, educational platforms, or any application requiring precise media segment playback. ## Understanding Fragment Playback in .NET Applications Fragment playback allows you to define specific time segments of a media file for playback, effectively creating clips without modifying the source file. This technique is particularly useful when you need to: - Create preview segments from longer media files - Focus on specific sections of instructional videos - Create looping segments for demonstrations or presentations - Build clip-based media players for sports highlights or video compilations - Implement training applications that focus on specific video segments The Media Player SDK .NET provides two primary engines for implementing fragment playback, each with its own approach and platform compatibility considerations. ## Windows-Only Implementation: MediaPlayerCore Engine The MediaPlayerCore engine provides a straightforward implementation for Windows applications. This solution works across WPF, WinForms, and console applications but is limited to Windows operating systems. ### Setting Up Fragment Playback To implement fragment playback with the MediaPlayerCore engine, you'll need to follow three key steps: 1. Activate the selection mode on your MediaPlayer instance 2. Define the starting position of your fragment (in milliseconds) 3. 
Define the ending position of your fragment (in milliseconds) ### Implementation Example The following C# code demonstrates how to configure fragment playback to play only the segment between 2000ms and 5000ms of your source file: ```csharp // Step 1: Enable fragment selection mode MediaPlayer1.Selection_Active = true; // Step 2: Set the starting position to 2000 milliseconds (2 seconds) MediaPlayer1.Selection_Start = TimeSpan.FromMilliseconds(2000); // Step 3: Set the ending position to 5000 milliseconds (5 seconds) MediaPlayer1.Selection_Stop = TimeSpan.FromMilliseconds(5000); // When you call Play() or PlayAsync(), only the specified fragment will play ``` When your application calls the Play or PlayAsync method after setting these properties, the player will automatically jump to the selection start position and stop playback when it reaches the selection end position. ### Required Redistributables for Windows Implementation For the MediaPlayerCore engine implementation to function correctly, you must include: - Base redistributable package - SDK redistributable package These packages contain the necessary components for the Windows-based playback functionality. For detailed information on deploying these redistributables to end-user machines, refer to the [deployment documentation](../deployment.md). ## Cross-Platform Implementation: MediaPlayerCoreX Engine For developers requiring fragment playback functionality across multiple platforms, the MediaPlayerCoreX engine provides a more versatile solution. This implementation works across Windows, macOS, iOS, Android, and Linux environments. ### Setting Up Cross-Platform Fragment Playback The cross-platform implementation follows a similar conceptual approach but uses different property names. The key steps include: 1. Creating a MediaPlayerCoreX instance 2. Loading your media source 3. Defining the segment start and stop positions 4. 
Initiating playback ### Cross-Platform Implementation Example The following example demonstrates how to implement fragment playback in a cross-platform .NET application: ```csharp // Step 1: Create a new instance of MediaPlayerCoreX with your video view MediaPlayerCoreX MediaPlayer1 = new MediaPlayerCoreX(VideoView1); // Step 2: Set the source media file var fileSource = await UniversalSourceSettings.CreateAsync(new Uri("video.mkv")); await MediaPlayer1.OpenAsync(fileSource); // Step 3: Define the segment start time (2 seconds from beginning) MediaPlayer1.Segment_Start = TimeSpan.FromMilliseconds(2000); // Step 4: Define the segment end time (5 seconds from beginning) MediaPlayer1.Segment_Stop = TimeSpan.FromMilliseconds(5000); // Step 5: Start playback of the defined segment await MediaPlayer1.PlayAsync(); ``` This implementation uses the Segment_Start and Segment_Stop properties instead of the Selection properties used in the Windows-only implementation. Also note the asynchronous approach used in the cross-platform example, which improves UI responsiveness. ## Advanced Fragment Playback Techniques ### Dynamic Fragment Adjustment In more complex applications, you might need to adjust fragment boundaries dynamically. 
Both engines support changing the segment boundaries during runtime:

```csharp
// For Windows-only implementation
private void UpdateFragmentBoundaries(int startMs, int endMs)
{
    MediaPlayer1.Selection_Start = TimeSpan.FromMilliseconds(startMs);
    MediaPlayer1.Selection_Stop = TimeSpan.FromMilliseconds(endMs);

    // If playback is in progress, restart it to apply the new boundaries
    if (MediaPlayer1.State == PlaybackState.Playing)
    {
        MediaPlayer1.Position_Set(MediaPlayer1.Selection_Start);
    }
}

// For cross-platform implementation
private async Task UpdateFragmentBoundariesAsync(int startMs, int endMs)
{
    MediaPlayer1.Segment_Start = TimeSpan.FromMilliseconds(startMs);
    MediaPlayer1.Segment_Stop = TimeSpan.FromMilliseconds(endMs);

    // If playback is in progress, restart from the new start position
    if (await MediaPlayer1.StateAsync() == PlaybackState.Playing)
    {
        await MediaPlayer1.Position_SetAsync(MediaPlayer1.Segment_Start);
    }
}
```

### Multiple Fragment Playback

For applications that need to play multiple fragments sequentially, you can implement a fragment queue:

```csharp
public class MediaFragment
{
    public TimeSpan StartTime { get; set; }
    public TimeSpan EndTime { get; set; }
}

private Queue<MediaFragment> fragmentQueue = new Queue<MediaFragment>();
private bool isProcessingQueue = false;

// Subscribe to the completion event once (e.g., in your initialization code),
// not inside PlayNextFragment, so duplicate handlers do not stack up:
// MediaPlayer1.OnStop += (s, e) => PlayNextFragment();

// Add fragments to the queue
public void EnqueueFragment(TimeSpan start, TimeSpan end)
{
    fragmentQueue.Enqueue(new MediaFragment { StartTime = start, EndTime = end });

    if (!isProcessingQueue && MediaPlayer1 != null)
    {
        PlayNextFragment();
    }
}

// Process the fragment queue
private async void PlayNextFragment()
{
    if (fragmentQueue.Count == 0)
    {
        isProcessingQueue = false;
        return;
    }

    isProcessingQueue = true;
    var fragment = fragmentQueue.Dequeue();

    // Set the fragment boundaries
    MediaPlayer1.Segment_Start = fragment.StartTime;
    MediaPlayer1.Segment_Stop = fragment.EndTime;

    // Start playback; the OnStop handler subscribed above advances the queue
    await MediaPlayer1.PlayAsync();
} ``` ### Performance Considerations For optimal performance when using fragment playback, consider the following tips: 1. For frequent seeking between fragments, use formats with good keyframe density 2. MP4 and MOV files generally perform better for fragment-heavy applications 3. Setting fragments at keyframe boundaries improves seeking performance 4. Consider preloading files before setting fragment boundaries 5. On mobile platforms, keep fragments reasonably sized to avoid memory pressure ## Conclusion Implementing fragment playback in your .NET media applications provides substantial flexibility and enhanced user experience. Whether you're developing for Windows only or targeting multiple platforms, the Media Player SDK .NET offers robust solutions for precise media segment playback. By leveraging the techniques demonstrated in this guide, you can create sophisticated media experiences that allow users to focus on exactly the content they need, without the overhead of editing or splitting source files. For more code samples and implementations, visit our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples) where you'll find comprehensive examples of media player implementations, including fragment playback and other advanced media features. ---END OF PAGE--- # Local File: .\dotnet\mediaplayer\code-samples\playlist-api.md --- title: Media Player SDK .Net Playlist API Guide description: Learn how to implement powerful playlist functionality in your .NET applications using our Media Player SDK. Complete code examples for WinForms, WPF, and console applications with step-by-step implementation guide. 
sidebar_label: Playlist API --- # Complete Guide to Playlist API Implementation in .NET [!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) [!badge variant="dark" size="xl" text="MediaPlayerCore"] ## Introduction to Playlist Management The Playlist API provides a powerful and flexible way to manage media content in your .NET applications. Whether you're developing a music player, video application, or any media-centric software, efficient playlist management is essential for delivering a seamless user experience. This guide covers everything you need to know about implementing playlist functionality using the MediaPlayerCore component. You'll learn how to create playlists, navigate between tracks, handle playlist events, and optimize performance in various .NET environments. ## Key Features and Benefits - **Simple Integration**: Easy-to-implement API that integrates seamlessly with existing .NET applications - **Format Compatibility**: Support for a wide range of audio and video formats - **Cross-Platform**: Works consistently across WinForms, WPF, and console applications - **Performance Optimized**: Built for efficient memory usage and responsive playback - **Event-Driven Architecture**: Rich event system for building reactive UI experiences ## Getting Started with Playlist API Before diving into specific methods, ensure you have properly initialized the MediaPlayer component in your application. The following sections contain practical code examples that you can implement directly in your project. ### Creating Your First Playlist Creating a playlist is the first step in managing multiple media files. 
The API provides straightforward methods to add files to your playlist collection: ```csharp // Initialize the media player (assuming you've added the component to your form) // this.mediaPlayer1 = new MediaPlayer(); // Add individual files to the playlist this.mediaPlayer1.Playlist_Add(@"c:\media\intro.mp4"); this.mediaPlayer1.Playlist_Add(@"c:\media\main_content.mp4"); this.mediaPlayer1.Playlist_Add(@"c:\media\conclusion.mp4"); // Start playback from the first item this.mediaPlayer1.Play(); ``` This approach allows you to build playlists programmatically, which is ideal for applications where playlist content is determined at runtime. ## Core Playlist Operations ### Navigating Through Playlist Items Once you've created a playlist, your users will need to navigate between items. The API provides intuitive methods for moving to the next or previous file: ```csharp // Play the next file in the playlist this.mediaPlayer1.Playlist_PlayNext(); // Play the previous file in the playlist this.mediaPlayer1.Playlist_PlayPrevious(); ``` These methods automatically handle the transition between media files, including stopping the current playback and starting the new item. ### Managing Playlist Content During application runtime, you may need to modify the playlist by removing specific items or clearing it entirely: ```csharp // Remove a specific file from the playlist this.mediaPlayer1.Playlist_Remove(@"c:\media\intro.mp4"); // Clear all items from the playlist this.mediaPlayer1.Playlist_Clear(); ``` This dynamic content management allows your application to adapt to user preferences or changing requirements on the fly. 
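When the playlist is edited at runtime, any UI list showing its contents should be refreshed from the player's state. The sketch below assumes a WinForms `ListBox` named `playlistListBox` (a hypothetical control name) and uses the `Playlist_GetCount` and `Playlist_GetFilename` methods from this API:

```csharp
// Rebuild a hypothetical ListBox named playlistListBox after the
// playlist has been modified (e.g., after Playlist_Remove or Playlist_Clear).
private void RefreshPlaylistView()
{
    playlistListBox.Items.Clear();

    for (int i = 0; i < this.mediaPlayer1.Playlist_GetCount(); i++)
    {
        // Show only the file name, not the full path
        playlistListBox.Items.Add(
            System.IO.Path.GetFileName(this.mediaPlayer1.Playlist_GetFilename(i)));
    }
}
```

Calling `RefreshPlaylistView()` after any `Playlist_Add`, `Playlist_Remove`, or `Playlist_Clear` call keeps the UI from drifting out of sync with the engine's playlist.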
### Retrieving Playlist Information Accessing information about the current state of the playlist is crucial for building an informative user interface: ```csharp // Get the current file's index (0-based) int currentIndex = this.mediaPlayer1.Playlist_GetPosition(); // Get the total number of files in the playlist int totalFiles = this.mediaPlayer1.Playlist_GetCount(); // Get a specific filename by its index string fileName = this.mediaPlayer1.Playlist_GetFilename(1); // Gets the second file // Display current playback information string statusMessage = $"Playing file {currentIndex + 1} of {totalFiles}: {fileName}"; ``` These methods enable you to create dynamic interfaces that reflect the current state of media playback. ## Advanced Playlist Control ### Resetting and Repositioning For more precise control over playlist navigation, you can reset the playlist or jump to a specific position: ```csharp // Reset the playlist to start from the first file this.mediaPlayer1.Playlist_Reset(); // Jump to a specific position in the playlist (0-based index) this.mediaPlayer1.Playlist_SetPosition(2); // Jump to the third item ``` These methods are particularly useful for implementing features like "restart playlist" or allowing users to select specific items from a playlist view. ### Custom Event Handling for Playlist Navigation To create a responsive application, you'll want to implement custom event handling for playlist navigation. 
Since the MediaPlayerCore doesn't have a dedicated playlist item changed event, you can create your own tracking mechanism using the existing events: ```csharp private int _lastPlaylistIndex = -1; // Track playlist position changes when playback starts private void mediaPlayer1_OnStart(object sender, EventArgs e) { int currentIndex = this.mediaPlayer1.Playlist_GetPosition(); if (currentIndex != _lastPlaylistIndex) { _lastPlaylistIndex = currentIndex; // Handle playlist item change string currentFile = this.mediaPlayer1.Playlist_GetFilename(currentIndex); UpdatePlaylistUI(currentIndex, currentFile); } } // Also track when a new file playback starts private void mediaPlayer1_OnNewFilePlaybackStarted(object sender, NewFilePlaybackEventArgs e) { int currentIndex = this.mediaPlayer1.Playlist_GetPosition(); _lastPlaylistIndex = currentIndex; // Handle playlist item change string currentFile = this.mediaPlayer1.Playlist_GetFilename(currentIndex); UpdatePlaylistUI(currentIndex, currentFile); } // Handle playlist completion private void mediaPlayer1_OnPlaylistFinished(object sender, EventArgs e) { // Handle playlist completion this.lblPlaybackStatus.Text = "Playlist finished"; // Optionally reset or loop playlist // this.mediaPlayer1.Playlist_Reset(); // this.mediaPlayer1.Play(); } private void UpdatePlaylistUI(int index, string filename) { // Update UI elements with new information this.lblCurrentTrack.Text = $"Now playing: {Path.GetFileName(filename)}"; this.lblTrackNumber.Text = $"Track {index + 1} of {this.mediaPlayer1.Playlist_GetCount()}"; // Update playlist selection in UI // ... 
} ``` This approach allows you to detect and respond to playlist navigation events in your application by subscribing to the actual events provided by MediaPlayerCore: ```csharp // Subscribe to events this.mediaPlayer1.OnStart += mediaPlayer1_OnStart; this.mediaPlayer1.OnNewFilePlaybackStarted += mediaPlayer1_OnNewFilePlaybackStarted; this.mediaPlayer1.OnPlaylistFinished += mediaPlayer1_OnPlaylistFinished; ``` ### Async Playlist Operations The MediaPlayerCore provides async versions of playlist navigation methods for improved responsiveness: ```csharp // Play the next file asynchronously await this.mediaPlayer1.Playlist_PlayNextAsync(); // Play the previous file asynchronously await this.mediaPlayer1.Playlist_PlayPreviousAsync(); ``` Using these async methods is recommended for UI applications to prevent blocking the main thread during playback transitions. ## Implementation Patterns and Best Practices ### Implementing Repeat and Shuffle Modes Most media players include repeat and shuffle functionality. 
Here's how to implement these common features:

```csharp
private bool repeatEnabled = false;
private bool shuffleEnabled = false;
private Random random = new Random();

// Handle playlist navigation when media playback stops
private void MediaPlayer1_OnStop(object sender, StopEventArgs e)
{
    // Check if this is the end of media (not a manual stop)
    if (e.Reason == StopReason.EndOfMedia)
    {
        if (repeatEnabled)
        {
            // Just replay the current item
            this.mediaPlayer1.Play();
        }
        else if (shuffleEnabled)
        {
            // Play a random item
            int totalFiles = this.mediaPlayer1.Playlist_GetCount();
            int randomIndex = random.Next(0, totalFiles);
            this.mediaPlayer1.Playlist_SetPosition(randomIndex);
            this.mediaPlayer1.Play();
        }
        else
        {
            // Standard behavior: play next if available
            if (this.mediaPlayer1.Playlist_GetPosition() < this.mediaPlayer1.Playlist_GetCount() - 1)
            {
                this.mediaPlayer1.Playlist_PlayNext();
            }
            else
            {
                // We've reached the end of the playlist
                // OnPlaylistFinished will be triggered
            }
        }
    }
}

// Subscribe to the stop event
this.mediaPlayer1.OnStop += MediaPlayer1_OnStop;
```

### Memory Management for Large Playlists

When dealing with large playlists, consider implementing lazy loading techniques:

```csharp
// Store playlist information separately for large playlists
private List<string> masterPlaylist = new List<string>();

public void LoadLargePlaylist(string[] filePaths)
{
    // Clear existing playlist
    this.mediaPlayer1.Playlist_Clear();
    masterPlaylist.Clear();

    // Store all paths
    masterPlaylist.AddRange(filePaths);

    // Load only the first batch of items (e.g., 100)
    int initialBatchSize = Math.Min(100, filePaths.Length);
    for (int i = 0; i < initialBatchSize; i++)
    {
        this.mediaPlayer1.Playlist_Add(filePaths[i]);
    }

    // Start playback
    this.mediaPlayer1.Play();
}

// Implement dynamic loading as user approaches the end of loaded items
private void CheckAndLoadMoreItems()
{
    int currentPosition = this.mediaPlayer1.Playlist_GetPosition();
    int loadedCount = this.mediaPlayer1.Playlist_GetCount();

    // If we're near the end of loaded items but have more in master playlist
    if (currentPosition > loadedCount - 10 && loadedCount < masterPlaylist.Count)
    {
        // Load next batch
        int nextBatchSize = Math.Min(50, masterPlaylist.Count - loadedCount);
        for (int i = 0; i < nextBatchSize; i++)
        {
            this.mediaPlayer1.Playlist_Add(masterPlaylist[loadedCount + i]);
        }
    }
}
```

## Cross-Platform Considerations

The Playlist API functions consistently across different .NET environments, but there are some platform-specific considerations:

### WPF Implementation Notes

When implementing in WPF applications, you'll typically use data binding with your playlist:

```csharp
// Simple item model for data binding
public class PlaylistItem
{
    public int Index { get; set; }
    public string FileName { get; set; }
    public string FullPath { get; set; }
}

// Create an observable collection to bind to UI
private ObservableCollection<PlaylistItem> observablePlaylist = new ObservableCollection<PlaylistItem>();

// Sync the observable collection with the player's playlist
private void SyncObservablePlaylist()
{
    observablePlaylist.Clear();

    for (int i = 0; i < this.mediaPlayer1.Playlist_GetCount(); i++)
    {
        string filename = this.mediaPlayer1.Playlist_GetFilename(i);
        observablePlaylist.Add(new PlaylistItem
        {
            Index = i,
            FileName = System.IO.Path.GetFileName(filename),
            FullPath = filename
        });
    }
}
```

## Conclusion

The Playlist API provides a robust foundation for building feature-rich media applications in .NET. By using the methods and patterns outlined in this guide, you can create intuitive playlist management systems that enhance the user experience of your application.

For more advanced scenarios, explore the additional capabilities of the MediaPlayerCore component, including custom event handling, media metadata extraction, and format-specific optimizations.
---END OF PAGE--- # Local File: .\dotnet\mediaplayer\code-samples\reverse-video-playback.md --- title: Reverse Video Playback for .NET Applications description: Master reverse video playback in .NET applications with detailed C# code examples, frame-by-frame navigation techniques, performance optimization tips, and platform-specific implementations for both cross-platform and Windows-specific scenarios. sidebar_label: Reverse Video Playback order: 4 --- # Implementing Reverse Video Playback in .NET Applications Playing video in reverse is a powerful feature for media applications, allowing users to review content, create unique visual effects, or enhance the user experience with non-linear playback options. This guide provides complete implementations for reverse playback in .NET applications, focusing on both cross-platform and Windows-specific solutions. ## Understanding Reverse Playback Mechanisms Reverse video playback can be achieved through several techniques, each with distinct advantages depending on your application's requirements: 1. **Rate-based reverse playback** - Setting a negative playback rate to reverse the video stream 2. **Frame-by-frame backward navigation** - Manually stepping backward through cached video frames 3. **Buffer-based approaches** - Creating a frame buffer to enable smooth reverse navigation Let's explore how to implement each approach using the Media Player SDK for .NET. ## Cross-Platform Reverse Playback with MediaPlayerCoreX The MediaPlayerCoreX engine provides cross-platform support for reverse video playback with a straightforward implementation. This approach works across Windows, macOS, and other supported platforms. 
### Basic Implementation The simplest method for reverse playback involves setting a negative rate value: ```cs // Create new instance of MediaPlayerCoreX MediaPlayerCoreX MediaPlayer1 = new MediaPlayerCoreX(VideoView1); // Set the source file var fileSource = await UniversalSourceSettings.CreateAsync(new Uri("video.mp4")); await MediaPlayer1.OpenAsync(fileSource); // Start normal playback first await MediaPlayer1.PlayAsync(); // Change to reverse playback with normal speed MediaPlayer1.Rate_Set(-1.0); ``` ### Controlling Reverse Playback Speed You can control the reverse playback speed by adjusting the negative rate value: ```cs // Double-speed reverse playback MediaPlayer1.Rate_Set(-2.0); // Half-speed reverse playback (slow motion in reverse) MediaPlayer1.Rate_Set(-0.5); // Quarter-speed reverse playback (very slow motion in reverse) MediaPlayer1.Rate_Set(-0.25); ``` ### Event Handling During Reverse Playback When implementing reverse playback, you may need to handle events differently: ```cs // Subscribe to position change events MediaPlayer1.PositionChanged += (sender, e) => { // Update UI with current position TimeSpan currentPosition = MediaPlayer1.Position_Get(); UpdatePositionUI(currentPosition); }; // Handle reaching the beginning of the video MediaPlayer1.ReachedStart += (sender, e) => { // Stop playback or switch to forward playback MediaPlayer1.Rate_Set(1.0); // Alternatively: await MediaPlayer1.PauseAsync(); }; ``` ## Windows-Specific Frame-by-Frame Reverse Navigation The MediaPlayerCore engine (Windows-only) provides enhanced frame-by-frame control with its frame caching system, allowing precise backward navigation even with codecs that don't natively support it. 
### Setting Up Frame Caching

Before starting playback, configure the reverse playback cache:

```cs
// Configure reverse playback before starting
MediaPlayer1.ReversePlayback_CacheSize = 100; // Cache 100 frames
MediaPlayer1.ReversePlayback_Enabled = true;  // Enable the feature

// Start playback
await MediaPlayer1.PlayAsync();
```

### Navigating Frame by Frame

With the cache configured, you can navigate to previous frames:

```cs
// Navigate to the previous frame
MediaPlayer1.ReversePlayback_PreviousFrame();

// Navigate backward multiple frames (run this inside an async method)
for (int i = 0; i < 5; i++)
{
    MediaPlayer1.ReversePlayback_PreviousFrame();

    // Optional: add delay between frames for controlled playback
    await Task.Delay(40); // ~25fps equivalent timing
}
```

### Advanced Frame Cache Configuration

For applications with specific memory or performance requirements, you can fine-tune the cache:

```cs
// For high-resolution videos, you might need fewer frames to manage memory
MediaPlayer1.ReversePlayback_CacheSize = 50; // Reduce cache size

// For applications that need extensive backward navigation
MediaPlayer1.ReversePlayback_CacheSize = 250; // Increase cache size

// Listen for cache-related events
MediaPlayer1.ReversePlayback_CacheFull += (sender, e) =>
{
    Console.WriteLine("Reverse playback cache is full");
};
```

## Implementing UI Controls for Reverse Playback

A complete reverse playback implementation typically includes dedicated UI controls:

```cs
// Button click handler for reverse playback
private async void ReversePlaybackButton_Click(object sender, EventArgs e)
{
    if (MediaPlayer1.State == MediaPlayerState.Playing)
    {
        // Toggle between forward and reverse
        if (MediaPlayer1.Rate_Get() > 0)
        {
            MediaPlayer1.Rate_Set(-1.0);
            UpdateUIForReverseMode(true);
        }
        else
        {
            MediaPlayer1.Rate_Set(1.0);
            UpdateUIForReverseMode(false);
        }
    }
    else
    {
        // Start playback in reverse
        await MediaPlayer1.PlayAsync();
        MediaPlayer1.Rate_Set(-1.0);
        UpdateUIForReverseMode(true);
    }
}

// Button click handler for frame-by-frame backward navigation
// (async void because it awaits PauseAsync)
private async void PreviousFrameButton_Click(object sender, EventArgs e)
{
    // Ensure we're paused first
    if (MediaPlayer1.State == MediaPlayerState.Playing)
    {
        await MediaPlayer1.PauseAsync();
    }

    // Navigate to previous frame
    MediaPlayer1.ReversePlayback_PreviousFrame();
    UpdateFrameCountDisplay();
}
```

## Performance Considerations

Reverse playback can be resource-intensive, especially with high-resolution videos. Consider these optimization techniques:

1. **Limit cache size** for devices with memory constraints
2. **Use hardware acceleration** when available
3. **Monitor performance** during reverse playback with debugging tools
4. **Provide fallback options** for devices that struggle with full-speed reverse playback

```cs
// Example of performance monitoring during reverse playback
private void MonitorPerformance()
{
    // System.Timers.Timer firing once per second
    Timer performanceTimer = new Timer(1000);
    performanceTimer.Elapsed += (s, e) =>
    {
        if (MediaPlayer1.Rate_Get() < 0)
        {
            // Log or display current memory usage, frame rate, etc.
            LogPerformanceMetrics();

            // Adjust settings if needed
            if (IsMemoryUsageHigh())
            {
                MediaPlayer1.ReversePlayback_CacheSize =
                    Math.Max(10, MediaPlayer1.ReversePlayback_CacheSize / 2);
            }
        }
    };
    performanceTimer.Start();
}
```

## Required Dependencies

To ensure proper functionality of reverse playback features, include these dependencies:

- Base redistributable package
- SDK redistributable package

These packages contain the necessary codecs and media processing components to enable smooth reverse playback across different video formats.
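The fallback recommendation above (tip 4) can be sketched as a guarded strategy: prefer rate-based reverse playback, and drop back to frame-by-frame stepping through the reverse playback cache when the device or format struggles. This is an illustrative sketch, not a definitive implementation; `SupportsSmoothReverse` and `userHoldsRewindButton` are hypothetical application-side members:

```cs
// Hypothetical fallback: rate-based reverse where possible,
// frame stepping via the reverse playback cache otherwise.
private async Task PlayInReverseAsync()
{
    if (SupportsSmoothReverse()) // hypothetical capability/performance check
    {
        MediaPlayer1.Rate_Set(-1.0);
    }
    else
    {
        // Pause, then step backward through cached frames
        await MediaPlayer1.PauseAsync();

        while (userHoldsRewindButton) // hypothetical UI state flag
        {
            MediaPlayer1.ReversePlayback_PreviousFrame();
            await Task.Delay(40); // ~25 fps stepping
        }
    }
}
```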
## Additional Resources and Advanced Techniques

For complex media applications requiring advanced reverse playback features, consider exploring:

- Frame extraction and manual rendering for custom effects
- Keyframe analysis for optimized navigation
- Buffering strategies for smoother reverse playback

## Conclusion

Implementing reverse video playback adds significant value to media applications, providing users with enhanced control over content navigation. By following the implementation patterns in this guide, developers can create robust, performant reverse playback experiences in .NET applications.

---

Visit our [GitHub](https://github.com/visioforge/.Net-SDK-s-samples) page for more complete code samples and implementation examples.

---END OF PAGE---

# Local File: .\dotnet\mediaplayer\code-samples\show-first-frame.md

---
title: Display First Frame in Video Files with .NET
description: Learn how to display the first frame of a video file in your .NET applications using the Media Player SDK. Complete C# code examples for WinForms, WPF, and console applications with detailed implementation steps.
sidebar_label: How to Show the First Frame?
---

# Displaying the First Frame of Video Files in .NET Applications

[!badge size="xl" target="blank" variant="info" text="Media Player SDK .Net"](https://www.visioforge.com/media-player-sdk-net) [!badge variant="dark" size="xl" text="MediaPlayerCore"]

## Overview

When developing media applications, it's often necessary to preview video content without playing the entire file. This technique is particularly useful for creating thumbnail galleries, video selection screens, or providing users with a visual preview before committing to watching a video.

## Implementation Guide

The Media Player SDK .NET provides a simple yet powerful way to display the first frame of any video file.
This is achieved through the `Play_PauseAtFirstFrame` property, which when set to `true`, instructs the player to pause immediately after loading the first frame. ### How It Works When the `Play_PauseAtFirstFrame` property is enabled: 1. The player loads the video file 2. Renders the first frame to the video display surface 3. Automatically pauses playback 4. Maintains the first frame on screen until further user action If this property is not enabled (set to `false`), the player will proceed with normal playback after loading. ## Code Implementation ### Basic Example ```cs // create player and configure the file name // ... // set the property to true MediaPlayer1.Play_PauseAtFirstFrame = true; // play the file await MediaPlayer1.PlayAsync(); ``` Resume playback from the first frame: ```cs // resume playback await MediaPlayer1.ResumeAsync(); ``` ## Practical Applications This feature is particularly useful for: - Providing preview capabilities in video editing applications - Generating video poster frames for streaming applications - Implementing "hover to preview" functionality in media browsers ## Required Components To implement this functionality in your application, you'll need: - Base redist package - SDK redist package For more information on distributing these components with your application, see: [How can the required redists be installed or deployed to the user's PC?](../deployment.md) ## Additional Resources Find more code samples and implementation examples in our [GitHub repository](https://github.com/visioforge/.Net-SDK-s-samples). 
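As a concrete illustration of the "hover to preview" scenario mentioned above, the property can be combined with the playlist methods available elsewhere in the MediaPlayerCore API. This is a sketch under the assumption that `MediaPlayer1` is already initialized and `fileName` comes from your UI:

```cs
// Show only the first frame of the selected file as a lightweight preview.
private async Task PreviewFirstFrameAsync(string fileName)
{
    // Load the selected file as the only playlist item
    MediaPlayer1.Playlist_Clear();
    MediaPlayer1.Playlist_Add(fileName);

    // Pause immediately after the first frame is rendered
    MediaPlayer1.Play_PauseAtFirstFrame = true;
    await MediaPlayer1.PlayAsync();
}
```

If the user then decides to watch the file, `ResumeAsync` continues playback from the displayed frame.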
## Technical Considerations When implementing this feature, keep in mind: - First frame display is nearly instantaneous for most video formats - Resource usage is minimal as the player doesn't buffer beyond the first frame - Works with all supported video formats including MP4, MOV, AVI, and more ---END OF PAGE--- # Local File: .\dotnet\mediaplayer\guides\avalonia-player.md # How to Create a Cross-Platform Media Player using Avalonia MVVM and VisioForge SDK This guide will walk you through the process of building a cross-platform media player application using Avalonia UI with the Model-View-ViewModel (MVVM) pattern and the VisioForge Media Player SDK. The application will be capable of playing video files on Windows, macOS, Linux, Android, and iOS. We will be referencing the `SimplePlayerMVVM` example project, which demonstrates the core concepts and implementation details. `[SCREENSHOT: Final application running on multiple platforms]` ## 1. Prerequisites Before you begin, ensure you have the following installed: * .NET SDK (latest version, e.g., .NET 8 or newer) * An IDE such as Visual Studio, JetBrains Rider, or VS Code with C# and Avalonia extensions. * For Android development: * Android SDK * Java Development Kit (JDK) * For iOS development (requires a macOS machine): * Xcode * Necessary provisioning profiles and certificates. * VisioForge .NET SDK (MediaPlayer SDK X). You can obtain this from the VisioForge website. The necessary packages will be added via NuGet. ## 2. Project Setup This section outlines how to set up the solution structure and include the necessary VisioForge SDK packages. ### 2.1. Solution Structure The `SimplePlayerMVVM` solution consists of several projects: * **SimplePlayerMVVM**: A .NET Standard library containing the core application logic, including ViewModels, Views (AXAML), and shared interfaces. This is the main project where most of our application logic resides. * **SimplePlayerMVVM.Android**: The Android-specific head project. 
* **SimplePlayerMVVM.Desktop**: The desktop-specific head project (Windows, macOS, Linux).
* **SimplePlayerMVVM.iOS**: The iOS-specific head project.

`[SCREENSHOT: Solution structure in the IDE]`

### 2.2. Core Project (`SimplePlayerMVVM.csproj`)

The main project, `SimplePlayerMVVM.csproj`, targets multiple platforms. Key package references include:

* `Avalonia`: The core Avalonia UI framework.
* `Avalonia.Themes.Fluent`: Provides a Fluent Design theme.
* `Avalonia.ReactiveUI`: For MVVM support using ReactiveUI.
* `VisioForge.DotNet.MediaBlocks`: Core VisioForge media processing components.
* `VisioForge.DotNet.Core.UI.Avalonia`: VisioForge UI components for Avalonia, including the `VideoView`.

The target framework list depends on the OS the build runs on (element names below are reconstructed; the values are those from the example project):

```xml
<PropertyGroup>
  <Nullable>enable</Nullable>
  <LangVersion>latest</LangVersion>
  <AvaloniaUseCompiledBindingsByDefault>true</AvaloniaUseCompiledBindingsByDefault>

  <!-- Target frameworks per build host OS -->
  <TargetFrameworks Condition="$([MSBuild]::IsOSPlatform('windows'))">net8.0-android;net8.0-ios;net8.0-windows</TargetFrameworks>
  <TargetFrameworks Condition="$([MSBuild]::IsOSPlatform('osx'))">net8.0-android;net8.0-ios;net8.0-macos14.0</TargetFrameworks>
  <TargetFrameworks Condition="$([MSBuild]::IsOSPlatform('linux'))">net8.0-android;net8.0</TargetFrameworks>
</PropertyGroup>
```

This setup allows the core logic to be shared across all target platforms.

### 2.3. Platform-Specific Projects

Each platform head project (`SimplePlayerMVVM.Android.csproj`, `SimplePlayerMVVM.Desktop.csproj`, `SimplePlayerMVVM.iOS.csproj`) includes platform-specific dependencies and configurations.

**Desktop (`SimplePlayerMVVM.Desktop.csproj`):**

* References `Avalonia.Desktop`.
* Includes platform-specific VisioForge native libraries (e.g., `VisioForge.CrossPlatform.Core.Windows.x64`, `VisioForge.CrossPlatform.Core.macOS`).

```xml
<!-- Element names reconstructed; values from the example project -->
<PropertyGroup Condition="$([MSBuild]::IsOSPlatform('windows'))">
  <TargetFramework>net8.0-windows</TargetFramework>
  <OutputType>WinExe</OutputType>
</PropertyGroup>
<PropertyGroup Condition="$([MSBuild]::IsOSPlatform('osx'))">
  <TargetFramework>net8.0-macos14.0</TargetFramework>
  <OutputType>Exe</OutputType>
</PropertyGroup>
<PropertyGroup Condition="$([MSBuild]::IsOSPlatform('linux'))">
  <TargetFramework>net8.0</TargetFramework>
  <OutputType>Exe</OutputType>
</PropertyGroup>
```

**Android (`SimplePlayerMVVM.Android.csproj`):**

* References `Avalonia.Android`.
* Includes Android-specific VisioForge libraries and dependencies like `VisioForge.CrossPlatform.Core.Android`.

```xml
<!-- Element names reconstructed; values from the example project -->
<PropertyGroup>
  <OutputType>Exe</OutputType>
  <TargetFramework>net8.0-android</TargetFramework>
  <SupportedOSPlatformVersion>21</SupportedOSPlatformVersion>
  <Nullable>enable</Nullable>
  <ApplicationId>com.CompanyName.Simple_Player_MVVM</ApplicationId>
  <ApplicationVersion>1</ApplicationVersion>
  <ApplicationDisplayVersion>1.0</ApplicationDisplayVersion>
  <AndroidPackageFormat>apk</AndroidPackageFormat>
  <EmbedAssembliesIntoApk>false</EmbedAssembliesIntoApk>
</PropertyGroup>
```

**iOS (`SimplePlayerMVVM.iOS.csproj`):**

* References `Avalonia.iOS`.
* Includes iOS-specific VisioForge libraries like `VisioForge.CrossPlatform.Core.iOS`.
```xml
<!-- Element names reconstructed; values from the example project -->
<PropertyGroup>
  <OutputType>Exe</OutputType>
  <TargetFramework>net8.0-ios</TargetFramework>
  <SupportedOSPlatformVersion>13.0</SupportedOSPlatformVersion>
  <Nullable>enable</Nullable>
  <RootNamespace>Simple_Player_MVVM.iOS</RootNamespace>
  <ApplicationId>com.visioforge.avaloniaplayer</ApplicationId>
</PropertyGroup>
```

These project files are crucial for managing dependencies and build configurations for each platform.

## 3. Core MVVM Structure

The application follows the MVVM pattern, separating UI (Views) from logic (ViewModels) and data (Models). ReactiveUI is used to facilitate this pattern.

### 3.1. `ViewModelBase.cs`

This abstract class serves as the base for all ViewModels in the application. It inherits from `ReactiveObject`, which is part of ReactiveUI and provides the necessary infrastructure for property change notifications.

```csharp
using ReactiveUI;

namespace Simple_Player_MVVM.ViewModels
{
    public abstract class ViewModelBase : ReactiveObject
    {
    }
}
```

Any ViewModel that needs to notify the UI of property changes should inherit from `ViewModelBase`.

`[SCREENSHOT: ViewModelBase.cs code]`

### 3.2. `ViewLocator.cs`

The `ViewLocator` class is responsible for locating and instantiating Views based on the type of their corresponding ViewModel. It implements Avalonia's `IDataTemplate` interface.

```csharp
using Avalonia.Controls;
using Avalonia.Controls.Templates;
using Simple_Player_MVVM.ViewModels;
using System;

namespace Simple_Player_MVVM
{
    public class ViewLocator : IDataTemplate
    {
        public Control? Build(object? data)
        {
            if (data is null)
                return null;

            var name = data.GetType().FullName!.Replace("ViewModel", "View", StringComparison.Ordinal);
            var type = Type.GetType(name);

            if (type != null)
            {
                return (Control)Activator.CreateInstance(type)!;
            }

            return new TextBlock { Text = "Not Found: " + name };
        }

        public bool Match(object? data)
        {
            return data is ViewModelBase;
        }
    }
}
```

When Avalonia needs to display a ViewModel, the `ViewLocator`'s `Match` method checks whether the data object is a `ViewModelBase`. If it is, the `Build` method finds the corresponding View by replacing "ViewModel" with "View" in the ViewModel's class name and instantiates it.
This convention-based approach simplifies the association between Views and ViewModels.

`[SCREENSHOT: ViewLocator.cs code]`

### 3.3. Application Initialization (`App.axaml` and `App.axaml.cs`)

The `App.axaml` file defines the application-level resources, including the `ViewLocator` as a data template and the theme (FluentTheme).

**`App.axaml`** (reconstructed sketch; the original markup registers the `ViewLocator` data template and the Fluent theme, as described above):

```xml
<Application xmlns="https://github.com/avaloniaui"
             xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
             xmlns:local="using:Simple_Player_MVVM"
             x:Class="Simple_Player_MVVM.App">
    <Application.DataTemplates>
        <local:ViewLocator />
    </Application.DataTemplates>

    <Application.Styles>
        <FluentTheme />
    </Application.Styles>
</Application>
```

**`App.axaml.cs`**:

The `App.axaml.cs` file handles the application's initialization and lifecycle. Key responsibilities in `OnFrameworkInitializationCompleted`:

1. Creates an instance of `MainViewModel`.
2. Sets up the main window or view based on the application lifetime (`IClassicDesktopStyleApplicationLifetime` for desktop, `ISingleViewApplicationLifetime` for mobile/web-like views).
3. Assigns the `MainViewModel` instance as the `DataContext` for the main window/view.
4. Retrieves the `IVideoView` instance from the `MainView` (hosted within `MainWindow` or used directly as `MainView`).
5. Passes the `IVideoView` and the `TopLevel` control (necessary for file dialogs and other top-level interactions) to the `MainViewModel`.
```csharp
using Avalonia;
using Avalonia.Controls;
using Avalonia.Controls.ApplicationLifetimes;
using Avalonia.Markup.Xaml;
using Simple_Player_MVVM.ViewModels;
using Simple_Player_MVVM.Views;
using VisioForge.Core.Types;

namespace Simple_Player_MVVM
{
    public partial class App : Application
    {
        public override void Initialize()
        {
            AvaloniaXamlLoader.Load(this);
        }

        public override void OnFrameworkInitializationCompleted()
        {
            IVideoView videoView = null;
            var model = new MainViewModel();

            if (ApplicationLifetime is IClassicDesktopStyleApplicationLifetime desktop)
            {
                desktop.MainWindow = new MainWindow
                {
                    DataContext = model
                };

                videoView = (desktop.MainWindow as MainWindow).GetVideoView();
                model.VideoViewIntf = videoView;
                model.TopLevel = desktop.MainWindow;
            }
            else if (ApplicationLifetime is ISingleViewApplicationLifetime singleViewPlatform)
            {
                singleViewPlatform.MainView = new MainView
                {
                    DataContext = model
                };

                videoView = (singleViewPlatform.MainView as MainView).GetVideoView();
                model.VideoViewIntf = videoView;
                model.TopLevel = TopLevel.GetTopLevel(singleViewPlatform.MainView);
            }

            base.OnFrameworkInitializationCompleted();
        }
    }
}
```

This setup ensures that the `MainViewModel` has access to the necessary UI components for video playback and interaction, regardless of the platform.

`[SCREENSHOT: App.axaml.cs code focusing on OnFrameworkInitializationCompleted]`

## 4. MainViewModel Implementation (`MainViewModel.cs`)

The `MainViewModel` is central to the media player's functionality. It manages the player's state, handles user interactions, and communicates with the VisioForge `MediaPlayerCoreX` engine.

`[SCREENSHOT: MainViewModel.cs overall structure or class definition]`

Key components of `MainViewModel`:

### 4.1. Properties for UI Binding

The ViewModel exposes several properties that are bound to UI elements in `MainView.axaml`. These properties use ReactiveUI's `RaiseAndSetIfChanged` to notify the UI of changes.
* **`VideoViewIntf` (`IVideoView`):** A reference to the `VideoView` control in the UI, passed from `App.axaml.cs`.
* **`TopLevel` (`TopLevel`):** A reference to the top-level control, used for displaying file dialogs.
* **`Position` (`string?`):** Current playback position (e.g., "00:01:23").
* **`Duration` (`string?`):** Total duration of the media file (e.g., "00:05:00").
* **`Filename` (`string?`, or `Foundation.NSUrl?` on iOS):** The name or path of the currently loaded file.
* **`VolumeValue` (`double?`):** Current volume level (0-100).
* **`PlayPauseText` (`string?`):** Text for the Play/Pause button (e.g., "PLAY" or "PAUSE").
* **`SpeedText` (`string?`):** Text indicating the current playback speed (e.g., "SPEED: 1X").
* **`SeekingValue` (`double?`):** Current value of the seeking slider.
* **`SeekingMaximum` (`double?`):** Maximum value of the seeking slider (corresponds to the media duration in milliseconds).

```csharp
// Example property
private string? _Position = "00:00:00";
public string? Position
{
    get => _Position;
    set => this.RaiseAndSetIfChanged(ref _Position, value);
}

// ... other properties ...
```

### 4.2. Commands for UI Interactions

ReactiveUI `ReactiveCommand` instances are used to handle actions triggered by UI elements (e.g., button clicks, slider value changes).

* **`OpenFileCommand`:** Opens a file dialog to select a media file.
* **`PlayPauseCommand`:** Plays or pauses the media.
* **`StopCommand`:** Stops playback.
* **`SpeedCommand`:** Cycles through playback speeds (1x, 2x, 0.5x).
* **`VolumeValueChangedCommand`:** Updates the player volume when the volume slider changes.
* **`SeekingValueChangedCommand`:** Seeks to a new position when the seeking slider changes.
* **`WindowClosingCommand`:** Handles cleanup when the application window is closing.
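These commands are exposed as plain properties on the ViewModel. A minimal sketch of how such declarations typically look with ReactiveUI is shown below; the exact declarations in the sample project may differ, and the `partial` modifier is used here only so the fragment can stand beside the rest of the `MainViewModel` code. `Unit` is ReactiveUI's stand-in type for "no value".

```csharp
using System.Reactive;
using ReactiveUI;

namespace Simple_Player_MVVM.ViewModels
{
    public partial class MainViewModel : ViewModelBase
    {
        // ReactiveCommand<Unit, Unit> is the usual shape for a parameterless
        // command that produces no result; the instances are assigned in the
        // constructor (see below).
        public ReactiveCommand<Unit, Unit> OpenFileCommand { get; private set; }
        public ReactiveCommand<Unit, Unit> PlayPauseCommand { get; private set; }
        public ReactiveCommand<Unit, Unit> StopCommand { get; private set; }
        public ReactiveCommand<Unit, Unit> SpeedCommand { get; private set; }
        public ReactiveCommand<Unit, Unit> VolumeValueChangedCommand { get; private set; }
        public ReactiveCommand<Unit, Unit> SeekingValueChangedCommand { get; private set; }
        public ReactiveCommand<Unit, Unit> WindowClosingCommand { get; private set; }
    }
}
```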
```csharp
// Constructor - Command initialization
public MainViewModel()
{
    OpenFileCommand = ReactiveCommand.CreateFromTask(OpenFileAsync);
    PlayPauseCommand = ReactiveCommand.CreateFromTask(PlayPauseAsync);
    StopCommand = ReactiveCommand.CreateFromTask(StopAsync);
    // ... other command initializations ...

    // Subscribe to property changes to trigger commands for sliders
    this.WhenAnyValue(x => x.VolumeValue).Subscribe(_ => VolumeValueChangedCommand.Execute().Subscribe());
    this.WhenAnyValue(x => x.SeekingValue).Subscribe(_ => SeekingValueChangedCommand.Execute().Subscribe());

    _tmPosition = new System.Timers.Timer(1000); // Timer for position updates
    _tmPosition.Elapsed += tmPosition_Elapsed;

    VisioForgeX.InitSDK(); // Initialize VisioForge SDK
}
```

Note: `VisioForgeX.InitSDK()` initializes the VisioForge SDK. This should be called once at application startup.

### 4.3. VisioForge `MediaPlayerCoreX` Integration

A private field `_player` of type `MediaPlayerCoreX` holds the instance of the VisioForge media player engine.

```csharp
private MediaPlayerCoreX _player;
```

### 4.4. Engine Creation (`CreateEngineAsync`)

This asynchronous method initializes or re-initializes the `MediaPlayerCoreX` instance.

```csharp
private async Task CreateEngineAsync()
{
    if (_player != null)
    {
        await _player.StopAsync();
        await _player.DisposeAsync();
    }

    _player = new MediaPlayerCoreX(VideoViewIntf); // Pass the Avalonia VideoView
    _player.OnError += _player_OnError; // Subscribe to error events
    _player.Audio_Play = true; // Ensure audio is enabled

    // Create source settings from the filename
    var sourceSettings = await UniversalSourceSettings.CreateAsync(Filename);
    await _player.OpenAsync(sourceSettings);
}
```

Key steps:

1. Disposes of any existing player instance.
2. Creates a new `MediaPlayerCoreX`, passing the `IVideoView` from the UI.
3. Subscribes to the `OnError` event for error handling.
4. Sets `Audio_Play = true` to enable audio playback by default.
5. Uses `UniversalSourceSettings.CreateAsync(Filename)` to create source settings appropriate for the selected file.
6. Opens the media source using `_player.OpenAsync(sourceSettings)`.

`[SCREENSHOT: CreateEngineAsync method code]`

### 4.5. File Opening (`OpenFileAsync`)

This method is responsible for allowing the user to select a media file.

```csharp
private async Task OpenFileAsync()
{
    await StopAllAsync(); // Stop any current playback
    PlayPauseText = "PLAY";

#if __IOS__ && !__MACCATALYST__
    // iOS specific: Use IDocumentPickerService
    var filePicker = Locator.Current.GetService<IDocumentPickerService>();
    var res = await filePicker.PickVideoAsync();
    if (res != null)
    {
        Filename = (Foundation.NSUrl)res;

        var access = IOSHelper.CheckFileAccess(Filename); // Helper to check file access
        if (!access)
        {
            IOSHelper.ShowToast("File access error");
            return;
        }
    }
#else
    // Other platforms: Use Avalonia's StorageProvider
    try
    {
        var files = await TopLevel.StorageProvider.OpenFilePickerAsync(new FilePickerOpenOptions
        {
            Title = "Open video file",
            AllowMultiple = false
        });

        if (files.Count >= 1)
        {
            var file = files[0];
            Filename = file.Path.AbsoluteUri;

#if __ANDROID__
            // Android specific: Convert content URI to file path if necessary
            if (!Filename.StartsWith('/'))
            {
                Filename = global::VisioForge.Core.UI.Android.FileDialogHelper.GetFilePathFromUri(AndroidHelper.GetContext(), file.Path);
            }
#endif
        }
    }
    catch (Exception ex)
    {
        // Handle cancellation or errors
        Debug.WriteLine($"File open error: {ex.Message}");
    }
#endif
}
```

Platform-specific considerations:

* **iOS:** Uses an `IDocumentPickerService` (resolved via `Locator.Current.GetService`) to present the native document picker. `IOSHelper.CheckFileAccess` is used to ensure the app has permission to access the selected file. The filename is stored as an `NSUrl`.
* **Android:** If the path obtained from the file picker is a content URI, `FileDialogHelper.GetFilePathFromUri` (from `VisioForge.Core.UI.Android`) is used to convert it to an actual file path.
This requires an `IAndroidHelper` to get the Android context.

* **Desktop/Other:** Uses `TopLevel.StorageProvider.OpenFilePickerAsync` for the standard Avalonia file dialog.

`[SCREENSHOT: OpenFileAsync method with platform-specific blocks highlighted]`

### 4.6. Playback Controls

* **`PlayPauseAsync`:**
  * If the player is not initialized or is stopped (`PlaybackState.Free`), it calls `CreateEngineAsync` and then `_player.PlayAsync()`.
  * If playing (`PlaybackState.Play`), it calls `_player.PauseAsync()`.
  * If paused (`PlaybackState.Pause`), it calls `_player.ResumeAsync()`.
  * Updates `PlayPauseText` accordingly and starts/stops the `_tmPosition` timer.

```csharp
private async Task PlayPauseAsync()
{
    // ... (null/empty filename check) ...

    if (_player == null || _player.State == PlaybackState.Free)
    {
        await CreateEngineAsync();
        await _player.PlayAsync();

        _tmPosition.Start();
        PlayPauseText = "PAUSE";
    }
    else if (_player.State == PlaybackState.Play)
    {
        await _player.PauseAsync();
        PlayPauseText = "PLAY";
    }
    else if (_player.State == PlaybackState.Pause)
    {
        await _player.ResumeAsync();
        PlayPauseText = "PAUSE";
    }
}
```

* **`StopAsync`:**
  * Calls `StopAllAsync` to stop the player and reset UI elements.
  * Resets `SpeedText` and `PlayPauseText`.

```csharp
private async Task StopAsync()
{
    await StopAllAsync();

    SpeedText = "SPEED: 1X";
    PlayPauseText = "PLAY";
}
```

* **`StopAllAsync` (helper):**
  * Stops the `_tmPosition` timer.
  * Calls `_player.StopAsync()`.
  * Resets `SeekingMaximum` to `null` (so it is recalculated on the next play).

```csharp
private async Task StopAllAsync()
{
    if (_player == null)
        return;

    _tmPosition.Stop();

    await _player.StopAsync();
    await Task.Delay(300); // Small delay to ensure stop completes

    SeekingMaximum = null;
}
```

### 4.7. Playback Speed (`SpeedAsync`)

Cycles through playback rates: 1.0, 2.0, and 0.5.
```csharp
private async Task SpeedAsync()
{
    if (SpeedText == "SPEED: 1X")
    {
        SpeedText = "SPEED: 2X";
        await _player.Rate_SetAsync(2.0);
    }
    else if (SpeedText == "SPEED: 2X")
    {
        SpeedText = "SPEED: 0.5X";
        await _player.Rate_SetAsync(0.5);
    }
    else if (SpeedText == "SPEED: 0.5X")
    {
        SpeedText = "SPEED: 1X";
        await _player.Rate_SetAsync(1.0);
    }
}
```

Uses `_player.Rate_SetAsync(double rate)` to change the playback speed.

### 4.8. Position and Duration Updates (`tmPosition_Elapsed`)

This method is called by the `_tmPosition` timer (typically every second) to update the UI with the current playback position and duration.

```csharp
private async void tmPosition_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
{
    if (_player == null)
        return;

    var pos = await _player.Position_GetAsync();
    var progress = (int)pos.TotalMilliseconds;

    try
    {
        await Dispatcher.UIThread.InvokeAsync(async () =>
        {
            if (_player == null)
                return;

            _isTimerUpdate = true; // Flag to prevent seeking loop

            if (SeekingMaximum == null)
            {
                SeekingMaximum = (int)(await _player.DurationAsync()).TotalMilliseconds;
            }

            SeekingValue = Math.Min(progress, (int)(SeekingMaximum ?? progress));

            Position = $"{pos.ToString(@"hh\:mm\:ss", CultureInfo.InvariantCulture)}";
            Duration = $"{(await _player.DurationAsync()).ToString(@"hh\:mm\:ss", CultureInfo.InvariantCulture)}";

            _isTimerUpdate = false;
        });
    }
    catch (Exception exception)
    {
        System.Diagnostics.Debug.WriteLine(exception);
    }
}
```

Key actions:

1. Retrieves the current position (`_player.Position_GetAsync()`) and duration (`_player.DurationAsync()`).
2. Updates `SeekingMaximum` if it hasn't been set yet (usually after a file is opened).
3. Updates `SeekingValue` with the current progress.
4. Formats and updates the `Position` and `Duration` strings.
5. Uses `Dispatcher.UIThread.InvokeAsync` to ensure UI updates happen on the UI thread.
6. Sets `_isTimerUpdate = true` before updating `SeekingValue` and back to `false` after, so the `OnSeekingValueChanged` handler does not re-seek when the timer moves the slider.

`[SCREENSHOT: tmPosition_Elapsed method]`

### 4.9. Seeking (`OnSeekingValueChanged`)

Called when the `SeekingValue` property changes (i.e., the user moves the seeking slider).

```csharp
private async Task OnSeekingValueChanged()
{
    if (!_isTimerUpdate && _player != null && SeekingValue.HasValue)
    {
        await _player.Position_SetAsync(TimeSpan.FromMilliseconds(SeekingValue.Value));
    }
}
```

If the change did not come from the timer (`!_isTimerUpdate`), it calls `_player.Position_SetAsync()` to seek to the new position.

### 4.10. Volume Control (`OnVolumeValueChanged`)

Called when the `VolumeValue` property changes (i.e., the user moves the volume slider).

```csharp
private void OnVolumeValueChanged()
{
    if (_player != null && VolumeValue.HasValue)
    {
        // Volume for MediaPlayerCoreX is 0.0 to 1.0
        _player.Audio_OutputDevice_Volume = VolumeValue.Value / 100.0;
    }
}
```

Sets `_player.Audio_OutputDevice_Volume`. Note that the ViewModel uses a 0-100 scale for `VolumeValue`, while the player expects 0.0-1.0.

### 4.11. Error Handling (`_player_OnError`)

A simple error handler that logs errors to the debug console.

```csharp
private void _player_OnError(object sender, VisioForge.Core.Types.Events.ErrorsEventArgs e)
{
    Debug.WriteLine(e.Message);
}
```

More sophisticated error handling (e.g., showing a message to the user) could be implemented here.

### 4.12. Resource Cleanup (`OnWindowClosing`)

This method is invoked when the main window is closing. It ensures that VisioForge SDK resources are properly released.
```csharp
private void OnWindowClosing()
{
    if (_player != null)
    {
        _player.OnError -= _player_OnError; // Unsubscribe from events
        _player.Stop(); // Ensure player is stopped (sync version here for quick cleanup)
        _player.Dispose();
        _player = null;
    }

    VisioForgeX.DestroySDK(); // Destroy VisioForge SDK instance
}
```

It stops the player, disposes of it, and, importantly, calls `VisioForgeX.DestroySDK()` to release all SDK resources. This is crucial to prevent memory leaks or issues when the application exits.

This ViewModel orchestrates all the core logic of the media player, from loading files to controlling playback and interacting with the VisioForge SDK.

## 5. User Interface (Views)

The user interface is defined using Avalonia XAML (`.axaml` files).

### 5.1. `MainView.axaml` - The Player Interface

This `UserControl` defines the layout and controls for the media player.

`[SCREENSHOT: MainView.axaml rendered UI design]`

**Key UI Elements:**

* **`avalonia:VideoView`:** This is the VisioForge control responsible for rendering video. It's placed in the main area of the grid and set to stretch.

```xml
<!-- Reconstructed sketch; the exact attributes in the sample may differ.
     The "avalonia" prefix maps to the VisioForge.DotNet.Core.UI.Avalonia package. -->
<avalonia:VideoView Name="videoView"
                    HorizontalAlignment="Stretch"
                    VerticalAlignment="Stretch" />
```

* **Seeking Slider (`Slider Name="slSeeking"`):**
  * `Maximum="{Binding SeekingMaximum}"`: Binds to the `SeekingMaximum` property in `MainViewModel`.
  * `Value="{Binding SeekingValue}"`: Binds two-way to the `SeekingValue` property in `MainViewModel`. Changes made by the user update `SeekingValue`, triggering `OnSeekingValueChanged`; updates from the ViewModel (e.g., by the timer) move the slider's position.
* **Time Display (`TextBlock`s for Position and Duration):**
  * Bound to the `Position` and `Duration` properties in `MainViewModel`.
  * `TextBlock Text="{Binding Filename}"` displays the current file name.
* **Playback Control Buttons (`Button`s):**
  * **Open File:** `Command="{Binding OpenFileCommand}"`
  * **Play/Pause:** `Command="{Binding PlayPauseCommand}"`, `Content="{Binding PlayPauseText}"` (dynamically changes the button text).
  * **Stop:** `Command="{Binding StopCommand}"`
* **Volume and Speed Controls:**
  * **Volume Slider:** `Value="{Binding VolumeValue}"` (binds to `VolumeValue` for volume control).
  * **Speed Button:** `Command="{Binding SpeedCommand}"`, `Content="{Binding SpeedText}"`.

**Layout:**

The view uses a `Grid` to arrange the `VideoView` and a `StackPanel` for the controls at the bottom. The controls themselves are organized using nested `StackPanel`s and `Grid`s for alignment.

```xml