Real-time Camera Streaming

I'm trying to develop an application that provides a real-time video and audio streaming and monitoring service.

The idea is to capture the camera feed on one iPhone, iPad, or iPod and show it in real time on one or more other devices (iPhone, iPad, or iPod).

At the same time, I want to allow audio communication and data transfer between the two (or more) devices.

As far as I know, Bonjour, GameKit, and NSStream allow data transfer between devices over Wi-Fi or Bluetooth.
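
For example, as far as I understand it, advertising and discovering a service over Bonjour only takes a few lines (a sketch; the `_camstream._tcp.` service type is a made-up name for illustration):

    // Advertise a service on the local network via Bonjour.
    // "_camstream._tcp." is a hypothetical service type for this app.
    NSNetService *service = [[NSNetService alloc] initWithDomain:@"local."
                                                            type:@"_camstream._tcp."
                                                            name:@"CameraStreamer"
                                                            port:5555];
    [service publish];

    // A peer would discover it by browsing for the same type:
    NSNetServiceBrowser *browser = [[NSNetServiceBrowser alloc] init];
    [browser searchForServicesOfType:@"_camstream._tcp." inDomain:@"local."];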

So far, I have used AVCaptureSession (from AVFoundation) to capture every frame from the camera and GKSession (from GameKit) to send it to another device, over Bluetooth or Wi-Fi.

Here is the main method that sends the captured camera frames:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
    {
        // Get a CMSampleBuffer's Core Video image buffer for the media data
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

        // Lock the base address of the pixel buffer
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        // Get the base address of the pixel buffer
        void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

        // Get the number of bytes per row for the pixel buffer
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

        // Get the pixel buffer width and height
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);

        // Create a device-dependent RGB color space
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

        // Create a bitmap graphics context with the sample buffer data
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

        // Create a Quartz image from the pixel data in the bitmap graphics context
        CGImageRef quartzImage = CGBitmapContextCreateImage(context);

        // Unlock the pixel buffer (CGBitmapContextCreateImage has already copied the data)
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

        // Free up the context and color space
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);

        // Create an image object from the Quartz image
        UIImage *image = [UIImage imageWithCGImage:quartzImage];

        // 0.0 is the maximum-compression / lowest-quality JPEG setting
        NSData *data = UIImageJPEGRepresentation(image, 0.0);

        // Release the Quartz image
        CGImageRelease(quartzImage);

        // Send image data to peers, using GKSession from GameKit
        [self.gkSession sendDataToAllPeers:data withDataMode:GKSendDataReliable error:nil];
    }

And here is the other end, where I display the received frames in a UIImageView:

    - (void)receiveData:(NSData *)data fromPeer:(NSString *)peer inSession:(GKSession *)session context:(void *)context
    {
        // Receive image data from peers
        self.imageView.image = [UIImage imageWithData:data];
    }

The problem is that this GameKit approach doesn't give me the results I need.

If I transfer the camera feed at a modest frame rate (15 fps), the image quality on the sending side is really poor, and on the receiving side it is even worse.

If I transfer at normal or high image quality, I get only about 0.2 fps on the receiver side (one image every 5 seconds).

I want to stream the camera in real time with the best possible image quality and the highest possible frame rate, so that the stream is smooth and detailed, while at the same time having audio communication and data transfer.
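
To illustrate the trade-off, here is a variant of the send step that shrinks each frame and sends it unreliably (a sketch; the 0.25 scale and 0.5 quality are arbitrary values). It should raise the frame rate, since a lost packet then costs one frame instead of stalling the reliable stream, but only by giving up the image quality I need:

    // Shrink the frame before JPEG-encoding it.
    CGSize small = CGSizeMake(image.size.width * 0.25, image.size.height * 0.25);
    UIGraphicsBeginImageContext(small);
    [image drawInRect:CGRectMake(0, 0, small.width, small.height)];
    UIImage *scaled = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Unreliable mode: a dropped packet loses one frame rather than
    // blocking everything behind a retransmission.
    NSData *payload = UIImageJPEGRepresentation(scaled, 0.5);
    [self.gkSession sendDataToAllPeers:payload
                          withDataMode:GKSendDataUnreliable
                                 error:nil];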

I would like to know the best way to achieve this: do I need to register with a server, or do I need another type of software or service?

To summarize, these are the features I want in the app (on any iPhone, iPad, or iPod):

- Sending OR receiving the camera feed with the best possible quality and performance (face detection will be our main feature).

- Sending AND receiving microphone audio with the best possible quality.

- Sending OR receiving other types of data (e.g. NSStrings, BOOLs, NSArrays, C arrays, loops); see the packing sketch below.
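
For that last point, I assume something like NSKeyedArchiver could pack mixed values into a single NSData payload for GKSession (a sketch; the dictionary keys are made-up examples):

    // Pack mixed values into one NSData blob with NSKeyedArchiver.
    // The keys here ("name", "isRecording", "points") are hypothetical.
    NSDictionary *packet = @{ @"name"        : @"front-camera",
                              @"isRecording" : @YES,           // BOOL boxed as NSNumber
                              @"points"      : @[@1, @2, @3] };
    NSData *blob = [NSKeyedArchiver archivedDataWithRootObject:packet];
    [self.gkSession sendDataToAllPeers:blob withDataMode:GKSendDataReliable error:nil];

    // On the receiving side (in receiveData:fromPeer:inSession:context:),
    // unarchive the payload back into the original objects:
    NSDictionary *received = [NSKeyedUnarchiver unarchiveObjectWithData:data];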

What would be the best way to achieve these features, and which tools should I use?

Thank you very much in advance

Pedro Monteverde
