
Texture rendering issue on iOS using OpenGL ES in a Unity project

I'm working on a project that streams video to my iPhone. Currently I use my laptop to create the video stream with ffmpeg:

```shell
ffmpeg \
  -f avfoundation -i "1" -s 1280x720 -r 29.97 \
  -c:v mpeg2video -q:v 20 -pix_fmt yuv420p -g 1 -threads 4 \
  -f mpegts udp://192.168.1.102:6666
```

With this, I successfully create my video stream. In Unity, I want to decode the stream into a texture. Since I'm new to both ffmpeg and Unity, I went through some tutorials for each and created my native plugin. Some of the code is below (ask me if more is needed).

In my C++ library:

Buffer allocation:

```cpp
uint8_t *buffer;
int buffer_size;

buffer_size = avpicture_get_size(AV_PIX_FMT_RGBA, VIEW_WIDTH, VIEW_HEIGHT);
buffer = (uint8_t *) av_malloc(buffer_size * sizeof(uint8_t));
avpicture_fill((AVPicture *) pFrameRGB, buffer, AV_PIX_FMT_RGBA,
               VIEW_WIDTH, VIEW_HEIGHT);
```

Scaler context:

```cpp
is->sws_ctx = sws_getContext(
    is->video_st->codec->width,
    is->video_st->codec->height,
    is->video_st->codec->pix_fmt,
    VIEW_WIDTH, VIEW_HEIGHT,
    AV_PIX_FMT_RGBA,
    SWS_BILINEAR,
    NULL, NULL, NULL);
```

Scaling:

```cpp
sws_scale(
    is->sws_ctx,
    (uint8_t const * const *) pFrame->data,
    pFrame->linesize,
    0,
    is->video_st->codec->height,
    pFrameRGB->data,
    pFrameRGB->linesize);
```

Texture upload:

```cpp
static void UNITY_INTERFACE_API OnRenderEvent(int texID)
{
    GLuint gltex = (GLuint)(size_t)(texID);
    glBindTexture(GL_TEXTURE_2D, gltex);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, VIEW_WIDTH, VIEW_HEIGHT,
                    GL_RGBA, GL_UNSIGNED_BYTE, pFrameRGB->data[0]);
    glGetError();
}

extern "C" UnityRenderingEvent UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API GetRenderEventFunc()
{
    return OnRenderEvent;
}
```

In Unity (C#), texture creation:

```csharp
private Texture2D texture;
private int texID;

texture = new Texture2D(width, height, TextureFormat.RGBA32, false);
texture.filterMode = FilterMode.Point;
texture.Apply();
GetComponent<Renderer>().material.mainTexture = texture;
texID = texture.GetNativeTexturePtr().ToInt32();
```

Update function:

```csharp
void Update()
{
    GL.IssuePluginEvent(GetRenderEventFunc(), texID);
}
```

Video stream info:

```
Input #0, mpegts, from 'udp://0.0.0.0:6666':
  Duration: N/A, start: 2.534467, bitrate: N/A
  Program 1
    Metadata:
      service_name    : Service01
      service_provider: FFmpeg
    Stream #0:0[0x100]: Video: mpeg2video (Main) ([2][0][0][0] / 0x0002), yuv420p(tv), 1280x720 [SAR 1:1 DAR 16:9], max. 104857 kb/s, 29.97 fps, 29.97 tbr, 90k tbn, 59.94 tbc
```

Other details aside, my library works fine in the Unity editor: the texture updates correctly once the video stream starts. But when I compiled all my libraries for arm64, built the Xcode project that Unity generated, and ran the app on an iPhone 6, no texture was rendered on the phone. I checked my network and I'm sure the data reaches the iPhone; the debug log shows the data is decoded successfully, and the OnRenderEvent function is called. I'm confused and can't find an answer (maybe because I'm a beginner), so I'm asking for your help.

FYI: Unity 5.3.2f1 Personal, Xcode 7.2.1, iOS 9.2.1, ffmpeg 3.0

