Tag: camera

Related posts
  • 2019-2-27 16:36
By Qin Hai, Toradex

1. Introduction

As embedded processors become more capable, embedded devices are used ever more widely in areas such as security and machine vision. This article demonstrates connecting several kinds of cameras to the NXP i.MX6 ARM processor, which supports a variety of camera interfaces: the parallel camera interface, the MIPI CSI-2 serial camera interface, USB, and Ethernet.

The ARM platform used for the demonstration is the Toradex Apalis iMX6Q embedded module, based on the NXP i.MX6Q processor.

2. Preparation

a). Apalis iMX6Q module on an Ixora carrier board. Connect the debug serial port UART1 (carrier X22) to the development host for debugging, and an HDMI monitor to display the camera output. For more on Apalis iMX6 with the Ixora carrier, see the datasheet and the getting-started guide.

b). Cameras used in this test:

./ Toradex OV5640 MIPI CSI-2 camera module, connected to carrier X28
./ Hikvision analog camera, connected to carrier X14 through the Toradex ACA module
./ FHD USB camera, connected to the carrier X8 USB port
./ Hikvision Ezviz CS-C3C-1FR network camera, connected by cable to the carrier X11 Ethernet port

c). The Apalis iMX6Q runs the latest Toradex Linux BSP V2.8b5, which already includes the GStreamer components this test needs; detailed installation instructions are here.

./ For configuring the OV5640 camera under Linux, see here; it is not repeated in this article.
./ For connecting an analog camera through the ACA module, see here; it is not repeated in this article.

3. FHD USB camera test

a). First test the FHD USB camera module on its own. Its sensor is an Aptina MI5100 (note: Aptina has since been acquired by ON Semiconductor; search the ON Semiconductor website for the corresponding part). When connected to the Apalis iMX6 it is bound to the UVC driver automatically; no extra configuration is needed.

b). First use V4L2 commands to list the formats the camera supports. As shown below, it can output MJPG and RAW; for resolutions above 800x600, RAW mode only manages very low frame rates, so the 1080p tests in this article use MJPG.

-----------------------
root@apalis-imx6:~# v4l2-ctl --device=/dev/video3 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
Index: 0
Type: Video Capture
Pixel Format: 'MJPG' (compressed)
Name: Motion-JPEG
    Size: Discrete 1600x1200
        Interval: Discrete 0.067s (15.000 fps)
    Size: Discrete 2592x1944
        Interval: Discrete 0.067s (15.000 fps)
    Size: Discrete 2048x1536
        Interval: Discrete 0.067s (15.000 fps)
    Size: Discrete 1920x1080
        Interval: Discrete 0.033s (30.000 fps)
    Size: Discrete 1280x1024
        Interval: Discrete 0.067s (15.000 fps)
    Size: Discrete 1280x720
        Interval: Discrete 0.033s (30.000 fps)
    Size: Discrete 1024x768
        Interval: Discrete 0.033s (30.000 fps)
    Size: Discrete 800x600
        Interval: Discrete 0.033s (30.000 fps)
    Size: Discrete 640x480
        Interval: Discrete 0.033s (30.000 fps)
    Size: Discrete 1600x1200
        Interval: Discrete 0.067s (15.000 fps)
Index: 1
Type: Video Capture
Pixel Format: 'YUYV'
Name: YUYV 4:2:2
    Size: Discrete 1600x1200
        Interval: Discrete 0.200s (5.000 fps)
    Size: Discrete 2592x1944
        Interval: Discrete 0.333s (3.000 fps)
    Size: Discrete 2048x1536
        Interval: Discrete 0.250s (4.000 fps)
    Size: Discrete 1920x1080
        Interval: Discrete 0.200s (5.000 fps)
    Size: Discrete 1280x1024
        Interval: Discrete 0.111s (9.000 fps)
    Size: Discrete 1280x720
        Interval: Discrete 0.200s (5.000 fps)
    Size: Discrete 1024x768
        Interval: Discrete 0.100s (10.000 fps)
    Size: Discrete 800x600
        Interval: Discrete 0.050s (20.000 fps)
    Size: Discrete 640x480
        Interval: Discrete 0.033s (30.000 fps)
    Size: Discrete 1600x1200
        Interval: Discrete 0.200s (5.000 fps)
-----------------------

c). Test basic camera capture with full-screen display:

-----------------------
// RAW format
$ gst-launch-1.0 imxv4l2src device=/dev/video3 ! 'video/x-raw, framerate=5/1, width=(int)1920, height=(int)1080, format=(string)YUY2' ! imxv4l2sink
CPU load: ~40% x 1 core of 4
// MJPG format
$ gst-launch-1.0 v4l2src device=/dev/video3 ! 'image/jpeg,width=1920,height=1080,framerate=30/1' ! vpudec output-format=4 ! imxipuvideotransform ! imxg2dvideosink sync=false
CPU load: ~50% x 1 core of 4
-----------------------

To keep CPU load as low as possible, prefer the i.MX-specific GStreamer elements, which use the VPU/GPU and other hardware units for acceleration. All i.MX-related elements can be listed with:

$ gst-inspect-1.0 | grep imx

d). Video-surveillance scenarios often need a timestamp overlaid on the captured stream, and the stream saved to a file while it is being displayed. The following tests cover this scenario.

./ The hardware-accelerated imxg2dclockoverlay element is used here to overlay the timestamp.
./ Overlay the timestamp, then just play the camera stream:

-----------------------
$ gst-launch-1.0 v4l2src device=/dev/video3 ! 'image/jpeg,width=1920,height=1080,framerate=30/1' ! vpudec output-format=4 ! imxipuvideotransform ! imxg2dclockoverlay time-format="%Y/%m/%d %H:%M:%S" ! imxg2dvideosink sync=false
CPU load: ~49% x 1 core of 4
-----------------------

./ Overlay the timestamp, then just record the camera stream to a file:

-----------------------
$ gst-launch-1.0 -vvv v4l2src device=/dev/video3 ! 'image/jpeg,width=1920,height=1080,framerate=30/1' ! vpudec output-format=4 ! imxipuvideotransform ! imxg2dclockoverlay time-format="%Y/%m/%d %H:%M:%S" ! imxipuvideotransform ! 'video/x-raw,width=1920,height=1080,framerate=30/1' ! imxvpuenc_h264 bitrate=8000 ! filesink location=test.mp4
CPU load: ~30% x 1 core of 4
-----------------------

./ Overlay the timestamp, then record to a file and play at the same time:

-----------------------
$ gst-launch-1.0 -vvv v4l2src device=/dev/video3 ! 'image/jpeg,width=1920,height=1080,framerate=30/1' ! vpudec output-format=4 ! imxipuvideotransform ! imxg2dclockoverlay time-format="%Y/%m/%d %H:%M:%S" ! tee name=splitter ! queue ! imxipuvideotransform ! 'video/x-raw,width=1920,height=1080,framerate=30/1' ! imxvpuenc_h264 bitrate=8000 ! filesink location=test.mp4 splitter. ! queue ! imxg2dvideosink sync=false
CPU load: ~53% x 1 core of 4
-----------------------

./ The recorded test.mp4 can be played back with the following pipeline:

-----------------------
$ gst-launch-1.0 filesrc location=/home/root/test.mp4 typefind=true ! h264parse ! vpudec ! imxv4l2sink
-----------------------

4. Hikvision network camera test

a). First use the Hikvision Ezviz management software to give the camera a static IP: 10.20.1.99/255.255.255.0.

b). Then configure the Apalis iMX6 eth0 interface with the static IP 10.20.1.98/255.255.255.0 and make sure the camera can be pinged.

c). Capture and play the camera's video stream over RTSP with the pipeline below; the password for admin is printed on each camera's label.

-----------------------
$ gst-launch-1.0 rtspsrc location=rtsp://admin:password@10.20.1.99:554/h264/ch1/main/av_stream latency=10 ! queue ! rtph264depay ! vpudec ! imxg2dvideosink
CPU load: ~18% x 1 core of 4
-----------------------

The captured stream can of course also be saved to a local file; this is not repeated here.

5. Four cameras playing simultaneously

a). Building on the two sections above, we now display four camera streams at the same time. The windows can no longer be full-screen; their size and position must be adjusted accordingly.

b). First, display the Full HD USB camera output in the top-left corner:

-----------------------
$ gst-launch-1.0 v4l2src device=/dev/video3 ! 'image/jpeg,width=1920,height=1080,framerate=30/1' ! vpudec output-format=4 ! imxipuvideotransform ! imxg2dclockoverlay time-format="%Y-%m-%d %H:%M:%S" halignment=2 valignment=1 ! imxg2dvideosink sync=false window-width=960 window-height=480
-----------------------

c). Then display the network camera output in the bottom-left corner:

-----------------------
$ gst-launch-1.0 rtspsrc location=rtsp://admin:XIYFYO@10.20.1.99:554/h264/ch1/main/av_stream latency=10 ! queue ! rtph264depay ! vpudec ! imxg2dvideosink window-width=960 window-height=480 window-y-coord=480
-----------------------

d). Then display the OV5640 MIPI CSI-2 camera output in the top-right corner:

-----------------------
$ gst-launch-1.0 imxv4l2src device=/dev/video2 ! capsfilter caps="video/x-raw, width=1920, height=1080, framerate=30/1" ! imxipuvideotransform ! imxg2dclockoverlay time-format="%Y-%m-%d %H:%M:%S" halignment=2 valignment=1 ! imxg2dvideosink sync=false window-width=960 window-height=480 window-x-coord=960
-----------------------

e). Then display the analog camera output in the bottom-right corner:

-----------------------
$ gst-launch-1.0 -v imxv4l2videosrc ! imxipuvideotransform ! imxg2dclockoverlay time-format="%Y-%m-%d %H:%M:%S" halignment=2 valignment=1 ! imxg2dvideosink window-width=960 window-height=480 window-x-coord=960 window-y-coord=480
-----------------------

f). Finally, run all four camera outputs at the same time.

This test was run with a script that launches the four pipelines above; alternatively, all pipelines can be combined into a single gst-launch invocation, although extra queue elements may then be needed.

-----------------------
CPU load: ~50% x 3 cores of 4
-----------------------

The display result is shown in the figure. Note that besides the considerable CPU load, this test makes heavy use of the VPU, GPU and other hardware units, so good cooling of the Apalis iMX6 is required for stable operation.

6. Summary

As the examples above show, the i.MX6 processor supports a rich set of camera sources. For efficiency and stability, use the i.MX-specific GStreamer elements whenever possible.
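The test script itself is not reproduced in the article. Under the assumption that it simply launches the four pipelines from sections 5.b-5.e in the background, a minimal sketch (the helper names are ours; caps are written without spaces so no shell quoting is needed, and the timestamp overlays are omitted for brevity):

```python
# Hypothetical reconstruction of the four-camera launch script.
import shlex
import subprocess

def window_args(x=0, y=0, w=960, h=480):
    """imxg2dvideosink placement arguments for one quadrant of the screen."""
    args = [f"window-width={w}", f"window-height={h}"]
    if x:
        args.append(f"window-x-coord={x}")
    if y:
        args.append(f"window-y-coord={y}")
    return " ".join(args)

PIPELINES = [
    # top-left: FHD USB camera, MJPG decoded by the VPU
    "gst-launch-1.0 v4l2src device=/dev/video3 "
    "! image/jpeg,width=1920,height=1080,framerate=30/1 "
    "! vpudec output-format=4 ! imxipuvideotransform "
    "! imxg2dvideosink sync=false " + window_args(0, 0),
    # bottom-left: Hikvision network camera over RTSP
    "gst-launch-1.0 rtspsrc location=rtsp://admin:XIYFYO@10.20.1.99:554/h264/ch1/main/av_stream latency=10 "
    "! queue ! rtph264depay ! vpudec ! imxg2dvideosink " + window_args(0, 480),
    # top-right: OV5640 MIPI CSI-2 camera
    "gst-launch-1.0 imxv4l2src device=/dev/video2 "
    "! video/x-raw,width=1920,height=1080,framerate=30/1 "
    "! imxipuvideotransform ! imxg2dvideosink sync=false " + window_args(960, 0),
    # bottom-right: analog camera via the ACA module
    "gst-launch-1.0 imxv4l2videosrc ! imxipuvideotransform "
    "! imxg2dvideosink " + window_args(960, 480),
]

def launch_all(pipelines):
    """Start every pipeline as a child process and wait for all of them."""
    procs = [subprocess.Popen(shlex.split(p)) for p in pipelines]
    for p in procs:
        p.wait()
```

On the target, calling `launch_all(PIPELINES)` brings up all four windows; interrupting the script stops them.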
  • 2016-5-26 15:15
By Qin Hai, Toradex

As industrial products become smarter, cameras are increasingly needed for image capture in embedded devices. Common camera interfaces today are the dedicated parallel camera and MIPI CSI interfaces, plus USB and Ethernet cameras; this article focuses on using the dedicated MIPI CSI camera interface under embedded Linux.

The MIPI (Mobile Industry Processor Interface) camera interface standard is a serial high-speed camera interface defined by the MIPI Alliance Camera Working Group to address the problems of traditional parallel cameras. There are currently two protocol definitions, MIPI CSI-2 and MIPI CSI-3; this article uses MIPI CSI-2 throughout. The protocol has evolved from the V1.0 release to V1.3: D-PHY transfer bandwidth has grown from 1 Gbps/lane to 2.5 Gbps/lane, and the C-PHY newly introduced in V1.3 is higher still. For a detailed description of MIPI CSI see here.

1). Preparation
a). Hardware: Toradex industrial-grade Apalis i.MX6D and Apalis T30 ARM computer-on-modules, with the Apalis Evaluation Board and the Apalis T30 Mezzanine extension card
b). MIPI CSI-2 camera: an OV5640-based camera module
c). Parallel camera: a Toradex ACM camera module (based on the ADV7180) with a surveillance camera
d). Software: the official Toradex Embedded Linux V2.5 Beta3 release

2). Physical connection diagram

3). Apalis i.MX6D
a). The module supports two parallel camera interfaces and one MIPI CSI-2 V1.0 interface (1-4 lanes).
b). The V2.5 Beta3 Linux image already contains the OV5640 and ADV7180 drivers and the corresponding device tree patch, so after boot the system directly exposes the camera devices /dev/video0 (ADV7180) and /dev/video2 (OV5640):
-----------------
root@apalis-imx6:~# dmesg |grep adv
adv7180 3-0021: no sensor pwdn pin available
mxc_v4l2_master_attach: ipu0:/csi0 parallel attached adv7180:mxc_v4l2_cap0
root@apalis-imx6:~# dmesg |grep ov5640
ov5640_mipi 3-003c: request of ov5640_mipi_reset failed
mxc_v4l2_master_attach: ipu0:/csi1 parallel attached ov5640_mipi:mxc_v4l2_cap2
camera ov5640_mipi is found
ov5640_set_virtual_channel: virtual channel=1
ov5640_set_virtual_channel: virtual channel=1
ov5640_set_virtual_channel: virtual channel=1
-----------------
c). Run the following gstreamer pipelines to test the MIPI CSI-2 camera:
-----------------
/* capture and display at 640x480, framerate=90 */
gst-launch -vvv imxv4l2src device=/dev/video2 capture-mode=0 fps-n=90 ! imxv4l2sink disp-width=640 disp-height=480
/* capture and display at 720p, framerate=60 */
gst-launch -vvv imxv4l2src device=/dev/video2 capture-mode=4 fps-n=60 ! imxv4l2sink disp-width=1280 disp-height=720
/* capture and display at 1080p, framerate=15 */
gst-launch -vvv imxv4l2src device=/dev/video2 capture-mode=5 fps-n=15 ! imxv4l2sink
-----------------
d). The corresponding CPU loads:
-----------------
//640x480  CPU1 - 48%   CPU2 - 1%
//720P     CPU1 - 100%  CPU2 - 1%
//1080P    CPU1 - 100%  CPU2 - 1%
-----------------
e). Simultaneous display with the parallel camera:
./ The stock IPU driver cannot display two cameras at the same time; the image source code needs a patch, after which the kernel and modules are rebuilt as described here and deployed to the Apalis i.MX6 module.
./ Then run the following gstreamer pipelines for simultaneous display:
-----------------
gst-launch -vvv imxv4l2src device=/dev/video2 capture-mode=0 fps-n=30 ! imxv4l2sink device=/dev/video17 disp-width=640 disp-height=480
gst-launch -vvv tvsrc device=/dev/video0 ! imxv4l2sink device=/dev/video16 disp-width=640 disp-height=480 axis-top=480
-----------------
./ The display result is shown below.
./ CPU load in this case:
-----------------
CPU1 60%  CPU2 40%
-----------------

4). Apalis T30
a). The module supports two parallel camera interfaces and one MIPI CSI-2 V1.0 interface (1-4 lanes).
b). Load the driver modules by hand; the devices /dev/video0 (ADV7180) and /dev/video1 (OV5640) are then recognized:
-----------------
root@apalis-t30:~# modprobe videobuf2-memops
root@apalis-t30:~# modprobe videobuf2-dma-nvmap
root@apalis-t30:~# modprobe adv7180
root@apalis-t30:~# modprobe ov5640
root@apalis-t30:~# modprobe tegra_v4l2_camera
-----------------
c). Run the following gstreamer pipelines to test the MIPI CSI-2 camera:
-----------------
/* capture and display at 640x480, framerate=90 */
gst-launch -vvv v4l2src device=/dev/video1 ! 'video/x-raw-yuv, width=640, height=480, framerate=90/1' ! deinterlace tff=1 method=4 ! nv_omx_videomixer ! nv_gl_eglimagesink
/* capture and display at 1296x972, framerate=60 */
gst-launch -vvv v4l2src device=/dev/video1 ! 'video/x-raw-yuv, width=1296, height=972, framerate=60/1' ! deinterlace tff=1 method=4 ! nv_omx_videomixer ! nv_gl_eglimagesink
/* capture and display at 1920x1088, framerate=15 */
gst-launch -vvv v4l2src device=/dev/video1 ! 'video/x-raw-yuv, width=1920, height=1088, framerate=15/1' ! deinterlace tff=1 method=4 ! nv_omx_videomixer ! nv_gl_eglimagesink
-----------------
d). The corresponding CPU loads:
-----------------
//640x480  CPU1 - 85%   CPU2 - 0%   CPU3 - 0%  CPU4 - 0%
//720P     CPU1 - 90%   CPU2 - 70%  CPU3 - 0%  CPU4 - 0%
//1080P    CPU1 - 100%  CPU2 - 60%  CPU3 - 0%  CPU4 - 0%
-----------------

5). Summary
As ARM processors grow ever more capable, demand for high-quality camera solutions in the embedded industrial space will keep broadening. MIPI CSI is without doubt one of the best options today and should have a bright future.
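The frame rates tested above can be sanity-checked against the D-PHY link rates mentioned in the introduction (1 Gbps/lane through V1.2, 2.5 Gbps/lane in V1.3). A rough raw-bandwidth estimate, assuming 16 bits per pixel (YUV 4:2:2) and ignoring CSI-2 protocol overhead and blanking:

```python
def csi_bandwidth_bps(width, height, fps, bits_per_pixel=16):
    """Raw pixel bandwidth of a video mode in bits/s, excluding CSI-2
    protocol overhead and horizontal/vertical blanking."""
    return width * height * bits_per_pixel * fps

# The three OV5640 modes tested on the Apalis i.MX6D above
for w, h, fps in [(640, 480, 90), (1280, 720, 60), (1920, 1080, 15)]:
    gbps = csi_bandwidth_bps(w, h, fps) / 1e9
    print(f"{w}x{h}@{fps}: {gbps:.2f} Gbps")
```

By this estimate every mode tested stays under 1 Gbps of pixel data, i.e. within reach of even a single V1.0 D-PHY lane before overhead is counted.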
  • 2016-2-20 00:13
Introduction
This article demonstrates using cameras under embedded Linux on an ARM computer-on-module, the Toradex VF61: a very cost-effective module without hardware video encode/decode acceleration, built around the NXP/Freescale Vybrid, a heterogeneous dual-core Cortex-A5 plus Cortex-M4 processor.

1). More and more embedded systems use cameras, mainly in the following ways:
Remote monitoring: e.g. CCTV systems, where an operator watches a specific area remotely, from a housing estate up to municipal public spaces.
Surveillance recording: other monitoring systems do not always have an operator watching; they record the video so it can be retrieved and reviewed when needed.
Embedded vision systems: these process the video frames and extract more complex information, as in radar and urban intelligent-traffic applications.
Video sensors: e.g. clinical diagnostic equipment that analyzes captured video images for diagnosis, or smart retail devices that analyze shoppers' characteristics from video for targeted promotion.

2). Environment
./ ARM module: Toradex VF61 with the Colibri Evaluation Board; the detailed configuration guide is here
./ Cameras:
Logitech HD 720p USB camera
D-Link DCS-930L IP camera
./ Software:
The standard Toradex Embedded Linux release V2.4 (pre-installed); a detailed introduction is here
The GStreamer framework, widely used in multimedia application development (video editing, media streaming, playback, and so on). Together with its plugins (sources and sinks, filters, codecs, etc.), GStreamer supports many media libraries such as MP3 and FFmpeg. Install the required packages:
$ opkg update
$ opkg install gst-plugins-base-meta gst-plugins-good-meta gst-ffmpeg
List the plugins and elements currently installed:
$ gst-inspect

GStreamer elements and pipelines
As described in chapter 3 of the GStreamer Application Development Manual, the element is the most important class of objects in GStreamer; elements read, decode and display data. A pipeline is a chain of interconnected elements performing a particular task, such as video playback or capture. GStreamer ships with a large set of elements for building all kinds of multimedia applications; this article uses a number of pipelines to show some of them in use.

The figure below is a basic pipeline for Ogg playback, with one demuxer and two branches, one handling audio and the other video. Note that some elements have only a src pad, some only a sink pad, and some have both.

Before linking a pipeline, check with "gst-inspect" whether the required plugins are compatible; the example below inspects the ffmpegcolorspace plugin.
$ gst-inspect ffmpegcolorspace
Basic description:
-----------------------------------------------------------
Factory Details:
  Long name: FFMPEG Colorspace converter
  Class: Filter/Converter/Video
  Description: Converts video from one colorspace to another
  Author(s): GStreamer maintainers gstreamer-devel@lists.sourceforge.net
-----------------------------------------------------------
Src and sink capabilities:
-----------------------------------------------------------
SRC template: 'src'
    Availability: Always
    Capabilities:
      video/x-raw-yuv
      video/x-raw-rgb
      video/x-raw-gray
SINK template: 'sink'
    Availability: Always
    Capabilities:
      video/x-raw-yuv
      video/x-raw-rgb
      video/x-raw-gray
-----------------------------------------------------------
The v4l2src element, for example, has only a src pad, so it can source a video stream into another element; the ximagesink element has a sink pad accepting RGB. For more detail on this topic see here.

Displaying a video test pattern
Use the following pipeline to display a video test pattern:
$ gst-launch videotestsrc ! autovideosink
The autovideosink element auto-detects the video output, and videotestsrc can generate various test patterns through its "pattern" property, e.g. snow:
$ gst-launch videotestsrc pattern=snow ! autovideosink

USB cameras
1). Displaying video from a USB camera
When a camera is plugged in, a corresponding device videox appears under /dev, where x is 0, 1, 2 and so on, depending on how many cameras are attached.
Use the following pipeline to display the camera video full-screen:
$ gst-launch v4l2src device=/dev/videox ! ffmpegcolorspace ! ximagesink
// The Video4Linux2 plugin is an API and driver framework for video capture and playback that supports many USB cameras and other devices. The v4l2src element belongs to it and reads video frames from a Video4Linux2 device, here the USB camera. The ffmpegcolorspace element is a filter converting between color formats: cameras usually deliver YUV while displays usually use RGB. The ximagesink element is a standard video sink for the X window system.
With "top" we can see the CPU load is 77.9% in this case.
Parameters such as size and frame rate can also be set to shape the output; limiting the size to 320x240 as below drops the CPU load to 28.2%:
$ gst-launch v4l2src device=/dev/videox ! 'video/x-raw-yuv,width=320,height=240,framerate=30/1' ! ffmpegcolorspace ! ximagesink
2). Displaying two USB cameras at once
Use the following pipeline to display two cameras simultaneously, here the Logitech HD 720p and an ordinary MJPEG camera; the CPU load in this case is 64.8%:
$ gst-launch v4l2src device=/dev/videox ! 'video/x-raw-yuv,width=320,height=240,framerate=30/1' ! ffmpegcolorspace ! ximagesink v4l2src device=/dev/video1 ! 'video/x-raw-yuv,width=320,height=240,framerate=30/1' ! ffmpegcolorspace ! ximagesink
3). Recording USB camera video
Use the following pipeline to record the camera to an MP4 file:
$ gst-launch --eos-on-shutdown v4l2src device=/dev/videox ! ffenc_mjpeg ! ffmux_mp4 ! filesink location=video.mp4
// The --eos-on-shutdown option closes the file properly. ffenc_mjpeg is the MJPEG encoder and ffmux_mp4 the MP4 muxer. filesink stores the v4l2 source data in a file instead of displaying it through ximagesink; any storage location can be given.
Recording the camera this way costs about 8% CPU.
4). Playback
Use the following pipeline to play back the recording made above:
$ gst-launch filesrc location=video.mp4 ! qtdemux name=demux demux.video_00 ! queue ! ffdec_mjpeg ! ffmpegcolorspace ! ximagesink
// filesrc takes the video data from a file rather than from a video device such as a camera; ffdec_mjpeg is the MJPEG decoder.
Because the recording was made at the camera's maximum resolution, the CPU load here is about 95%.
5). Playing video over HTTP
Use the following pipeline to play a video from a URL:
$ gst-launch souphttpsrc location=http://upload.wikimedia.org/wikipedia/commons/4/4b/MS_Diana_genom_Bergs_slussar_16_maj_2014.webm ! matroskademux name=demux demux.video_00 ! queue ! ffdec_vp8 ! ffmpegcolorspace ! ximagesink
// The souphttpsrc element receives network data over HTTP. Unlike local playback, the location parameter is given the network address of a video file. ffdec_vp8 is the WebM (VP8) decoder.
The CPU load in this case is about 40%.
6). Streaming camera video over TCP
Here the VF61 camera is streamed to another machine running Ubuntu Linux:
VF61 IP = 192.168.0.8
Ubuntu IP = 192.168.0.7
On the VF61, run:
$ gst-launch v4l2src device=/dev/video1 ! video/x-raw-yuv,width=320,height=240 ! ffmpegcolorspace ! ffenc_mjpeg ! tcpserversink host=192.168.0.8 port=5000
Then on Ubuntu, run the following pipeline to view the stream:
$ gst-launch tcpclientsrc host=192.168.0.8 port=5000 ! jpegdec ! autovideosink
With the Logitech HD 720p camera this costs about 65% CPU.
(Note: tcpserversink's host property is the address the server binds to, i.e. the VF61's own IP, which the client then connects to.)

Using the D-Link IP camera on the VF61
1). Displaying the camera video
Using the D-Link DCS-930L camera, with the stream set to average-quality JPEG, 320x240 resolution, 15 fps, IP = 192.168.0.200, display the camera video with:
$ gst-launch -v souphttpsrc location='http://192.168.0.200/video.cgi' is-live=true ! multipartdemux ! decodebin2 ! ffmpegcolorspace ! ximagesink
2). Recording
Use the following pipeline to record the stream:
$ gst-launch --eos-on-shutdown -v souphttpsrc location='http://192.168.0.200/video.cgi' is-live=true ! multipartdemux ! decodebin2 ! ffmpegcolorspace ! ffenc_mjpeg ! ffmux_mp4 ! filesink location=stream.mp4
The CPU load in this case is about 40%.
3). Relaying the stream over TCP to another IP address
Here the IP camera's video is streamed through the VF61 and on to another machine running Ubuntu Linux:
Ubuntu IP = 192.168.0.12
On the VF61, run:
$ gst-launch --eos-on-shutdown -v souphttpsrc location='http://192.168.0.200/video.cgi' is-live=true ! multipartdemux ! decodebin2 ! ffmpegcolorspace ! ffenc_mjpeg ! tcpserversink host=192.168.0.8 port=5000
Then on Ubuntu, run the following pipeline to view the stream:
$ gst-launch tcpclientsrc host=192.168.0.8 port=5000 ! jpegdec ! autovideosink
The VF61 CPU load in this case is about 95%.

Summary
This article has focused on using USB and IP cameras on embedded Linux devices through GStreamer elements and pipelines; many more pipeline examples can be found online and studied alongside this article.
The chart below compares VF61 CPU and memory load for the examples above. Even without a dedicated hardware video unit, the NXP/Freescale Vybrid VF61 processor handles basic camera video applications, and combined with its very competitive cost it offers excellent value. For more demanding video processing such as embedded vision, consider products based on the more powerful NXP/Freescale i.MX6 processor with its dedicated GPU, such as the Toradex Colibri/Apalis i.MX6 modules.
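A recurring detail in the TCP examples above is that the sender's tcpserversink and the receiver's tcpclientsrc must agree on address and port. Generating both command lines from one helper (our own sketch, not a GStreamer API) rules out a mismatch:

```python
def tcp_stream_pair(server_ip, port=5000, device="/dev/video1",
                    width=320, height=240):
    """Return (sender, receiver) gst-launch 0.10 command lines for
    streaming an MJPEG-encoded camera feed over TCP, as in the text.
    server_ip is the streaming machine's own address: tcpserversink
    binds to it, and tcpclientsrc connects to it."""
    sender = (
        f"gst-launch v4l2src device={device} "
        f"! video/x-raw-yuv,width={width},height={height} "
        f"! ffmpegcolorspace ! ffenc_mjpeg "
        f"! tcpserversink host={server_ip} port={port}"
    )
    receiver = (
        f"gst-launch tcpclientsrc host={server_ip} port={port} "
        f"! jpegdec ! autovideosink"
    )
    return sender, receiver

# VF61 at 192.168.0.8 streaming to any client on the LAN
send_cmd, recv_cmd = tcp_stream_pair("192.168.0.8")
print(send_cmd)
print(recv_cmd)
```

Run the printed sender line on the VF61 and the receiver line on the viewing machine.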
  • 2013-11-18 21:57
    In the late 1970s when I was still in college, I worked for a while as a broadcast engineer at a local TV station. Working a weekend sign-off shift allowed me to still go to classes during the day. Election night at a TV station is a true mad-house, with every scrap of equipment in use (and many personal bits and pieces in use, too), as they want to have as many remotes as possible. I'd drawn the assignment of "studio" (being physically handicapped sometimes has its own rewards). Of our four big studio cameras, two were out on location, leaving two at the studio. A little after 7:00 p.m., I noticed a problem with Camera 1, and after several attempts to adjust it failed, I told the directors that "Camera 1 just died"—not a welcome message, but they had to live with it. The chief engineer was "floating," and at the time he was at the studio, so he went out to the sound stage to see if it might be a quick fix, then shoved the camera over into the corner and forgot about it. A few days later, the engineer who normally worked on cameras started in on it, and after about two weeks, he gave up on it. The guy who normally worked on transmitters then spent about a week on it, with no more results. The chief engineer then spent about four days on it, with the same results. Then the guy who had the weeknight sign-off shift spent more than a week on it and still couldn't find the problem. Just before Christmas, I was getting a bit bored, since the only thing to do on a Saturday night shift was to take the hourly readings on the transmitter and record them. I figured since I (a) had nothing better to do and (b) had nothing to lose as I could do no worse than everybody else, I'd go ahead and take a shot at it. 
I did have two clues: First, by swapping heads between cameras, it had been proven that the problem was in the camera head (these cameras had a card case with a number of cards in the head, and about 30 inches or so of a 19-inch rack space of electronics back in the control room). Second, if the camera was left turned off for several hours, it would work for a couple of minutes when first turned on. So, I reckoned that it was some sort of thermal problem in the camera head. I dragged the camera out onto the newsroom floor, punched up so I could see the resulting picture from the camera on one of the newsroom monitors, went into the lab, and gathered up the board extenders for all of the boards in the camera head. I also armed myself with several cans of freeze mist and a heat gun. I started from one end of the card cage, putting one board at a time on the extender, then turning on the camera (fortunately, there was a circuit breaker on the camera head that could be used as a power switch), then waited for the camera to quit working. I then used the freeze mist to cool the board down to the point of it starting to get frosty, and observed the picture (or lack thereof). When there was no change, I powered down, returned the board to the card cage without the extender, and went to the next card. Eventually, I found one where cooling it did cause the picture to come back! AHA! The first real progress in many weeks! So I used the heat gun to get the board warm again, and the picture went away. Repeating the cycle, I was sure that I knew which board actually had the problem. I mentally divided the board in half, and by freezing only part of it, determined which half had the problem. I repeated the "divide and conquer" technique until I'd isolated it to about one square inch, whereupon I changed tactics. I brought out my trusty magnifying glass and carefully inspected that square inch. I discovered a cracked solder joint on a 5 watt resistor. 
Taking the board back into the lab, I broke out the soldering iron, and repaired the solder joint. Back in the newsroom, with the card still on its extender and power applied, the camera made a (sort of) nice picture even when the board was heated to "can't touch it" (maybe around 150 F). So, the basic problem was solved. I then spent the rest of that evening, and the next, having to readjust almost every setting in the camera, and finally had a really nice picture out of it. Clark Jones earned a BS in computer science in 1980 and worked in the semiconductor industry for 23 years. He worked as a principal design engineer for a start-up doing both hardware and software. He submitted this article as part of Frankenstein's Fix, a design contest hosted by EE Times (US).  
  • 2012-9-14 15:33
Resolution series:

Name          Resolution     Pixels       Megapixel class
QSIF/QQVGA    160 x 120      19200
QCIF          176 x 144      25344
SIF/QVGA      320 x 240      76800
CIF           352 x 288      101376      0.1 MP
VGA           640 x 480      307200      0.3 MP ("0.35 MP" refers to 648 x 488)
SVGA          800 x 600      480000      0.5 MP
XGA           1024 x 768     786432      0.8 MP
SXGA          1280 x 1024    1310720     1.3 MP
UXGA          1600 x 1200    1920000     2 MP
QXGA          2048 x 1536    3145728     3 MP (also marketed as 3.2 MP)
QSXGA         2592 x 1944    5038848     5 MP
              2816 x 2112    5947392     6 MP
              3072 x 2304    7077888     7 MP
              3200 x 2400    7680000     7.7 MP
              3264 x 2448    7990272     8 MP
              3876 x 2584    10015584    10 MP

1080p: 1920 x 1080, progressive scan (1080i is interlaced: odd field first, then even field)
720p: 1280 x 720, progressive scan (720i is interlaced: odd field first, then even field)

The OV5642 can output full-resolution video at 15 frames per second (fps). It also supports 720p HD video at 60 fps and 1080p HD video at 30 fps. OV5642 features:
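The pixel counts in the table are simply width x height, and the megapixel classes are those products rounded up for marketing. A quick check of a few rows:

```python
def pixels(width, height):
    """Total pixel count of a frame: width x height."""
    return width * height

# Rows from the table above
assert pixels(1600, 1200) == 1920000   # UXGA, marketed as 2 MP
assert pixels(2816, 2112) == 5947392   # the 6 MP class
assert pixels(1024, 768) == 786432     # XGA, the 0.8 MP class
print(pixels(1920, 1080))  # 1080p frame: 2073600 pixels, about 2 MP
```

The same arithmetic explains the OV5642 figures: its full 5 MP resolution (2592 x 1944) carries far more pixels per frame than 1080p, which is why the full-resolution rate drops to 15 fps while 1080p runs at 30 fps.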