
[Image Algorithms] Color Image Segmentation, Part 5: Extracting Specific Colors from a Color Image

SkySeraph    Jun 8th 2011    HQU

Email: zgzhaobo@gmail.com    QQ: 452728574

Latest Modified Date: Jun 8th 2011    HQU

I. Principles and Notes

1. RGB (red, green, blue) is a device-dependent color space whose most common use is display systems. In RGB the three components are strongly correlated: every channel carries luminance information, so pixel values are easily affected by the environment (lighting and so on), and the model does not match the way the human eye perceives color. It is therefore not well suited to analyzing and segmenting color images. By contrast, the HSV space is modeled on the human visual system and is better suited to image analysis. For more on the various color-space models, see http://www.cnblogs.com/skyseraph/archive/2011/05/03/2035643.html

2. In many domestic papers on license plate recognition, when color information is used, plate segmentation is generally performed in HSV/YIQ/Lab space based on the known plate color schemes (common ones include black-on-white, white-on-black, white-on-blue, and black-on-yellow). The color extraction method they rely on is the one described in this article. This approach only works for extracting specific, known colors; in pattern recognition terms it resembles "supervised learning". The unsupervised counterpart, segmenting an arbitrary image by color, belongs to the broader field of color segmentation.

3. On how to partition the HSV ranges:

<1> Paper: Car color recognition from CCTV camera image: http://www.docin.com/p-211572110.html

The authors adopt the partition given in their paper (figure not reproduced in this repost).

<2> Paper: 利用支持向量機識別汽車顏色 (Recognizing Vehicle Color Using Support Vector Machines): http://www.cnki.com.cn/Article/CJFDTotal-JSJF200405018.htm

The authors first divide colors into 16 classes in the Lab space, then split the sample space in HSV using a scheme given in the paper (not reproduced in this repost).

<3> Based on experiments, this article adopts the partition given in the source code below; among the schemes tried, it produced the best test results.
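
Concretely, the partition used in the code is: V < 0.35 → black; S < 0.15 and V > 0.75 → white; S < 0.15 and 0.35 < V < 0.75 → gray. For the remaining chromatic pixels (S ≥ 0.15 and V ≥ 0.35) the hue, in degrees, decides the class: red for H in [0, 15) or [340, 360), yellow for [15, 75), green for [75, 150), cyan for [150, 185), blue for [185, 270), magenta for [270, 340); anything that falls through is marked purple.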

II. Source Code

/*
 * Note:    Color segmentation - extract specific colors
 * Version: 5/11/2011  skyseraph  zgzhaobo@gmail.com
 */
void CColorSegDlg::ColorSegByHSV(IplImage* img)   // extract specific colors
{
    //====================== Variables ======================//
    int x, y;                                       // loop indices

    //====================== Input color image ======================//
    IplImage* pSrc = NULL;
    pSrc = cvCreateImage(cvGetSize(img), img->depth, img->nChannels);
    cvCopyImage(img, pSrc);
    int width    = pSrc->width;                     // image width
    int height   = pSrc->height;                    // image height
    int depth    = pSrc->depth;                     // bit depth (IPL_DEPTH_8U, ...)
    int channels = pSrc->nChannels;                 // number of channels (1, 2, 3, 4)
    int imgSize  = pSrc->imageSize;                 // image size: imageSize = height*widthStep
    int step     = pSrc->widthStep/sizeof(uchar);   // bytes between the same column of adjacent rows;
                                                    // note widthStep != width*nChannels (rows are zero-padded)
    uchar* data  = (uchar*)pSrc->imageData;
    int imageLen = width*height;

    double B = 0.0, G = 0.0, R = 0.0, H = 0.0, S = 0.0, V = 0.0;
    IplImage* dstColorSegByColor     = cvCreateImage(cvGetSize(pSrc), IPL_DEPTH_8U, 3);
    IplImage* dstColorSegByColorGray = cvCreateImage(cvGetSize(pSrc), IPL_DEPTH_8U, 1);
    //CvFont font = cvFont(1, 1);

    for (y = 0; y < height; y++)
    {
        uchar* srcRow  = (uchar*)(pSrc->imageData + y*pSrc->widthStep);
        uchar* dstRow  = (uchar*)(dstColorSegByColor->imageData + y*dstColorSegByColor->widthStep);
        uchar* grayRow = (uchar*)(dstColorSegByColorGray->imageData + y*dstColorSegByColorGray->widthStep);

        for (x = 0; x < width; x++)
        {
            uchar* dstPix = dstRow + x*dstColorSegByColor->nChannels;

            // read BGR values
            B = srcRow[x*pSrc->nChannels];
            G = srcRow[x*pSrc->nChannels + 1];
            R = srcRow[x*pSrc->nChannels + 2];

            // RGB -> HSV (H comes back in radians and is converted to degrees here)
            pMyColorSpace.RGB2HSV(R, G, B, H, S, V);
            H = (360*H)/(2*PI);

            // achromatic: black                      gray level / B / G / R
            if (V < 0.35)
            { grayRow[x] = 0;   dstPix[0] = 0;   dstPix[1] = 0;   dstPix[2] = 0;   }
            // achromatic: white
            if (S < 0.15 && V > 0.75)
            { grayRow[x] = 255; dstPix[0] = 255; dstPix[1] = 255; dstPix[2] = 255; }
            // achromatic: gray
            if (S < 0.15 && 0.35 < V && V < 0.75)
            { grayRow[x] = 128; dstPix[0] = 128; dstPix[1] = 128; dstPix[2] = 128; }

            // chromatic
            if (V >= 0.35 && S >= 0.15)
            {
                // near red
                if ((H >= 0 && H < 15) || (H >= 340 && H < 360))
                { grayRow[x] = 40;  dstPix[0] = 0;   dstPix[1] = 0;   dstPix[2] = 255; }
                // near yellow
                else if (H >= 15 && H < 75)
                { grayRow[x] = 80;  dstPix[0] = 0;   dstPix[1] = 255; dstPix[2] = 255; }
                // near green
                else if (H >= 75 && H < 150)
                { grayRow[x] = 120; dstPix[0] = 0;   dstPix[1] = 255; dstPix[2] = 0;   }
                // near cyan
                else if (H >= 150 && H < 185)
                { grayRow[x] = 160; dstPix[0] = 255; dstPix[1] = 255; dstPix[2] = 0;   }
                // near blue
                else if (H >= 185 && H < 270)
                { grayRow[x] = 200; dstPix[0] = 255; dstPix[1] = 0;   dstPix[2] = 0;   }
                // magenta: 270-340
                else if (H >= 270 && H < 340)
                { grayRow[x] = 220; dstPix[0] = 255; dstPix[1] = 0;   dstPix[2] = 255; }
                // anything else: purple
                else
                { grayRow[x] = 180; dstPix[0] = 128; dstPix[1] = 0;   dstPix[2] = 128; }
            }
        }
    }

    //cvNamedWindow("src", 1);
    //cvShowImage("src", pSrc);
    cvNamedWindow("dstColorSegByColor", 1);
    cvShowImage("dstColorSegByColor", dstColorSegByColor);
    cvNamedWindow("dstColorSegByColorGray", 1);
    cvShowImage("dstColorSegByColorGray", dstColorSegByColorGray);
    cvSaveImage(".\\dstColorSegByColor.jpg", dstColorSegByColor);
    cvSaveImage(".\\dstColorSegByColorGray.jpg", dstColorSegByColorGray);
    cvWaitKey(0);
    cvDestroyAllWindows();
    cvReleaseImage(&pSrc);
    cvReleaseImage(&dstColorSegByColor);
    cvReleaseImage(&dstColorSegByColorGray);
}
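
The conversion helper pMyColorSpace.RGB2HSV used above is not shown in this post. Below is a minimal sketch of a compatible conversion, assuming (as the calling code implies) that R, G, B are in [0, 255], H is returned in radians in [0, 2π) (the caller converts it to degrees), and S, V are in [0, 1]. The standalone function below is illustrative only, not the author's actual class member.

    // Sketch: RGB -> HSV compatible with how the code above calls RGB2HSV.
    // Inputs R, G, B in [0, 255]; outputs H in radians [0, 2*pi), S and V in [0, 1].
    #include <algorithm>
    #include <cmath>

    void RGB2HSV(double R, double G, double B, double& H, double& S, double& V)
    {
        static const double kPi = 3.14159265358979323846;
        double r = R / 255.0, g = G / 255.0, b = B / 255.0;
        double maxc  = std::max(r, std::max(g, b));
        double minc  = std::min(r, std::min(g, b));
        double delta = maxc - minc;

        V = maxc;                                   // value = brightest channel
        S = (maxc > 0.0) ? delta / maxc : 0.0;      // saturation

        double hDeg = 0.0;                          // hue in degrees
        if (delta > 0.0)
        {
            if (maxc == r)      hDeg = 60.0 * std::fmod((g - b) / delta, 6.0);
            else if (maxc == g) hDeg = 60.0 * ((b - r) / delta + 2.0);
            else                hDeg = 60.0 * ((r - g) / delta + 4.0);
            if (hDeg < 0.0) hDeg += 360.0;
        }
        H = hDeg * kPi / 180.0;                     // radians, as the caller expects
    }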
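
For comparison, the same partition can be expressed far more compactly with the OpenCV C++ API by converting the image once with cv::cvtColor and building one mask per class with cv::inRange. The sketch below is not part of the original post; it extracts only the "blue-like" class. Note that 8-bit OpenCV HSV stores H in [0, 179] and S, V in [0, 255], so the thresholds above are rescaled: 185°-270° becomes roughly 93-135, S ≥ 0.15 about 38, V ≥ 0.35 about 89.

    // Sketch (not the author's code): extract the "blue-like" class with the C++ API.
    #include <opencv2/opencv.hpp>

    cv::Mat ExtractBlueLike(const cv::Mat& bgr)
    {
        cv::Mat hsv;
        cv::cvtColor(bgr, hsv, cv::COLOR_BGR2HSV);

        // 185..270 degrees -> ~93..135; S >= 0.15 -> ~38; V >= 0.35 -> ~89
        cv::Mat mask;
        cv::inRange(hsv, cv::Scalar(93, 38, 89), cv::Scalar(135, 255, 255), mask);

        cv::Mat result(bgr.size(), CV_8UC3, cv::Scalar(0, 0, 0));
        result.setTo(cv::Scalar(255, 0, 0), mask);  // paint blue-class pixels blue (BGR)
        return result;
    }

A call such as cv::imshow("blue", ExtractBlueLike(frame)) displays one class; repeating cv::inRange with the other hue bands listed above reproduces the full segmentation.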

III. Results

(1) Original image

(2) Color output after color segmentation

(3) Grayscale output after color segmentation (each color class rendered as a distinct gray level)

IV. Supplement (RGB mode, code from the web)

1. Source code

void CFindRGBDlg::OnFind()
{
    int color = m_colorList.GetCurSel();    // selected entry in the color list (branches below assume 0 = red, 1 = green, 2 = blue)
    pic = cvCreateImage(cvSize(image->width, image->height), 8, 1);
    cvZero(pic);

    for (int x = 0; x < image->height; x++)
    {
        for (int y = 0; y < image->width; y++)
        {
            // pixel at row x, column y; IplImage stores channels in BGR order
            uchar* ptrImg = &CV_IMAGE_ELEM(image, uchar, x, y*3);
            // uchar* ptrPic = &((uchar*)(pic->imageData + pic->widthStep*y))[x];

            // red
            if (color == 0)
            {
                if ((ptrImg[0] - ptrImg[1]) > 200 && (ptrImg[0] - ptrImg[2]) > 200)
                    CV_IMAGE_ELEM(pic, uchar, x, y) = 255;
            }
            // green
            else if (color == 1)
            {
                if ((ptrImg[1] - ptrImg[0]) > 200 && (ptrImg[1] - ptrImg[2]) > 200)
                    CV_IMAGE_ELEM(pic, uchar, x, y) = 255;
            }
            // blue
            else if (color == 2)
            {
                if ((ptrImg[2] - ptrImg[0]) > 200 && (ptrImg[2] - ptrImg[1]) > 200)
                    CV_IMAGE_ELEM(pic, uchar, x, y) = 255;
            }
        }
    }

    cvNamedWindow("temp", -1);
    cvShowImage("temp", pic);
    cvWaitKey();

    storage = cvCreateMemStorage(0);
    contour = 0;
    mode = CV_RETR_EXTERNAL;
    cvFindContours(pic, storage, &contour, sizeof(CvContour), mode, CV_CHAIN_APPROX_SIMPLE);
    cvDrawContours(image, contour, CV_RGB(0,0,0), CV_RGB(0,0,0), 2, 2, 8);

    CRect rect;
    GetDlgItem(IDC_PICTURE)->GetClientRect(&rect);
    InvalidateRect(rect, true);
}
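
This RGB test only marks a pixel when the selected channel exceeds both other channels by more than 200, so it responds only to strongly saturated, nearly pure colors; the resulting mask is then outlined with cvFindContours/cvDrawContours and the dialog is repainted. Also note that IplImage data is stored in BGR order, so ptrImg[0] is the blue channel and ptrImg[2] is the red channel; if the combo box really lists red/green/blue in that order, the channel indices in the red and blue branches would need to be swapped.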

2. Results


More at http://skyseraph.com/2011/08/27/CV/圖像算法專題/


Author: SkySeraph

Email/GTalk: zgzhaobo@gmail.com    QQ: 452728574

From: http://www.cnblogs.com/skyseraph/

Copyright of this article is shared by the author and cnblogs (博客园). You are welcome to repost it, but unless the author agrees otherwise you must keep this notice and place a clearly visible link to the original article on the page. Please respect the author's work.

Reposted from: https://www.cnblogs.com/skyseraph/archive/2011/06/08/2075599.html
