This repository hosts a driver for image sensors that is compatible with the ESP32 series of SoCs. Additionally, it provides a few tools that allow converting the captured frame data to the more common BMP and JPEG formats.

Supported SoCs:
- ESP32
- ESP32-S2
- ESP32-S3

Supported sensors:

model | max resolution | color type | output format | lens size |
---|---|---|---|---|
OV2640 | 1600 x 1200 | color | YUV(422/420)/YCbCr422, RGB565/555, 8-bit compressed data, 8/10-bit raw RGB data | 1/4" |
OV3660 | 2048 x 1536 | color | raw RGB data, RGB565/555/444, CCIR656, YCbCr422, compression | 1/5" |
OV5640 | 2592 x 1944 | color | raw RGB, RGB565/555/444, CCIR656, YUV422/420, YCbCr422, compression | 1/4" |
OV7670 | 640 x 480 | color | raw Bayer RGB, processed Bayer RGB, YUV/YCbCr422, GRB422, RGB565/555 | 1/6" |
OV7725 | 640 x 480 | color | raw RGB, GRB422, RGB565/555/444, YCbCr422 | 1/4" |
NT99141 | 1280 x 720 | color | YCbCr422, RGB565/555/444, raw, CCIR656, JPEG compression | 1/4" |
GC032A | 640 x 480 | color | YUV/YCbCr422, raw Bayer, RGB565 | 1/10" |
GC0308 | 640 x 480 | color | YUV/YCbCr422, raw Bayer, RGB565 | 1/6.5" |
GC2145 | 1600 x 1200 | color | YUV/YCbCr422, raw Bayer, RGB565 | 1/5" |
BF3005 | 640 x 480 | color | YUV/YCbCr422, raw Bayer, RGB565 | 1/4" |
BF20A6 | 640 x 480 | color | YUV/YCbCr422, raw Bayer | 1/10" |
SC101IOT | 1280 x 720 | color | YUV/YCbCr422, raw RGB | 1/4.2" |
SC030IOT | 640 x 480 | color | YUV/YCbCr422, raw Bayer | 1/6.5" |
SC031GS | 640 x 480 | monochrome | RAW MONO, grayscale | 1/6" |

Important to remember:
- Except when using CIF or lower resolution with JPEG, the driver requires PSRAM to be installed and activated.
- Using YUV or RGB puts a lot of strain on the chip because writing to PSRAM is not particularly fast. The result is that image data might be missing. This is particularly true if WiFi is enabled. If you need RGB data, it is recommended that JPEG is captured and then turned into RGB using `fmt2rgb888` or `fmt2bmp`/`frame2bmp` (see the sketch after this list).
- When 1 frame buffer is used, the driver will wait for the current frame to finish (VSYNC) and start I2S DMA. After the frame is acquired, I2S will be stopped and the frame buffer returned to the application. This approach gives more control over the system, but results in a longer time to get the frame.
- When 2 or more frame buffers are used, I2S runs in continuous mode and each frame is pushed to a queue that the application can access. This approach puts more strain on the CPU/memory, but allows for double the frame rate. Please use it only with JPEG.
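
As an illustration of the JPEG-then-convert approach, here is a minimal sketch. It assumes the camera has already been initialized for `PIXFORMAT_JPEG` (as in the initialization example further down); `capture_as_rgb888` and `process_rgb` are hypothetical names used only for this sketch, and keeping the RGB buffer in PSRAM is an assumption about your board.

```c
#include "esp_camera.h"
#include "img_converters.h"   // fmt2rgb888()
#include "esp_heap_caps.h"    // heap_caps_malloc()

// Example only: capture a JPEG frame and convert it to RGB888.
esp_err_t capture_as_rgb888(void)
{
    camera_fb_t *fb = esp_camera_fb_get();
    if (!fb) {
        return ESP_FAIL;
    }

    size_t width  = fb->width;
    size_t height = fb->height;

    // RGB888 needs 3 bytes per pixel; keep such a large buffer in PSRAM.
    uint8_t *rgb = heap_caps_malloc(width * height * 3, MALLOC_CAP_SPIRAM);
    if (!rgb) {
        esp_camera_fb_return(fb);
        return ESP_ERR_NO_MEM;
    }

    bool ok = fmt2rgb888(fb->buf, fb->len, fb->format, rgb);
    esp_camera_fb_return(fb);   // hand the frame buffer back to the driver early

    if (ok) {
        // process_rgb(rgb, width, height);   // replace with your own handling
    }
    free(rgb);
    return ok ? ESP_OK : ESP_FAIL;
}
```

Capturing JPEG keeps the amount of data coming off the sensor small, which is why this route is recommended when RGB data is needed.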
To use the driver in an ESP-IDF project:

- Clone or download and extract the repository to the `components` folder of your ESP-IDF project.
- Enable PSRAM in `menuconfig` (also set the Flash and PSRAM frequencies to 80 MHz).
- Include `esp_camera.h` in your code.
With PlatformIO, the easy way is to add the following to the `[env]` section of `platformio.ini`:
```ini
[env]
lib_deps =
  esp32-camera
```
Now `esp_camera.h` is available to be included:
```c
#include "esp_camera.h"
```
Enable PSRAM in `menuconfig`, or set it directly in `sdkconfig` (check the official documentation for more info):
```
CONFIG_ESP32_SPIRAM_SUPPORT=y
```
Arduino: the easy way above only seems to work if you're using `framework=arduino`, which takes a bunch of the guesswork out (thanks, Arduino!) but also sucks up a lot more memory and flash, almost crippling performance. If you plan to use `framework=espidf`, then read the sections below carefully!
It's probably easier to just skip the PlatformIO library registry version and link the git repo as a submodule (i.e. using code outside the PlatformIO library management). In this example we will install it as a submodule inside the PlatformIO `$project/lib` folder:
```sh
cd $project/lib
git submodule add -b master https://github.com/espressif/esp32-camera.git
```
Then, in the `platformio.ini` file, add:
```ini
build_flags =
  -I../lib/esp32-camera
```
After that, the `#include "esp_camera.h"` statement will be available. Now the module is included, and you're hopefully back at the same place as with the easy Arduino way.
Warning about PlatformIO/ESP-IDF and fresh (uncommitted) git repos: there is a sharp edge you'll discover in the PlatformIO build process (in ESP-IDF v3.3 and v4.0.1) where a project that has only had `git init` run, with nothing committed, will crash the PlatformIO build with highly non-useful output. The cause is the lack of a version (making you think you did something wrong, when you didn't at all), and the output is horribly non-descript. Solution: create a file called `version.txt` with a number in it, or simply commit any file to the project's git repo. This happens because the PlatformIO build process tries to be too clever and determine the build version number from the git repo; it's a sharp edge you'll only encounter if you're experimenting on a new project with no commits... like when your camera isn't working and you decide to try a "clean project".
Kconfig is used by the PlatformIO menuconfig (accessed by running `pio run -t menuconfig`) to interactively manage the various `#ifdef` statements throughout ESP-IDF and its supporting libraries (e.g. this repo, esp32-camera, and arduino-esp32). The menuconfig process generates the `sdkconfig` file, which is ultimately used behind the scenes by the ESP-IDF compile and build process.
Make sure to append or symlink this repository's `Kconfig` content into the `Kconfig` of your project.
Symlink (or copy) the included `Kconfig` into your PlatformIO project's `src` directory; the file should be named `Kconfig.projbuild` there. You could also add the library path to a `CMakeLists.txt` and hope the `Kconfig` (or `Kconfig.projbuild`) gets discovered by the menuconfig process, though this was unpredictable for me.
The unpredictable, wonky behavior of the PlatformIO build process around Kconfig naming (`Kconfig` vs. `Kconfig.projbuild`) occurs between ESP-IDF versions 3.3 and 4.0. Either way, if you don't see "Camera configuration" in your `pio run -t menuconfig`, there is no point trying to test camera code (it may compile, but it probably won't work!); it seems the PlatformIO devs didn't implement this properly when they built their wrapper around the ESP-IDF menuconfig. You've probably already figured out that you can't use the ESP-IDF build tools directly, since the files are in totally different locations and different versions, sometimes with different syntax. This is one of those times you might consider changing `platform=espressif32` to `platform=https://github.com/platformio/platform-espressif32.git#develop` in `platformio.ini` to get a more recent version of the ESP-IDF 4.0 tools.

However, with a bit of patience and experimenting, you'll figure the Kconfig out. Once the `Kconfig` (or `Kconfig.projbuild`) is working, you will be able to choose the configuration according to your setup, and the camera libraries will be compiled. You might also need to delete your `.pio/build` directory before the options appear; again, `pio run -t menuconfig` doesn't always notice new Kconfig files!

If you miss, skip, or ignore this critical step, the camera module will compile, but the camera logic inside the library will be 'empty', because the Kconfig sets the proper `#ifdef` statements during the build process to initialize the selected cameras. It is very much not optional!
Initialization and capture example:

```c
#include "esp_camera.h"
#include "esp_log.h"

static const char *TAG = "camera";   // log tag for the examples below
//WROVER-KIT PIN Map
#define CAM_PIN_PWDN -1 //power down is not used
#define CAM_PIN_RESET -1 //software reset will be performed
#define CAM_PIN_XCLK 21
#define CAM_PIN_SIOD 26
#define CAM_PIN_SIOC 27
#define CAM_PIN_D7 35
#define CAM_PIN_D6 34
#define CAM_PIN_D5 39
#define CAM_PIN_D4 36
#define CAM_PIN_D3 19
#define CAM_PIN_D2 18
#define CAM_PIN_D1 5
#define CAM_PIN_D0 4
#define CAM_PIN_VSYNC 25
#define CAM_PIN_HREF 23
#define CAM_PIN_PCLK 22
static camera_config_t camera_config = {
.pin_pwdn = CAM_PIN_PWDN,
.pin_reset = CAM_PIN_RESET,
.pin_xclk = CAM_PIN_XCLK,
.pin_sccb_sda = CAM_PIN_SIOD,
.pin_sccb_scl = CAM_PIN_SIOC,
.pin_d7 = CAM_PIN_D7,
.pin_d6 = CAM_PIN_D6,
.pin_d5 = CAM_PIN_D5,
.pin_d4 = CAM_PIN_D4,
.pin_d3 = CAM_PIN_D3,
.pin_d2 = CAM_PIN_D2,
.pin_d1 = CAM_PIN_D1,
.pin_d0 = CAM_PIN_D0,
.pin_vsync = CAM_PIN_VSYNC,
.pin_href = CAM_PIN_HREF,
.pin_pclk = CAM_PIN_PCLK,
.xclk_freq_hz = 20000000,//EXPERIMENTAL: Set to 16MHz on ESP32-S2 or ESP32-S3 to enable EDMA mode
.ledc_timer = LEDC_TIMER_0,
.ledc_channel = LEDC_CHANNEL_0,
.pixel_format = PIXFORMAT_JPEG,//YUV422,GRAYSCALE,RGB565,JPEG
.frame_size = FRAMESIZE_UXGA,//QQVGA-UXGA, For ESP32, do not use sizes above QVGA when not JPEG. The performance of the ESP32-S series has improved a lot, but JPEG mode always gives better frame rates.
.jpeg_quality = 12, //0-63, for OV series camera sensors, lower number means higher quality
.fb_count = 1, //When jpeg mode is used, if fb_count is more than one, the driver will work in continuous mode.
.grab_mode = CAMERA_GRAB_WHEN_EMPTY//CAMERA_GRAB_LATEST. Sets when buffers should be filled
};
esp_err_t camera_init(){
//power up the camera if PWDN pin is defined
if(CAM_PIN_PWDN != -1){
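// note: pinMode()/digitalWrite() are Arduino-core APIs; under plain ESP-IDF use gpio_config()/gpio_set_level() from driver/gpio.h instead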
pinMode(CAM_PIN_PWDN, OUTPUT);
digitalWrite(CAM_PIN_PWDN, LOW);
}
//initialize the camera
esp_err_t err = esp_camera_init(&camera_config);
if (err != ESP_OK) {
ESP_LOGE(TAG, "Camera Init Failed");
return err;
}
return ESP_OK;
}
esp_err_t camera_capture(){
//acquire a frame
camera_fb_t * fb = esp_camera_fb_get();
if (!fb) {
ESP_LOGE(TAG, "Camera Capture Failed");
return ESP_FAIL;
}
//replace this with your own function
process_image(fb->width, fb->height, fb->format, fb->buf, fb->len);
//return the frame buffer back to the driver for reuse
esp_camera_fb_return(fb);
return ESP_OK;
}
```
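
A minimal sketch of how the two helpers above might be called from an application. It assumes they live in the same file as an ESP-IDF `app_main`; the one-second pacing is arbitrary and only for illustration.

```c
#include "freertos/FreeRTOS.h"
#include "freertos/task.h"

// Example wiring of camera_init() and camera_capture() from above.
void app_main(void)
{
    if (camera_init() != ESP_OK) {
        return;                              // nothing to do if the camera failed to start
    }
    while (1) {
        camera_capture();                    // grab and process one frame
        vTaskDelay(pdMS_TO_TICKS(1000));     // arbitrary pacing: about one frame per second
    }
}
```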
JPEG HTTP capture example:

```c
#include "esp_camera.h"
#include "esp_http_server.h"
#include "esp_timer.h"
#include "esp_log.h"
#include "img_converters.h"   // frame2jpg_cb()

static const char *TAG = "camera_httpd";   // log tag
typedef struct {
httpd_req_t *req;
size_t len;
} jpg_chunking_t;
static size_t jpg_encode_stream(void * arg, size_t index, const void* data, size_t len){
jpg_chunking_t *j = (jpg_chunking_t *)arg;
if(!index){
j->len = 0;
}
if(httpd_resp_send_chunk(j->req, (const char *)data, len) != ESP_OK){
return 0;
}
j->len += len;
return len;
}
esp_err_t jpg_httpd_handler(httpd_req_t *req){
camera_fb_t * fb = NULL;
esp_err_t res = ESP_OK;
size_t fb_len = 0;
int64_t fr_start = esp_timer_get_time();
fb = esp_camera_fb_get();
if (!fb) {
ESP_LOGE(TAG, "Camera capture failed");
httpd_resp_send_500(req);
return ESP_FAIL;
}
res = httpd_resp_set_type(req, "image/jpeg");
if(res == ESP_OK){
res = httpd_resp_set_hdr(req, "Content-Disposition", "inline; filename=capture.jpg");
}
if(res == ESP_OK){
if(fb->format == PIXFORMAT_JPEG){
fb_len = fb->len;
res = httpd_resp_send(req, (const char *)fb->buf, fb->len);
} else {
jpg_chunking_t jchunk = {req, 0};
res = frame2jpg_cb(fb, 80, jpg_encode_stream, &jchunk)?ESP_OK:ESP_FAIL;
httpd_resp_send_chunk(req, NULL, 0);
fb_len = jchunk.len;
}
}
esp_camera_fb_return(fb);
int64_t fr_end = esp_timer_get_time();
ESP_LOGI(TAG, "JPG: %uKB %ums", (uint32_t)(fb_len/1024), (uint32_t)((fr_end - fr_start)/1000));
return res;
}
```
JPEG HTTP streaming example (sends an MJPEG multipart stream):

```c
#include <string.h>
#include "esp_camera.h"
#include "esp_http_server.h"
#include "esp_timer.h"
#include "esp_log.h"
#include "img_converters.h"   // frame2jpg()

static const char *TAG = "camera_httpd";   // log tag
#define PART_BOUNDARY "123456789000000000000987654321"
static const char* _STREAM_CONTENT_TYPE = "multipart/x-mixed-replace;boundary=" PART_BOUNDARY;
static const char* _STREAM_BOUNDARY = "\r\n--" PART_BOUNDARY "\r\n";
static const char* _STREAM_PART = "Content-Type: image/jpeg\r\nContent-Length: %u\r\n\r\n";
esp_err_t jpg_stream_httpd_handler(httpd_req_t *req){
camera_fb_t * fb = NULL;
esp_err_t res = ESP_OK;
size_t _jpg_buf_len;
uint8_t * _jpg_buf;
char part_buf[64];
static int64_t last_frame = 0;
if(!last_frame) {
last_frame = esp_timer_get_time();
}
res = httpd_resp_set_type(req, _STREAM_CONTENT_TYPE);
if(res != ESP_OK){
return res;
}
while(true){
fb = esp_camera_fb_get();
if (!fb) {
ESP_LOGE(TAG, "Camera capture failed");
res = ESP_FAIL;
break;
}
if(fb->format != PIXFORMAT_JPEG){
bool jpeg_converted = frame2jpg(fb, 80, &_jpg_buf, &_jpg_buf_len);
if(!jpeg_converted){
ESP_LOGE(TAG, "JPEG compression failed");
esp_camera_fb_return(fb);
res = ESP_FAIL;
break; // don't fall through: fb was already returned and _jpg_buf was never set
}
} else {
_jpg_buf_len = fb->len;
_jpg_buf = fb->buf;
}
if(res == ESP_OK){
res = httpd_resp_send_chunk(req, _STREAM_BOUNDARY, strlen(_STREAM_BOUNDARY));
}
if(res == ESP_OK){
size_t hlen = snprintf((char *)part_buf, 64, _STREAM_PART, _jpg_buf_len);
res = httpd_resp_send_chunk(req, (const char *)part_buf, hlen);
}
if(res == ESP_OK){
res = httpd_resp_send_chunk(req, (const char *)_jpg_buf, _jpg_buf_len);
}
if(fb->format != PIXFORMAT_JPEG){
free(_jpg_buf);
}
esp_camera_fb_return(fb);
if(res != ESP_OK){
break;
}
int64_t fr_end = esp_timer_get_time();
int64_t frame_time = fr_end - last_frame;
last_frame = fr_end;
frame_time /= 1000;
ESP_LOGI(TAG, "MJPG: %uKB %ums (%.1ffps)",
(uint32_t)(_jpg_buf_len/1024),
(uint32_t)frame_time, 1000.0 / (uint32_t)frame_time);
}
last_frame = 0;
return res;
}
```
BMP HTTP capture example:

```c
#include "esp_camera.h"
#include "esp_http_server.h"
#include "esp_timer.h"
#include "esp_log.h"
#include "img_converters.h"   // frame2bmp()

static const char *TAG = "camera_httpd";   // log tag
esp_err_t bmp_httpd_handler(httpd_req_t *req){
camera_fb_t * fb = NULL;
esp_err_t res = ESP_OK;
int64_t fr_start = esp_timer_get_time();
fb = esp_camera_fb_get();
if (!fb) {
ESP_LOGE(TAG, "Camera capture failed");
httpd_resp_send_500(req);
return ESP_FAIL;
}
uint8_t * buf = NULL;
size_t buf_len = 0;
bool converted = frame2bmp(fb, &buf, &buf_len);
esp_camera_fb_return(fb);
if(!converted){
ESP_LOGE(TAG, "BMP conversion failed");
httpd_resp_send_500(req);
return ESP_FAIL;
}
res = httpd_resp_set_type(req, "image/x-windows-bmp")
|| httpd_resp_set_hdr(req, "Content-Disposition", "inline; filename=capture.bmp")
|| httpd_resp_send(req, (const char *)buf, buf_len);
free(buf);
int64_t fr_end = esp_timer_get_time();
ESP_LOGI(TAG, "BMP: %uKB %ums", (uint32_t)(buf_len/1024), (uint32_t)((fr_end - fr_start)/1000));
return res;
}
```