Fork of https://gitlab.com/megapixels-org/Megapixels
A GTK4 camera application that knows how to deal with the media request API. It uses OpenGL to debayer the raw sensor data for the preview.
Chat: #megapixels:brixit.nl on Matrix
Before building this, build and install libmegapixels, libdng and postprocessd.
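As a sketch of that step, the loop below assumes each of those projects builds with meson and ninja like Megapixels itself, and that you have cloned them under ~/src; both are assumptions, so adjust them to your setup.

```sh
# Sketch only: the ~/src paths and the meson/ninja build of each
# dependency are assumptions, adjust as needed for your environment.
for dep in libmegapixels libdng postprocessd; do
    ( cd ~/src/"$dep" &&
      meson setup build &&
      ninja -C build &&
      sudo ninja -C build install )
done
```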
$ meson build
$ cd build
$ ninja
$ sudo ninja install
$ sudo glib-compile-schemas /usr/local/share/glib-2.0/schemas
Megapixels only captures raw frames and stores them as .dng files. It captures a 5-frame burst and saves it to a temporary location, then runs the post-processing script, which generates the final .jpg file and writes it to the pictures directory. Megapixels looks for the post-processing script in the following locations:
./postprocess.sh
$XDG_CONFIG_DIR/megapixels/postprocess.sh
~/.config/megapixels/postprocess.sh
/etc/megapixels/postprocess.sh
/usr/share/megapixels/postprocess.sh
The bundled postprocess.sh script copies the first frame of the burst into the picture directory as a DNG file. If dcraw and ImageMagick are installed it also generates a JPG and writes that to the picture directory. It supports either the full dcraw or dcraw_emu from libraw.
It is possible to write your own post-processing pipeline by providing your own postprocess.sh script at one of the above locations. The first argument to the script is the directory containing the temporary burst files and the second argument is the final path for the image, without an extension. For more details see postprocess.sh in this repository.
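As a starting point, here is a minimal sketch of a custom postprocess.sh. It is not the bundled script: it assumes the burst directory contains .dng frames (as described above) and that dcraw and ImageMagick's convert happen to be on the PATH.

```sh
#!/bin/sh
# Minimal post-processing sketch (illustrative, not the bundled script).
#   $1: directory containing the temporary burst of .dng frames
#   $2: final output path for the image, without extension
set -e

BURST_DIR="$1"
TARGET="$2"

# Keep the first raw frame of the burst as the final DNG.
FIRST=$(find "$BURST_DIR" -name '*.dng' | sort | head -n 1)
cp "$FIRST" "$TARGET.dng"

# If dcraw and ImageMagick are available, also produce a JPEG.
if command -v dcraw >/dev/null && command -v convert >/dev/null; then
    dcraw -w -T "$TARGET.dng"           # writes $TARGET.tiff next to the DNG
    convert "$TARGET.tiff" -quality 92 "$TARGET.jpg"
    rm -f "$TARGET.tiff"
fi
```

Here dcraw -w uses the camera white balance and -T writes a TIFF, which convert then turns into the final JPEG.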
Video recording needs the GStreamer framework, dcraw and ImageMagick; you may need to run "sudo apt install dcraw imagemagick".
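If GStreamer is not already installed you will also need its packages; the GStreamer package names below are an assumption for Debian/Ubuntu-style systems and differ on other distributions.

```sh
# gstreamer1.0-* names are assumptions for Debian/Ubuntu;
# dcraw and imagemagick are the packages named above.
sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-good dcraw imagemagick
```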
Megapixels is developed at: https://gitlab.com/megapixels-org/megapixels
- camera_config.c: describes how cameras are configured. Contains no state.
- main.c: contains the entry point and UI portion of the application.
- quickpreview.c: implements the fast preview functionality, including debayering, color correction, rotation, etc.
- io_pipeline.c: implements all IO interaction with V4L2 devices in a separate thread to prevent blocking.
- process_pipeline.c: implements all processing done on captured images, including launching post-processing.
- pipeline.c: generic threaded message-passing implementation based on glib, used to implement the pipelines.
- camera.c: V4L2 abstraction layer to make working with cameras easier.
- device.c: V4L2 abstraction layer for devices.

The primary image pipeline consists of the main application, the IO pipeline and the process pipeline. The main application sends commands to the IO pipeline, which in turn talks to the process pipeline, which then talks to the main application. This way neither IO nor processing blocks the main application and races are generally avoided.
Tests are located in tests/.