Fork of https://gitlab.com/megapixels-org/Megapixels
A GTK4 camera application that knows how to deal with the media request API. It uses OpenGL to debayer the raw sensor data for the preview.
Chat: #megapixels:brixit.nl on Matrix
Before building this, build and install libmegapixels, libdng and postprocessd.
```
$ meson build
$ cd build
$ ninja
$ sudo ninja install
$ sudo glib-compile-schemas /usr/local/share/glib-2.0/schemas
```
Megapixels only captures raw frames and stores them as .dng files. It captures a 5-frame burst and saves it to a temporary location, then runs the post-processing script, which generates the final .jpg file and writes it into the pictures directory. Megapixels looks for the post-processing script in the following locations:

- `./postprocess.sh`
- `$XDG_CONFIG_DIR/megapixels/postprocess.sh`
- `~/.config/megapixels/postprocess.sh`
- `/etc/megapixels/postprocess.sh`
- `/usr/share/megapixels/postprocess.sh`
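To customize post-processing for a single user, one approach (a sketch of a typical setup, assuming the first script found in the list above is the one used and that the bundled script is installed to `/usr/share/megapixels/`) is to copy the bundled script into the per-user location and edit it there:

```
mkdir -p ~/.config/megapixels
cp /usr/share/megapixels/postprocess.sh ~/.config/megapixels/postprocess.sh
```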
The bundled `postprocess.sh` script will copy the first frame of the burst into the picture directory as a DNG file. If dcraw and imagemagick are installed it will also generate a JPG and write that to the picture directory. It supports either the full `dcraw` or `dcraw_emu` from libraw.

It is possible to write your own post-processing pipeline by providing your own `postprocess.sh` script at one of the above locations. The first argument to the script is the directory containing the temporary burst files and the second argument is the final path for the image without an extension. For more details see `postprocess.sh` in this repository.
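As a starting point, the sketch below shows what a minimal custom script could look like. It assumes the burst frames are the `.dng` files inside the directory passed as the first argument and that `dcraw` and ImageMagick's `convert` may or may not be installed; it is an illustration of the documented interface, not the bundled script.

```
#!/bin/sh
# $1 = directory containing the temporary burst files (assumed to hold .dng frames)
# $2 = final output path without extension
burst_dir="$1"
target_name="$2"

# Pick the first frame of the burst as the keeper.
for frame in "$burst_dir"/*.dng; do
    first="$frame"
    break
done

# Always keep the raw frame as the final DNG.
cp "$first" "$target_name.dng"

# If dcraw and ImageMagick are available, also develop a JPEG.
if command -v dcraw >/dev/null 2>&1 && command -v convert >/dev/null 2>&1; then
    # dcraw -c writes to stdout, -w uses the camera white balance
    dcraw -c -w "$first" | convert - "$target_name.jpg"
fi
```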
Video recording requires the GStreamer framework, dcraw and imagemagick; on Debian-based systems you may need to run `sudo apt install dcraw imagemagick`.
Megapixels is developed at: https://gitlab.com/megapixels-org/megapixels
- `camera_config.c` describes how cameras are configured. Contains no state.
- `main.c` contains the entry point and UI portion of the application.
- `quickpreview.c` implements fast preview functionality, including debayering, color correction, rotation, etc.
- `io_pipeline.c` implements all IO interaction with V4L2 devices in a separate thread to prevent blocking.
- `process_pipeline.c` implements all processing done on captured images, including launching post-processing.
- `pipeline.c` is a generic threaded message-passing implementation based on glib, used to implement the pipelines.
- `camera.c` is a V4L2 abstraction layer to make working with cameras easier.
- `device.c` is a V4L2 abstraction layer for devices.

The primary image pipeline consists of the main application, the IO pipeline and the process pipeline. The main application sends commands to the IO pipeline, which in turn talks to the process pipeline, which then talks to the main application. This way neither IO nor processing blocks the main application and races are generally avoided.
Tests are located in `tests/`.
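Assuming the tests are wired into the Meson build (an assumption; this README does not state how they are run), they can typically be executed from the source tree with:

```
$ meson test -C build
```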