merge master to thesis branch

This commit is contained in:
ElonOlsson
2021-08-16 15:23:49 -04:00
203 changed files with 4345 additions and 4065 deletions

.gitattributes

@@ -1,5 +1,18 @@
* text=auto
# Correct GitHub's language detection shenanigans
# Asset files are not Unity, but Lua instead
*.asset linguist-language=Lua
# We have some SPICE frame kernels that get misclassified as code
*.tf -linguist-detectable
# We don't want to index the GDAL csv and xml files
modules/globebrowsing/gdal_data/* linguist-vendored
# No need to index any external files
*/ext/* linguist-vendored
# No C allowed
*.h linguist-language=C++
# GitHub files
ATTRIBUTION text
AUTHORS text


@@ -28,8 +28,8 @@ project(OpenSpace)
set(OPENSPACE_VERSION_MAJOR 0)
set(OPENSPACE_VERSION_MINOR 17)
set(OPENSPACE_VERSION_PATCH -1)
set(OPENSPACE_VERSION_STRING "Beta-10 [RC1]")
set(OPENSPACE_VERSION_PATCH 1)
set(OPENSPACE_VERSION_STRING "Beta-10")
set(OPENSPACE_BASE_DIR "${PROJECT_SOURCE_DIR}")
set(OPENSPACE_CMAKE_EXT_DIR "${OPENSPACE_BASE_DIR}/support/cmake")
@@ -132,8 +132,6 @@ if (MSVC)
set(GHOUL_OPTIMIZATION_ENABLE_OTHER_OPTIMIZATIONS ${OPENSPACE_OPTIMIZATION_ENABLE_OTHER_OPTIMIZATIONS} CACHE BOOL "" FORCE)
endif ()
option(OPENSPACE_WITH_ABUFFER_RENDERER "Compile ABuffer Renderer" OFF)
if (UNIX)
if (CMAKE_CXX_COMPILER_ID MATCHES "Clang")
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++17 -stdlib=libc++")


@@ -4,17 +4,17 @@ Emil Axelsson
Kalle Bladin
Jonathas Costa
Gene Payne
Jonas Strandstedt
Michal Marcinkowski
Elon Olsson
Emma Broman
Jonas Strandstedt
Micah Acinapura
Michal Marcinkowski
Malin Ejdbo
Elon Olsson
Joakim Kilby
Lovisa Hassler
Mikael Petterson
Erik Sundén
Stefan Lindblad
Malin Ejdbo
Corrie Roe
Eric Myers


@@ -1,13 +1,47 @@
[OpenSpace](http://openspaceproject.com) is an open source, non-commercial, and freely available interactive data visualization software designed to visualize the entire known universe and portray our ongoing efforts to investigate the cosmos. Bringing the latest techniques from data visualization research to the general public, OpenSpace supports interactive presentation of dynamic data from observations, simulations, and space mission planning and operations. The software works on multiple operating systems (Windows, Linux, MacOS) with an extensible architecture powering high resolution tiled displays and planetarium domes, making use of the latest graphic card technologies for rapid data throughput. In addition, OpenSpace enables simultaneous connections across the globe creating opportunity for shared experiences among audiences worldwide.
![OpenSpace Logo](/data/openspace-horiz-logo-crop.png)
[OpenSpace](http://openspaceproject.com) is an open source, non-commercial, and freely available interactive data visualization software designed to visualize the entire known universe and portray our ongoing efforts to investigate the cosmos. Bringing the latest techniques from data visualization research to the general public, OpenSpace supports interactive presentation of dynamic data from observations, simulations, and space mission planning and operations. The software works on multiple operating systems (Windows, Linux, MacOS) with an extensible architecture capable of powering both personal computers and high resolution tiled displays and planetarium domes. In addition, OpenSpace enables simultaneous connections across the globe, creating opportunities for shared experiences among audiences worldwide. The target audience of the software ranges from members of the general public who wish to explore our universe, to enthusiasts interested in hacking the underlying components of OpenSpace to create unique experiences, to informal science institutions wishing to create a low-cost yet powerful exhibition piece, to scientists who want to visualize their datasets in a contextualized, powerful software.
The project stems from the same academic collaboration between Sweden's [Linköping University](https://www.liu.se) (LiU) and the [American Museum of Natural History](https://www.amnh.org) (AMNH) that led to the creation of Uniview and its parent company [SCISS](http://sciss.se). Development of the software began several years ago through a close collaboration with NASA Goddard's [Community Coordinated Modeling Center](https://ccmc.gsfc.nasa.gov) (CCMC) to model space weather forecasting and continued with visualizations of NASA's New Horizons mission to Pluto and ESA's Rosetta mission. This promising set of preliminary work provided a foundation for recent NASA funding, which has extended the collaboration to include the University of Utah's [Scientific Computing and Imaging](https://www.sci.utah.edu) (SCI) Institute, [New York University](https://www.nyu.edu)'s Tandon School of Engineering, multiple informal science institutions across the United States, and multiple international vendors. Current areas of focus within OpenSpace include:
[![License](https://img.shields.io/badge/License-MIT-purple.svg?style=flat-square)](LICENSE)
[![Download](https://img.shields.io/github/v/tag/OpenSpace/OpenSpace?label=Version&color=maroon&style=flat-square)](https://www.openspaceproject.com/installation)
![Size](https://img.shields.io/github/repo-size/OpenSpace/OpenSpace?style=flat-square&color=red)
- Visualization of dynamic simulations via interactive volumetric rendering, as a priority for communicating research in astrophysics.
- Utilization of NASAs SPICE observational geometry system with its Planetary Data Service (PDS) to enable space mission visualizations that reveal how missions are designed to gather science.
- Globe browsing techniques across spatial and temporal scales to examine scientific campaigns on multiple planets, including close up surface exploration.
[![System Paper](https://img.shields.io/badge/System%20Paper-10.1109%2FTVCG.2019.2934259-blue?style=flat-square)](https://doi.org/10.1109/TVCG.2019.2934259)
[![GlobeBrowsing Paper](https://img.shields.io/badge/GlobeBrowsing%20Paper-https%3A%2F%2Fdoi.org%2F10.1109%2FTVCG.2017.2743958-blue?style=flat-square)](https://doi.org/10.1109/TVCG.2017.2743958)
OpenSpace requires graphics support for [OpenGL](https://www.opengl.org/) version 3.3.
![Contributors](https://img.shields.io/github/contributors/OpenSpace/OpenSpace?style=flat-square)
![Commits](https://img.shields.io/github/commit-activity/m/OpenSpace/OpenSpace?color=green&style=flat-square)
This repository contains the source code and example scenes for OpenSpace, but does not contain any data. To build and install the client, we refer to the [OpenSpace Wiki](http://wiki.openspaceproject.com/), specifically [building](http://wiki.openspaceproject.com/docs/developers/compiling/general) for [Windows](http://wiki.openspaceproject.com/docs/developers/compiling/windows), [Linux (Ubuntu)](http://wiki.openspaceproject.com/docs/developers/compiling/ubuntu), and [MacOS](http://wiki.openspaceproject.com/docs/developers/compiling/macos). Required preexisting dependencies are: [Boost](http://www.boost.org/) and [Qt](http://www.qt.io/download). Feel free to create issues for missing features, bug reports, or compile problems or contact us via [email](mailto:alexander.bock@me.com?subject=OpenSpace:).
![Image](https://github.com/OpenSpace/openspace.github.io/raw/master/assets/images/collection.jpg)
For any issues, you are very welcome to join our [Slack support channel](https://openspacesupport.slack.com), to which you can freely [sign up](https://join.slack.com/t/openspacesupport/shared_invite/zt-37niq6y9-T0JaCIk4UoFLI4VF5U9Vsw).
# Background
OpenSpace started as a collaboration between Sweden's [Linköping University](https://scivis.github.io) (LiU) and the [American Museum of Natural History](https://www.amnh.org) (AMNH). Development of the software began several years ago through a close collaboration with NASA Goddard's [Community Coordinated Modeling Center](https://ccmc.gsfc.nasa.gov) (CCMC) to model space weather forecasting and continued with visualizations of NASA's New Horizons mission to Pluto and ESA's Rosetta mission to 67P/Churyumov-Gerasimenko. This promising set of preliminary work provided a foundation for continued funding from NASA, the Swedish eScience Research Centre, and the Knut and Alice Wallenberg foundation, which has extended the collaboration to include the University of Utah's [Scientific Computing and Imaging](https://www.sci.utah.edu) (SCI) Institute, [New York University](https://www.nyu.edu)'s Tandon School of Engineering, multiple informal science institutions across the world, and multiple international vendors.
![Image](https://github.com/OpenSpace/openspace.github.io/raw/master/assets/images/presentation.jpg)
# Features
Some of the high-level features supported in OpenSpace are:
- AMNH's Digital Universe catalog of extrasolar datasets (stars, galaxies, quasars, ...)
- High-resolution planetary images for major objects in the solar system (Earth, Moon, Mars, Venus, ...)
- Animated 3D models representing space missions (ISS, New Horizons, JWST, ...)
- Support for custom profiles with arbitrary user-defined content
- Ability to drive any type of display environment (flat screen, multi-projector, planetariums, ...)
- Lua and JavaScript interface into the engine allowing highly customized controls
- Native support for exporting an interactive session as individual frames for video creation
- Much, much more (see our [Changelog](http://wiki.openspaceproject.com/docs/general/releases))
OpenSpace requires support for at least [OpenGL](https://www.opengl.org/) version 3.3; some custom components require at least version 4.2.
![Image](https://github.com/OpenSpace/openspace.github.io/raw/master/assets/images/display-systems.jpg)
# Getting Started
This repository contains the source code and example profiles for OpenSpace, but does not contain any data. To build and install the application, please check out the [OpenSpace Wiki](http://wiki.openspaceproject.com/). There you will find a general [build instruction](http://wiki.openspaceproject.com/docs/developers/compiling/general) page for all operating systems, as well as additional instructions for [Windows](http://wiki.openspaceproject.com/docs/developers/compiling/windows), [Linux (Ubuntu)](http://wiki.openspaceproject.com/docs/developers/compiling/ubuntu), and [MacOS](http://wiki.openspaceproject.com/docs/developers/compiling/macos).
Requirements for compiling are:
- CMake version 3.10 or above
- C++ compiler supporting C++17 (MSVC 16.10, GCC 9, Clang 10)
- [Boost](http://www.boost.org/)
- [Qt](http://www.qt.io/download)
Feel free to create issues for missing features, bug reports, or compile problems, or contact us via [email](mailto:openspace@amnh.org?subject=OpenSpace:). For any issues, you are also very welcome to join our [Slack support channel](https://openspacesupport.slack.com), to which you can freely [sign up](https://join.slack.com/t/openspacesupport/shared_invite/zt-37niq6y9-T0JaCIk4UoFLI4VF5U9Vsw).
![Image](https://github.com/OpenSpace/openspace.github.io/raw/master/assets/images/himalaya-nkpg-dome.jpg)


@@ -28,6 +28,7 @@
#include <QDialog>
#include <openspace/scene/profile.h>
#include <openspace/util/keys.h>
#include <QWidget>
#include <QListWidgetItem>
@@ -68,6 +69,7 @@ private slots:
void parseSelections();
void chooseScripts();
void keySelected(int index);
void keyModSelected(int index);
/**
* Adds scripts to the _scriptEdit from outside dialogs
@@ -83,12 +85,16 @@ private:
int indexInKeyMapping(std::vector<int>& mapVector, int keyInt);
bool areRequiredFormsFilled();
bool isLineEmpty(int index);
void addStringToErrorDisplay(const QString& newString);
void checkForNumberKeyConflict(int key);
void checkForBindingConflict(int selectedModKey, int selectedKey);
openspace::Profile& _profile;
std::vector<openspace::Profile::Keybinding> _data;
std::vector<int> _mapModKeyComboBoxIndexToKeyValue;
std::vector<int> _mapKeyComboBoxIndexToKeyValue;
bool _editModeNewItem = false;
int _currentKeybindingSelection = 0;
QListWidget* _list = nullptr;
QLabel* _keyModLabel = nullptr;


@@ -115,6 +115,7 @@ PropertiesDialog QListWidget {
*/
AssetsDialog QTreeView {
min-width: 40em;
min-height: 40em;
}
/*


@@ -263,7 +263,7 @@ QWidget* LauncherWindow::createCentralWidget() {
[this]() {
const std::string selection = _profileBox->currentText().toStdString();
int selectedIndex = _profileBox->currentIndex();
bool isUserProfile = selectedIndex <= _userAssetCount;
bool isUserProfile = selectedIndex < _userAssetCount;
openProfileEditor(selection, isUserProfile);
}
);
@@ -312,8 +312,24 @@ void LauncherWindow::setBackgroundImage(const std::string& syncPath) {
std::mt19937 g(rd());
std::shuffle(files.begin(), files.end(), g);
// We know there has to be at least one folder, so it's fine to just pick the first
std::string image = files.front();
_backgroundImage->setPixmap(QPixmap(QString::fromStdString(image)));
while (!files.empty()) {
std::string p = files.front();
if (std::filesystem::path(p).extension() == ".png") {
// If the first path has the png extension, we have found our candidate
break;
}
else {
// There shouldn't be any non-png files in here, but you never know. So we
// just remove any non-png files here
files.erase(files.begin());
}
}
// There should be at least one file left, but check just in case
if (!files.empty()) {
std::string image = files.front();
_backgroundImage->setPixmap(QPixmap(QString::fromStdString(image)));
}
}
void LauncherWindow::populateProfilesList(std::string preset) {
@@ -462,7 +478,7 @@ std::string LauncherWindow::selectedWindowConfig() const {
int idx = _windowConfigBox->currentIndex();
if (idx == 0) {
return _sgctConfigName;
} else if (idx > _userAssetCount) {
} else if (idx > _userConfigCount) {
return "${CONFIG}/" + _windowConfigBox->currentText().toStdString();
}
else {


@@ -129,10 +129,7 @@ AssetsDialog::AssetsDialog(openspace::Profile& profile, const std::string& asset
{
setWindowTitle("Assets");
_assetTreeModel.importModelData(assetBasePath, userAssetBasePath);
createWidgets();
}
void AssetsDialog::createWidgets() {
QBoxLayout* layout = new QVBoxLayout(this);
{
QLabel* heading = new QLabel("Select assets from /data/assets");
@@ -173,18 +170,19 @@ void AssetsDialog::createWidgets() {
nRows,
_assetTreeModel.index(-1, 0)
);
layout->addWidget(_assetTree);
layout->addWidget(_assetTree, 4);
}
{
QWidget* box = new QWidget;
QBoxLayout* boxLayout = new QVBoxLayout(box);
QLabel* summaryHeading = new QLabel("Selection summary");
summaryHeading->setObjectName("heading");
layout->addWidget(summaryHeading);
}
{
boxLayout->addWidget(summaryHeading);
_summary = new QTextEdit;
_summary->setReadOnly(true);
_summary->setText(createTextSummary());
layout->addWidget(_summary);
boxLayout->addWidget(_summary);
layout->addWidget(box, 1);
}
layout->addWidget(new Line);


@@ -28,7 +28,6 @@
#include "profile/scriptlogdialog.h"
#include <openspace/scene/profile.h>
#include <openspace/util/keys.h>
#include <qevent.h>
#include <algorithm>
#include <QKeyEvent>
@@ -167,6 +166,10 @@ void KeybindingsDialog::createWidgets() {
_mapModKeyComboBoxIndexToKeyValue.push_back(modIdx++);
}
_keyModCombo->addItems(comboModKeysStringList);
connect(
_keyModCombo, QOverload<int>::of(&QComboBox::currentIndexChanged),
this, &KeybindingsDialog::keyModSelected
);
box->addWidget(_keyModCombo, 0, 1);
@@ -286,6 +289,7 @@ void KeybindingsDialog::createWidgets() {
void KeybindingsDialog::listItemSelected() {
QListWidgetItem *item = _list->currentItem();
int index = _list->row(item);
_currentKeybindingSelection = index;
if (_data.size() > 0) {
Profile::Keybinding& k = _data[index];
@@ -317,21 +321,25 @@ void KeybindingsDialog::listItemSelected() {
}
void KeybindingsDialog::keySelected(int index) {
const QString numKeyWarning = "Warning: Using a number key may conflict with the "
"keybindings for simulation time increments.";
_errorMsg->clear();
int selectedKey = _mapKeyComboBoxIndexToKeyValue[index];
checkForNumberKeyConflict(selectedKey);
checkForBindingConflict(_keyModCombo->currentIndex(), selectedKey);
}
void KeybindingsDialog::keyModSelected(int index) {
_errorMsg->clear();
int selectedKey = _mapModKeyComboBoxIndexToKeyValue[index];
checkForBindingConflict(selectedKey,
_mapKeyComboBoxIndexToKeyValue.at(_keyCombo->currentIndex()));
}
void KeybindingsDialog::addStringToErrorDisplay(const QString& newString) {
QString errorContents = _errorMsg->text();
bool alreadyContainsWarning = (errorContents.length() >= numKeyWarning.length() &&
errorContents.left(numKeyWarning.length()) == numKeyWarning);
if (_mapKeyComboBoxIndexToKeyValue[index] >= static_cast<int>(Key::Num0)
&& _mapKeyComboBoxIndexToKeyValue[index] <= static_cast<int>(Key::Num9))
{
if (!alreadyContainsWarning) {
errorContents = numKeyWarning + errorContents;
_errorMsg->setText(errorContents);
}
}
else if (alreadyContainsWarning) {
_errorMsg->setText(errorContents.mid(numKeyWarning.length()));
bool alreadyContainsString = (errorContents.indexOf(newString, 0) != -1);
if (!alreadyContainsString) {
errorContents = newString + errorContents;
_errorMsg->setText(errorContents);
}
}
@@ -371,10 +379,37 @@ void KeybindingsDialog::listItemAdded() {
_documentationEdit->setText(QString::fromStdString(_data.back().documentation));
_localCheck->setChecked(false);
_scriptEdit->setText(QString::fromStdString(_data.back().script));
_currentKeybindingSelection = static_cast<int>(_data.size() - 1);
_editModeNewItem = true;
}
void KeybindingsDialog::checkForNumberKeyConflict(int key) {
const QString numKeyWarning = "Warning: Using a number key may conflict with the "
"keybindings for simulation time increments.\n";
if (key >= static_cast<int>(Key::Num0) && key <= static_cast<int>(Key::Num9)) {
addStringToErrorDisplay(numKeyWarning);
}
}
void KeybindingsDialog::checkForBindingConflict(int selectedModKey, int selectedKey) {
const QString localWarn = "Warning: New selection conflicts with binding '";
if (_currentKeybindingSelection >= static_cast<int>(_data.size())) {
return;
}
KeyModifier newModifier = static_cast<KeyModifier>(selectedModKey);
Key newKey = static_cast<Key>(selectedKey);
for (int i = 0; i < static_cast<int>(_data.size()); ++i) {
if (i == _currentKeybindingSelection) {
continue;
}
openspace::Profile::Keybinding k = _data[i];
if ((k.key.key == newKey) && (k.key.modifier == newModifier)) {
addStringToErrorDisplay(localWarn + QString::fromStdString(k.name) + "'.\n");
break;
}
}
}
void KeybindingsDialog::listItemSave() {
if (!areRequiredFormsFilled()) {
return;


@@ -4,7 +4,7 @@
<Display swapInterval="0" />
</Settings>
<Node address="localhost" port="20401">
<Window fullScreen="true" numberOfSamples="4" name="OpenSpace">
<Window fullscreen="true" numberOfSamples="4" name="OpenSpace">
<Stereo type="none" />
<Size x="1920" y="1080" />
<Pos x="0" y="0" />


@@ -1,22 +1,16 @@
<?xml version="1.0" ?>
<Cluster masterAddress="localhost">
<Cluster masterAddress="localhost" externalControlPort="20500">
<Settings>
<Display swapInterval="0" />
</Settings>
<Node address="localhost" port="20401">
<Window fullScreen="false">
<Stereo type="none" />
<Pos x="200" y="300" />
<!-- 16:9 aspect ratio -->
<Size x="1280" y="360" />
<Viewport eye="left">
<Window fullScreen="false" name="OpenSpace">
<Stereo type="side_by_side" />
<Size x="1280" y="720" />
<Pos x="50" y="50" />
<Viewport tracked="true">
<Pos x="0.0" y="0.0" />
<Size x="0.5" y="1.0" />
<PlanarProjection>
<FOV down="25.267007923362" left="40.0" right="40.0" up="25.267007923362" />
<Orientation heading="0.0" pitch="0.0" roll="0.0" />
</PlanarProjection>
</Viewport>
<Viewport eye="right">
<Pos x="0.5" y="0.0" />
<Size x="0.5" y="1.0" />
<Size x="1.0" y="1.0" />
<PlanarProjection>
<FOV down="25.267007923362" left="40.0" right="40.0" up="25.267007923362" />
<Orientation heading="0.0" pitch="0.0" roll="0.0" />
@@ -24,7 +18,7 @@
</Viewport>
</Window>
</Node>
<User eyeSeparation="0.06">
<User eyeSeparation="0.065">
<Pos x="0.0" y="0.0" z="0.0" />
</User>
</Cluster>


@@ -7,7 +7,7 @@
<Viewport name="Spout">
<Pos x="0.0" y="0.0" />
<Size x="1.0" y="1.0" />
<SpoutOutputProjection quality="1.5k">
<SpoutOutputProjection quality="1k" mappingSpoutName="OpenSpace">
<Background r="0.1" g="0.1" b="0.1" a="1.0" />
</SpoutOutputProjection>
</Viewport>


@@ -0,0 +1,227 @@
local assetHelper = asset.require('util/asset_helper')
local sunTransforms = asset.require('scene/solarsystem/sun/transforms')
local transforms = asset.require('scene/solarsystem/planets/earth/transforms')
local model = asset.syncedResource({
Name = "Animated Box",
Type = "HttpSynchronization",
Identifier = "animated_box",
Version = 1
})
local StartTime = "2021 06 01 00:00:00"
local animationLoop = {
Identifier = "animationLoop",
Parent = transforms.EarthCenter.Identifier,
Transform = {
Translation = {
Type = "StaticTranslation",
Position = { 0.0, -11E7, 0.0 }
}
},
Renderable = {
Type = "RenderableModel",
GeometryFile = model .. "/BoxAnimated.glb",
EnableAnimation = true,
AnimationMode = "LoopFromStart",
AnimationStartTime = StartTime,
ModelScale = 3E7,
LightSources = {
{
Type = "SceneGraphLightSource",
Identifier = "Sun",
Node = sunTransforms.SolarSystemBarycenter.Identifier,
Intensity = 1.0
}
},
PerformShading = true,
DisableFaceCulling = true
},
GUI = {
Name = "Animated Model example (LoopFromStart)",
Path = "/Example",
Description = "Simple animated box model with the animation mode 'LoopFromStart'",
}
}
local animationLoopInf = {
Identifier = "animationLoopInf",
Parent = transforms.EarthCenter.Identifier,
Transform = {
Translation = {
Type = "StaticTranslation",
Position = { 0.0, 11E7, 0.0 }
}
},
Renderable = {
Type = "RenderableModel",
GeometryFile = model .. "/BoxAnimated.glb",
EnableAnimation = true,
AnimationMode = "LoopInfinitely",
AnimationStartTime = StartTime,
ModelScale = 3E7,
LightSources = {
{
Type = "SceneGraphLightSource",
Identifier = "Sun",
Node = sunTransforms.SolarSystemBarycenter.Identifier,
Intensity = 1.0
}
},
PerformShading = true,
DisableFaceCulling = true
},
GUI = {
Name = "Animated Model example (LoopInfinitely)",
Path = "/Example",
Description = "Simple animated box model with the animation mode 'LoopInfinitely'",
}
}
local animationOnce = {
Identifier = "animationOnce",
Parent = transforms.EarthCenter.Identifier,
Transform = {
Translation = {
Type = "StaticTranslation",
Position = { 11E7, 0.0, 0.0 }
}
},
Renderable = {
Type = "RenderableModel",
GeometryFile = model .. "/BoxAnimated.glb",
EnableAnimation = true,
AnimationMode = "Once",
AnimationStartTime = StartTime,
ModelScale = 3E7,
LightSources = {
{
Type = "SceneGraphLightSource",
Identifier = "Sun",
Node = sunTransforms.SolarSystemBarycenter.Identifier,
Intensity = 1.0
}
},
PerformShading = true,
DisableFaceCulling = true
},
GUI = {
Name = "Animated Model example (Once)",
Path = "/Example",
Description = "Simple animated box model with the animation mode 'Once'",
}
}
local animationBounceInf = {
Identifier = "animationBounceInf",
Parent = transforms.EarthCenter.Identifier,
Transform = {
Translation = {
Type = "StaticTranslation",
Position = { 0.0, 0.0, 11E7 }
}
},
Renderable = {
Type = "RenderableModel",
GeometryFile = model .. "/BoxAnimated.glb",
EnableAnimation = true,
AnimationMode = "BounceInfinitely",
AnimationStartTime = StartTime,
ModelScale = 3E7,
LightSources = {
{
Type = "SceneGraphLightSource",
Identifier = "Sun",
Node = sunTransforms.SolarSystemBarycenter.Identifier,
Intensity = 1.0
}
},
PerformShading = true,
DisableFaceCulling = true
},
GUI = {
Name = "Animated Model example (BounceInfinitely)",
Path = "/Example",
Description = "Simple animated box model with the animation mode 'BounceInfinitely'",
}
}
local animationBounce = {
Identifier = "animationBounce",
Parent = transforms.EarthCenter.Identifier,
Transform = {
Translation = {
Type = "StaticTranslation",
Position = { 0.0, 0.0, -11E7 }
}
},
Renderable = {
Type = "RenderableModel",
GeometryFile = model .. "/BoxAnimated.glb",
EnableAnimation = true,
AnimationMode = "BounceFromStart",
AnimationStartTime = StartTime,
ModelScale = 3E7,
LightSources = {
{
Type = "SceneGraphLightSource",
Identifier = "Sun",
Node = sunTransforms.SolarSystemBarycenter.Identifier,
Intensity = 1.0
}
},
PerformShading = true,
DisableFaceCulling = true
},
GUI = {
Name = "Animated Model example (BounceFromStart)",
Path = "/Example",
Description = "Simple animated box model with the animation mode 'BounceFromStart'",
}
}
assetHelper.registerSceneGraphNodesAndExport(asset, {
animationLoop,
animationLoopInf,
animationOnce,
animationBounceInf,
animationBounce
})
-- Combined metadata for the asset and the model it uses. (A second
-- assignment to asset.meta would overwrite the first, so both are kept in
-- a single table.)
asset.meta = {
Name = "Animation Example asset",
Version = "1.0",
Description = [[
Simple animation example asset with an animated box model. The model is the
"Animated Box" sample by Cesium, https://cesium.com/, available from
https://github.com/KhronosGroup/glTF-Sample-Models/tree/master/2.0/BoxAnimated
under the Creative Commons Attribution 4.0 International License,
https://creativecommons.org/licenses/by/4.0/
]],
Author = "OpenSpace Team",
URL = "http://openspaceproject.com",
License = "MIT license",
Identifiers = {
"animationLoop",
"animationLoopInf",
"animationOnce",
"animationBounceInf",
"animationBounce"
}
}


@@ -2,19 +2,11 @@ local assetHelper = asset.require('util/asset_helper')
local earth = asset.require('scene/solarsystem/planets/earth/earth')
local sunTransforms = asset.require('scene/solarsystem/sun/transforms')
local textures = asset.syncedResource({
Name = "New Horizons Textures",
Type = "HttpSynchronization",
Identifier = "newhorizons_textures",
Version = 3
})
local models = asset.syncedResource({
Name = "New Horizons Model",
Type = "HttpSynchronization",
Identifier = "newhorizons_model",
Version = 1
Version = 2
})
local Example_Fixed_Height = {
@@ -24,19 +16,15 @@ local Example_Fixed_Height = {
Translation = {
Type = "GlobeTranslation",
Globe = earth.Earth.Identifier,
Longitude = 0.0,
Latitude = 0.0,
FixedAltitude = 10000000.0
Longitude = -74.006,
Latitude = 40.7128,
Altitude = 100000.0
}
},
Renderable = {
Type = "RenderableModel",
Body = "NEW HORIZONS",
Geometry = {{
Type = "MultiModelGeometry",
GeometryFile = models .. "/NewHorizonsCleanModel.obj",
ColorTexture = textures .. "/NHTexture.jpg"
}}
GeometryFile = models .. "/NewHorizonsCleanModel.obj"
},
GUI = {
Path = "/Example"
@@ -51,17 +39,14 @@ local Example_Adaptive_Height = {
Type = "GlobeTranslation",
Globe = earth.Earth.Identifier,
Longitude = -74.006,
Latitude = 40.7128
Latitude = 40.7128,
UseHeightmap = true
}
},
Renderable = {
Type = "RenderableModel",
Body = "NEW HORIZONS",
Geometry = {{
Type = "MultiModelGeometry",
GeometryFile = models .. "/NewHorizonsCleanModel.obj",
ColorTexture = textures .. "/NHTexture.jpg"
}}
GeometryFile = models .. "/NewHorizonsCleanModel.obj"
},
GUI = {
Path = "/Example"


@@ -0,0 +1,47 @@
local stateMachineHelper = asset.require('util/lua_state_machine_helper')
local states = {
{
Title = "Highlight EarthTrail",
Play = function ()
openspace.setPropertyValue("Scene.EarthTrail.Renderable.Appearance.LineWidth", 10, 1)
end,
Rewind = function ()
openspace.setPropertyValue("Scene.EarthTrail.Renderable.Appearance.LineWidth", 2, 1)
end
},
{
Title = "Highlight MarsTrail",
Play = function ()
openspace.setPropertyValue("Scene.EarthTrail.Renderable.Appearance.LineWidth", 2, 1)
openspace.setPropertyValue("Scene.MarsTrail.Renderable.Appearance.LineWidth", 10, 1)
end,
Rewind = function ()
openspace.setPropertyValue("Scene.MarsTrail.Renderable.Appearance.LineWidth", 2, 1)
openspace.setPropertyValue("Scene.EarthTrail.Renderable.Appearance.LineWidth", 10, 1)
end
}
}
local stateMachine
function next()
stateMachine.goToNextState()
end
function previous()
stateMachine.goToPreviousState()
end
asset.onInitialize(function ()
stateMachine = stateMachineHelper.createStateMachine(states)
openspace.bindKey('RIGHT', 'next()')
openspace.bindKey('LEFT', 'previous()')
end)
asset.onDeinitialize(function ()
stateMachine = nil
openspace.clearKey('RIGHT')
openspace.clearKey('LEFT')
end)


@@ -15,17 +15,18 @@ asset.onInitialize(function ()
local interpolationDuration = 0.5
function nextSlide()
helper.goToNextSlide(deck, interpolationDuration)
end
-- Add global functions for controlling slide deck and bind to keys
rawset(_G, "nextSlide", function()
helper.goToNextSlide(deck, interpolationDuration)
end)
function previousSlide()
rawset(_G, "previousSlide", function()
helper.goToPreviousSlide(deck, interpolationDuration)
end
end)
function toggleSlides()
rawset(_G, "toggleSlides", function()
helper.toggleSlides(deck, interpolationDuration)
end
end)
helper.setCurrentSlide(deck, 1)
openspace.bindKey("KP_6", "nextSlide()", "Next slide", "Next slide", "/Slides")


@@ -1,47 +1,75 @@
local stateMachineHelper = asset.require('util/state_machine_helper')
-- Create a state machine with a few different states. The state machine can be controlled through
-- the scripting commands from the state machine module.
states = {
local targetNode = function(nodeIdentifier)
return [[
openspace.setPropertyValueSingle("NavigationHandler.OrbitalNavigator.RetargetAnchor", nil)
openspace.setPropertyValueSingle(
"NavigationHandler.OrbitalNavigator.Anchor",
']] .. nodeIdentifier .. [['
)
openspace.setPropertyValueSingle("NavigationHandler.OrbitalNavigator.Aim", '')
]]
end
local states = {
{
Title = "Highlight EarthTrail",
Play = function ()
openspace.setPropertyValue("Scene.EarthTrail.Renderable.LineWidth", 10, 1)
end,
Rewind = function ()
openspace.setPropertyValue("Scene.EarthTrail.Renderable.LineWidth", 2, 1)
end
},
Identifier = "Constellations",
Enter = [[
openspace.setPropertyValueSingle('Scene.Constellations.Renderable.Opacity', 1.0, 1.0)
]],
Exit = [[
openspace.setPropertyValueSingle('Scene.Constellations.Renderable.Opacity', 0.0, 1.0)
]]
},
{
Title = "Highlight MarsTrail",
Play = function ()
openspace.setPropertyValue("Scene.EarthTrail.Renderable.LineWidth", 2, 1)
openspace.setPropertyValue("Scene.MarsTrail.Renderable.LineWidth", 10, 1)
end,
Rewind = function ()
openspace.setPropertyValue("Scene.MarsTrail.Renderable.LineWidth", 2, 1)
openspace.setPropertyValue("Scene.EarthTrail.Renderable.LineWidth", 10, 1)
end
Identifier = "Earth",
Enter = "openspace.setPropertyValueSingle('Scene.EarthLabel.Renderable.Enabled', true)",
Exit = "openspace.setPropertyValueSingle('Scene.EarthLabel.Renderable.Enabled', false)"
},
{
Identifier = "Moon",
Enter = "",
Exit = ""
}
}
local stateMachine
local transitions = {
{
From = "Earth",
To = "Moon",
Action = targetNode("Moon")
},
{
From = "Moon",
To = "Earth",
Action = targetNode("Earth")
},
{
From = "Earth",
To = "Constellations",
-- The Action field is optional
},
{
From = "Constellations",
To = "Earth"
},
{
From = "Moon",
To = "Constellations",
Action = targetNode("Earth")
},
{
From = "Constellations",
To = "Moon",
Action = targetNode("Moon")
}
}
function next()
stateMachine.goToNextState()
end
asset.onInitialize(function()
-- Setup
openspace.setPropertyValueSingle('Scene.Constellations.Renderable.Enabled', true)
openspace.setPropertyValueSingle('Scene.Constellations.Renderable.Opacity', 0.0)
function previous()
stateMachine.goToPreviousState()
end
asset.onInitialize(function ()
stateMachine = stateMachineHelper.createStateMachine(states)
openspace.bindKey('RIGHT', 'next()')
openspace.bindKey('LEFT', 'previous()')
end)
asset.onDeinitialize(function ()
stateMachine = nil
openspace.clearKey('RIGHT')
openspace.clearKey('LEFT')
openspace.statemachine.createStateMachine(states, transitions, "Earth")
end)


@@ -2,7 +2,7 @@ local models = asset.syncedResource({
Name = "Apollo Boulders Models",
Type = "HttpSynchronization",
Identifier = "apollo_boulders",
Version = 1
Version = 2
})
asset.export('models', models)


@@ -47,11 +47,7 @@ local Station2Boulder1Model = {
},
Renderable = {
Type = "RenderableModel",
Geometry = {{
Type = "MultiModelGeometry",
GeometryFile = models .. "/b1-v2.obj",
ColorTexture = models .. "/b1-v2_u1_v1.jpeg"
}},
GeometryFile = models .. "/b1-v2.obj",
RotationVector = { 243.243256 ,206.270264, 309.677429 },
LightSources = LightSources,
PerformShading = false,
@@ -93,11 +89,7 @@ local Station2Boulder2Model = {
},
Renderable = {
Type = "RenderableModel",
Geometry = {{
Type = "MultiModelGeometry",
GeometryFile = models .. "/b2model.obj",
ColorTexture = models .. "/b2model_u1_v1.jpeg"
}},
GeometryFile = models .. "/b2model.obj",
RotationVector = { 66.162155, 7.783780, 114.193550 },
LightSources = LightSources,
PerformShading = false,
@@ -139,11 +131,7 @@ local Station2Boulder3Model = {
},
Renderable = {
Type = "RenderableModel",
Geometry = {{
Type = "MultiModelGeometry",
GeometryFile = models .. "/b3model.obj",
ColorTexture = models .. "/b3model_u1_v1.jpeg"
}},
GeometryFile = models .. "/b3model.obj",
RotationVector = { 161.513519 ,243.243256, 65.806450 },
LightSources = LightSources,
PerformShading = false,

View File

@@ -58,11 +58,7 @@ local Station6Frag1Model = {
},
Renderable = {
Type = "RenderableModel",
Geometry = {{
Type = "MultiModelGeometry",
GeometryFile = models .. "/A17-S6-frag1.obj",
ColorTexture = models .. "/A17-S6-frag1.png"
}},
GeometryFile = models .. "/A17-S6-frag1.obj",
RotationVector = { 235.909088,165.000000,286.299194 },
LightSources = LightSources,
PerformShading = false,
@@ -105,11 +101,7 @@ local Station6Frag2Model = {
},
Renderable = {
Type = "RenderableModel",
Geometry = {{
Type = "MultiModelGeometry",
GeometryFile = models .. "/station6_boulder_frag2.obj",
ColorTexture = models .. "/frag2crop_u1_v1.jpeg"
}},
GeometryFile = models .. "/station6_boulder_frag2.obj",
RotationVector = { 336.959991,210.239990,325.984253 },
LightSources = LightSources,
PerformShading = false,
@@ -140,11 +132,7 @@ local Station6Frag3Model = {
},
Renderable = {
Type = "RenderableModel",
Geometry = {{
Type = "MultiModelGeometry",
GeometryFile = models .. "/station6_boulder_frag3.obj",
ColorTexture = models .. "/frag3crop_u1_v1.jpeg"
}},
GeometryFile = models .. "/station6_boulder_frag3.obj",
RotationVector = { 293.181824,255.000000,4.090910 },
LightSources = LightSources,
PerformShading = false,

View File

@@ -47,11 +47,7 @@ local Station7BoulderModel = {
},
Renderable = {
Type = "RenderableModel",
Geometry = {{
Type = "MultiModelGeometry",
GeometryFile = models .. "/b7model.obj",
ColorTexture = models .. "/b7model_u1_v1.jpeg"
}},
GeometryFile = models .. "/b7model.obj",
RotationVector = { 1.945950,274.378387,212.903214 },
LightSources = LightSources,
PerformShading = false,

View File

@@ -2,7 +2,7 @@ local models = asset.syncedResource({
Name = "Apollo Models",
Type = "HttpSynchronization",
Identifier = "apollo_models",
Version = 3
Version = 4
})
asset.export('models', models)

View File

@@ -20,7 +20,7 @@ local Gaia = {
XAxis = { 1.0, 0.0, 0.0 },
XAxisOrthogonal = true,
YAxis = "Sun",
YAxisInverted = true
YAxisInvert = true
},
Scale = {
Type = "StaticScale",

View File

@@ -11,7 +11,7 @@ local trail = asset.syncedResource({
local GaiaTrail = {
Identifier = "GaiaTrail",
Parent = earthTransforms.EarthBarycenter.Identifier,
Parent = earthTransforms.EarthCenter.Identifier,
Renderable = {
Type = "RenderableTrailTrajectory",
Translation = {
@@ -36,7 +36,7 @@ local GaiaTrail = {
local GaiaTrailEclip = {
Identifier = "GaiaTrail_Eclip",
Parent = sunTransforms.SolarSystemBarycenter.Identifier,
Parent = sunTransforms.SunCenter.Identifier,
Renderable = {
Type = "RenderableTrailTrajectory",
Enabled = false,

View File

@@ -11,7 +11,7 @@ local trail = asset.syncedResource({
local GaiaPosition = {
Identifier = "GaiaPosition",
Parent = earthTransforms.EarthBarycenter.Identifier,
Parent = earthTransforms.EarthCenter.Identifier,
Transform = {
Translation = {
Type = "HorizonsTranslation",

View File

@@ -23,7 +23,7 @@ local model = asset.syncedResource({
Name = "JWST Model",
Type = "HttpSynchronization",
Identifier = "jwst_model",
Version = 1
Version = 2
})
local band = asset.syncedResource({
@@ -111,12 +111,11 @@ local JWSTModel = {
},
Renderable = {
Type = "RenderableModel",
GeometryFile = model .. "/JWSTFBX.osmodel",
GeometryFile = model .. "/JWST.osmodel",
ModelScale = "Foot",
InvertModelScale = true,
EnableAnimation = true,
--TODO: Update theese when the new animation is finished
AnimationStartTime = "2018 10 01 15:00:00",
AnimationStartTime = "2018 10 01 14:05:52",
AnimationMode = "Once",
LightSources = {
{
@@ -130,7 +129,7 @@ local JWSTModel = {
DisableFaceCulling = true
},
GUI = {
Name = "James Webb Space Telescope",
Name = "James Webb Space Telescope Model",
Path = "/Solar System/Missions/JWST",
}
}
@@ -226,7 +225,7 @@ local JWSTLaunchModel = {
Parent = JWSTLaunchPosition.Identifier,
TimeFrame = {
Type = "TimeFrameInterval",
Start = "2018 OCT 01 13:18:00",
Start = "2018 OCT 01 14:05:52",
End = "2019 OCT 01"
},
Transform = {
@@ -241,12 +240,11 @@ local JWSTLaunchModel = {
},
Renderable = {
Type = "RenderableModel",
GeometryFile = model .. "/JWSTFBX.osmodel",
GeometryFile = model .. "/JWST.osmodel",
ModelScale = "Foot",
InvertModelScale = true,
EnableAnimation = true,
--TODO: Update theese when the new animation is finished
AnimationStartTime = "2018 10 01 15:00:00",
AnimationStartTime = "2018 10 01 14:05:52",
AnimationMode = "Once",
LightSources = {
{

View File

@@ -34,7 +34,7 @@ local BennuProjection = {
Enabled = true,
Type = "RenderableModelProjection",
Body = BENNU_BODY,
GeometryFile = models .. "/BennuTextured.obj",
GeometryFile = models .. "/Bennu_v20_200k_an.obj",
Projection = {
Sequence = { images, imagesA },
SequenceType = "image-sequence",

View File

@@ -130,34 +130,36 @@ local PolyCamFov = {
}
}
local RexisFov = {
Identifier = "REXIS FOV",
Parent = Rexis.Identifier,
Renderable = {
Type = "RenderableFov",
Body = "OSIRIS-REX",
Frame = "ORX_REXIS",
RGB = { 0.8, 0.7, 0.7 },
Instrument = {
Name = "ORX_REXIS",
Method = "ELLIPSOID",
Aberration = "NONE"
},
PotentialTargets = { BENNU_BODY },
FrameConversions = {
[BENNU_BODY] = "IAU_BENNU"
}
},
GUI = {
Name = "REXIS FOV",
Path = "/Solar System/Missions/OSIRIS REx/Instruments"
}
}
-- Commenting this out as REXIS' shape is a circle, which is currently not supported in
-- the RenderableFov class
-- local RexisFov = {
-- Identifier = "REXIS FOV",
-- Parent = Rexis.Identifier,
-- Renderable = {
-- Type = "RenderableFov",
-- Body = "OSIRIS-REX",
-- Frame = "ORX_REXIS",
-- RGB = { 0.8, 0.7, 0.7 },
-- Instrument = {
-- Name = "ORX_REXIS",
-- Method = "ELLIPSOID",
-- Aberration = "NONE"
-- },
-- PotentialTargets = { BENNU_BODY },
-- FrameConversions = {
-- [BENNU_BODY] = "IAU_BENNU"
-- }
-- },
-- GUI = {
-- Name = "REXIS FOV",
-- Path = "/Solar System/Missions/OSIRIS REx/Instruments"
-- }
-- }
assetHelper.registerSceneGraphNodesAndExport(asset, {
OsirisRex,
PolyCam,
Rexis,
PolyCamFov,
RexisFov
-- RexisFov
})

View File

@@ -5,16 +5,12 @@ local modelFolder = asset.syncedResource({
Name = "Pioneer 10/11 Models",
Type = "HttpSynchronization",
Identifier = "pioneer_10_11_model",
Version = 2
Version = 3
})
local ModelRenderable = {
Type = "RenderableModel",
Geometry = {{
Type = "MultiModelGeometry",
GeometryFile = modelFolder .. "/Pioneer.obj",
ColorTexture = modelFolder .. "/gray.png"
}},
GeometryFile = modelFolder .. "/pioneer.fbx",
LightSources = assetHelper.getDefaultLightSources(
sunTransforms.SolarSystemBarycenter.Identifier
)

View File

@@ -10,7 +10,7 @@ local layer = {
"Yesterday",
"1d",
"1km",
"jpg"
"png"
),
Description = [[ Temporal coverage: 01 June 2002 - Present. The imagery resolution
is 1 km, and the temporal resolution is daily.]]

View File

@@ -42,7 +42,6 @@ local Atmosphere = {
G = 0.85
},
Debug = {
-- PreCalculatedTextureScale is a float from 1.0 to N, with N > 0.0 and N in Naturals (i.e., 1, 2, 3, 4, 5....)
PreCalculatedTextureScale = 1.0,
SaveCalculatedTextures = false
}

View File

@@ -42,7 +42,6 @@ local Atmosphere = {
G = 0.85
},
Debug = {
-- PreCalculatedTextureScale is a float from 1.0 to N, with N > 0.0 and N in Naturals (i.e., 1, 2, 3, 4, 5....)
PreCalculatedTextureScale = 1.0,
SaveCalculatedTextures = false
}

View File

@@ -20,6 +20,24 @@ local SolarSystemBarycenter = {
}
}
local SunCenter = {
Identifier = "SunCenter",
Parent = SolarSystemBarycenter.Identifier,
Transform = {
Translation = {
Type = "SpiceTranslation",
Target = "SUN",
Observer = "SSB"
}
},
GUI = {
Name = "SUN Center",
Path = "/Solar System/Sun",
Description = [[Spice frame for the Sun]],
Hidden = true
}
}
-- Spice frame for the Sun
local SunIAU = {
Identifier = "SunIAU",
@@ -67,7 +85,7 @@ local SunECLIPJ2000 = {
}
}
assetHelper.registerSceneGraphNodesAndExport(asset, { SolarSystemBarycenter, SunIAU, SunECLIPJ2000 })
assetHelper.registerSceneGraphNodesAndExport(asset, { SolarSystemBarycenter, SunCenter, SunIAU, SunECLIPJ2000 })
asset.meta = {

View File

@@ -67,7 +67,7 @@ local addCartesianAxes = function (specification)
Parent = parent,
Transform = {
Scale = {
Type = "StaticScale",
Type = "NonUniformStaticScale",
Scale = scale
},
Translation = {

View File

@@ -2,6 +2,6 @@ local DataPath = asset.syncedResource({
Name = "Launcher Images",
Type = "HttpSynchronization",
Identifier = "launcher_images",
Version = 1
Version = 2
})
asset.export("DataPath", DataPath)

View File

@@ -1,3 +1,8 @@
-- Contains the required functions to create a simple Lua state machine that can step
-- forwards and backwards through a list of states.
--
-- A state is given as a table with a Title string, and two functions: Play and Rewind
-- (see example asset)
local goToNextStateFunction = function (machine)
if (machine.currentStateIndex >= #machine.states) then

View File

@@ -3,7 +3,7 @@ asset.require('./static_server')
local guiCustomization = asset.require('customization/gui')
-- Select which commit hashes to use for the frontend and backend
local frontendHash = "96b88e6c760e59d143bd29da6f06011eaafce4b1"
local frontendHash = "829260614bb95e236d23cb500f6ec0fb2e3bdf51"
local dataProvider = "data.openspaceproject.com/files/webgui"
local frontend = asset.syncedResource({


Binary file not shown (new image, 61 KiB)

View File

@@ -54,6 +54,7 @@ set_folder_location(GhoulTest "Unit Tests")
# Spice
begin_dependency("Spice")
set(SPICE_BUILD_SHARED_LIBRARY OFF CACHE BOOL "" FORCE)
add_subdirectory(spice)
set_folder_location(spice "External")
end_dependency()

View File

@@ -53,6 +53,15 @@ struct Configuration {
};
std::map<std::string, std::string> fonts;
struct FontSizes {
float frameInfo;
float shutdown;
float log;
float cameraInfo;
float versionInfo;
};
FontSizes fontSize;
struct Logging {
std::string level = "Info";
bool forceImmediateFlush = false;
@@ -89,18 +98,17 @@ struct Configuration {
bool usePerProfileCache = false;
bool isRenderingOnMasterDisabled = false;
glm::dvec3 globalRotation = glm::dvec3(0.0);
glm::dvec3 screenSpaceRotation = glm::dvec3(0.0);
glm::dvec3 masterRotation = glm::dvec3(0.0);
glm::vec3 globalRotation = glm::vec3(0.0);
glm::vec3 screenSpaceRotation = glm::vec3(0.0);
glm::vec3 masterRotation = glm::vec3(0.0);
bool isConsoleDisabled = false;
bool bypassLauncher = false;
std::map<std::string, ghoul::Dictionary> moduleConfigurations;
std::string renderingMethod = "Framebuffer";
struct OpenGLDebugContext {
bool isActive = false;
bool printStacktrace = false;
bool isSynchronous = true;
struct IdentifierFilter {
std::string type;

View File

@@ -125,7 +125,7 @@ private:
bool _hasScheduledAssetLoading = false;
std::string _scheduledAssetPathToLoad;
glm::vec2 _mousePosition;
glm::vec2 _mousePosition = glm::vec2(0.f);
//grabs json from each module to pass to the documentation engine.
std::string _documentationJson;

View File

@@ -145,6 +145,19 @@ public:
*/
std::chrono::steady_clock::time_point currentPlaybackInterpolationTime() const;
/**
* Returns the simulated application time. This simulated application time is only
* used when playback is set to be in the mode where a screenshot is captured with
* every rendered frame (enableTakeScreenShotDuringPlayback() is used to enable this
* mode). At the start of playback, this timer is set to the value of the current
* applicationTime function provided by the window delegate (used during normal
* mode or playback). However, during playback it is incremented by the fixed
* framerate of the playback rather than the actual clock value.
*
* \returns application time in seconds, for use in playback-with-frames mode
*/
double currentApplicationInterpolationTime() const;
/**
* Starts a recording session, which will save data to the provided filename
* according to the data format specified, and will continue until recording is
@@ -575,6 +588,7 @@ public:
protected:
properties::BoolProperty _renderPlaybackInformation;
properties::BoolProperty _ignoreRecordedScale;
enum class RecordedType {
Camera = 0,
@@ -718,6 +732,7 @@ protected:
double _saveRenderingCurrentRecordedTime;
std::chrono::steady_clock::duration _saveRenderingDeltaTime_interpolation_usec;
std::chrono::steady_clock::time_point _saveRenderingCurrentRecordedTime_interpolation;
double _saveRenderingCurrentApplicationTime_interpolation;
long long _saveRenderingClockInterpolation_countsPerSec;
bool _saveRendering_isFirstFrame = true;

View File

@@ -23,9 +23,11 @@
****************************************************************************************/
#include <openspace/util/json_helper.h>
#include <ghoul/logging/logmanager.h>
#include <ghoul/lua/ghoul_lua.h>
#include <glm/ext/matrix_common.hpp>
#include <cmath>
#include <type_traits>
namespace openspace::properties {
@@ -90,6 +92,35 @@ float NumericalProperty<T>::exponent() const {
template <typename T>
void NumericalProperty<T>::setExponent(float exponent) {
ghoul_assert(std::abs(exponent) > 0.f, "Exponent for property input cannot be zero");
auto isValidRange = [](const T& minValue, const T& maxValue) {
if constexpr (ghoul::isGlmVector<T>() || ghoul::isGlmMatrix<T>()) {
return glm::all(glm::greaterThanEqual(minValue, T(0))) &&
glm::all(glm::greaterThanEqual(maxValue, T(0)));
}
else {
return (minValue >= T(0) && maxValue >= T(0));
}
};
// While the exponential slider does not support ranges with negative values,
// prevent setting the exponent for such ranges
// @ TODO (2021-06-30, emmbr), remove this check when no longer needed
if (!std::is_unsigned<T>::value) {
if (!isValidRange(_minimumValue, _maximumValue)) {
LWARNINGC(
"NumericalProperty: setExponent",
fmt::format(
"Setting exponent for properties with negative values in "
"[min, max] range is not yet supported. Property: {}",
this->fullyQualifiedIdentifier()
)
);
_exponent = 1.f;
return;
}
}
_exponent = exponent;
}

View File

@@ -1,183 +0,0 @@
/*****************************************************************************************
* *
* OpenSpace *
* *
* Copyright (c) 2014-2021 *
* *
* Permission is hereby granted, free of charge, to any person obtaining a copy of this *
* software and associated documentation files (the "Software"), to deal in the Software *
* without restriction, including without limitation the rights to use, copy, modify, *
* merge, publish, distribute, sublicense, and/or sell copies of the Software, and to *
* permit persons to whom the Software is furnished to do so, subject to the following *
* conditions: *
* *
* The above copyright notice and this permission notice shall be included in all copies *
* or substantial portions of the Software. *
* *
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, *
* INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A *
* PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT *
* HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF *
* CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE *
* OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. *
****************************************************************************************/
#ifndef __OPENSPACE_CORE___ABUFFERRENDERER___H__
#define __OPENSPACE_CORE___ABUFFERRENDERER___H__
#ifdef OPENSPACE_WITH_ABUFFER_RENDERER
#include <openspace/rendering/renderer.h>
#include <openspace/rendering/raycasterlistener.h>
#include <ghoul/glm.h>
#include <ghoul/misc/dictionary.h>
#include <ghoul/opengl/ghoul_gl.h>
#include <map>
#include <memory>
#include <string>
#include <vector>
namespace ghoul::filesystem { class File; }
namespace ghoul::opengl {
class ProgramObject;
class Texture;
} // namespace ghoul::opengl
namespace openspace {
struct RaycasterTask;
class RenderableVolume;
class Camera;
class Scene;
struct RaycastData;
class ABufferRenderer : public Renderer, public RaycasterListener {
public:
virtual ~ABufferRenderer() = default;
void initialize() override;
void deinitialize() override;
void setResolution(glm::ivec2 res) override;
void setNAaSamples(int nAaSamples) override;
void setBlurrinessLevel(int level) override;
void setHDRExposure(float hdrExposure) override;
void setGamma(float gamma) override;
void setMaxWhite(float maxWhite) override;
void setToneMapOperator(int tmOp) override;
void setBloomThreMin(float minV) override;
void setBloomThreMax(float maxV) override;
void setBloomOrigFactor(float origFactor) override;
void setBloomNewFactor(float newFactor) override;
void setKey(float key) override;
void setYwhite(float white) override;
void setTmoSaturation(float sat) override;
void setHue(float hue) override;
void setValue(float value) override;
void setSaturation(float sat) override;
void setLightness(float lightness) override;
void setColorSpace(unsigned int colorspace) override;
void enableBloom(bool enable) override;
void enableHistogram(bool enable) override;
int nAaSamples() const override;
const std::vector<double>& mSSAPattern() const override;
using Renderer::preRaycast;
void preRaycast(const RaycasterTask& raycasterTask);
using Renderer::postRaycast;
void postRaycast(const RaycasterTask& raycasterTask);
void update() override;
void render(Scene* scene, Camera* camera, float blackoutFactor) override;
/**
* Update render data
* Responsible for calling renderEngine::setRenderData
*/
virtual void updateRendererData() override;
virtual void raycastersChanged(VolumeRaycaster& raycaster,
IsAttached attached) override;
private:
void clear();
void updateResolution();
void updateRaycastData();
void updateResolveDictionary();
void updateMSAASamplingPattern();
void saveTextureToMemory(GLenum color_buffer_attachment, int width, int height,
std::vector<double> & memory) const;
glm::ivec2 _resolution = glm::ivec2(0);
bool _dirtyResolution = true;
bool _dirtyRendererData = true;
bool _dirtyRaycastData = true;
bool _dirtyResolveDictionary = true;
std::unique_ptr<ghoul::opengl::ProgramObject> _resolveProgram = nullptr;
/**
* When a volume is attached or detached from the scene graph,
* the resolve program needs to be recompiled.
* The _volumes map keeps track of which volumes that can
* be rendered using the current resolve program, along with their raycast data
* (id, namespace, etc)
*/
std::map<VolumeRaycaster*, RaycastData> _raycastData;
std::map<
VolumeRaycaster*, std::unique_ptr<ghoul::opengl::ProgramObject>
> _boundsPrograms;
std::vector<std::string> _helperPaths;
ghoul::Dictionary _resolveDictionary;
GLuint _mainColorTexture;
GLuint _mainDepthTexture;
GLuint _mainFramebuffer;
GLuint _screenQuad;
GLuint _anchorPointerTexture;
GLuint _anchorPointerTextureInitializer;
GLuint _atomicCounterBuffer;
GLuint _fragmentBuffer;
GLuint _fragmentTexture;
GLuint _vertexPositionBuffer;
int _nAaSamples;
int _blurrinessLevel = 1;
float _hdrExposure = 0.4f;
float _hdrBackground = 2.8f;
float _gamma = 2.2f;
float _maxWhite = 1.f;
float _blackoutFactor;
bool _bloomEnabled = false;
float _bloomThresholdMin = 0.0;
float _bloomThresholdMax = 1.0;
float _bloomOrigFactor = 1.0;
float _bloomNewFactor = 1.0;
int _toneMapOperator = 0;
bool _histogramEnabled = false;
int _numberOfBins = 1024; // JCC TODO: Add a parameter control for this.
float _tmoKey = 0.18f;
float _tmoYwhite = 1e6f;
float _tmoSaturation = 1.0f;
float _hue = 1.f;
float _saturation = 1.f;
float _value = 1.f;
float _lightness = 1.f;
unsigned int _colorSpace = 1;
std::vector<double> _mSAAPattern;
ghoul::Dictionary _rendererData;
};
} // namespace openspace
#endif // OPENSPACE_WITH_ABUFFER_RENDERER
#endif // __OPENSPACE_CORE___ABUFFERRENDERER___H__

View File

@@ -51,8 +51,6 @@ public:
const DeferredcastData& /*deferredData*/,
ghoul::opengl::ProgramObject& /*program*/) {};
virtual std::filesystem::path deferredcastPath() const = 0;
virtual std::filesystem::path deferredcastVSPath() const = 0;
virtual std::filesystem::path deferredcastFSPath() const = 0;

View File

@@ -25,8 +25,6 @@
#ifndef __OPENSPACE_CORE___FRAMEBUFFERRENDERER___H__
#define __OPENSPACE_CORE___FRAMEBUFFERRENDERER___H__
#include <openspace/rendering/renderer.h>
#include <openspace/rendering/renderengine.h>
#include <openspace/rendering/raycasterlistener.h>
#include <openspace/rendering/deferredcasterlistener.h>
@@ -56,14 +54,14 @@ struct RaycasterTask;
class Scene;
struct UpdateStructures;
class FramebufferRenderer : public Renderer, public RaycasterListener,
class FramebufferRenderer : public RaycasterListener,
public DeferredcasterListener
{
public:
virtual ~FramebufferRenderer() = default;
void initialize() override;
void deinitialize() override;
void initialize();
void deinitialize();
void updateResolution();
void updateRaycastData();
@@ -72,33 +70,33 @@ public:
void updateFXAA();
void updateDownscaledVolume();
void setResolution(glm::ivec2 res) override;
void setHDRExposure(float hdrExposure) override;
void setGamma(float gamma) override;
void setHue(float hue) override;
void setValue(float value) override;
void setSaturation(float sat) override;
void setResolution(glm::ivec2 res);
void setHDRExposure(float hdrExposure);
void setGamma(float gamma);
void setHue(float hue);
void setValue(float value);
void setSaturation(float sat);
void enableFXAA(bool enable) override;
void setDisableHDR(bool disable) override;
void enableFXAA(bool enable);
void setDisableHDR(bool disable);
void update() override;
void update();
void performRaycasterTasks(const std::vector<RaycasterTask>& tasks,
const glm::ivec4& viewport);
void performDeferredTasks(const std::vector<DeferredcasterTask>& tasks,
const glm::ivec4& viewport);
void render(Scene* scene, Camera* camera, float blackoutFactor) override;
void render(Scene* scene, Camera* camera, float blackoutFactor);
/**
* Update render data
* Responsible for calling renderEngine::setRenderData
*/
virtual void updateRendererData() override;
virtual void updateRendererData();
virtual void raycastersChanged(VolumeRaycaster& raycaster,
RaycasterListener::IsAttached attached) override;
RaycasterListener::IsAttached attached);
virtual void deferredcastersChanged(Deferredcaster& deferredcaster,
DeferredcasterListener::IsAttached isAttached) override;
DeferredcasterListener::IsAttached isAttached);
private:
using RaycasterProgObjMap = std::map<

View File

@@ -33,6 +33,7 @@
#include <openspace/properties/scalar/floatproperty.h>
#include <openspace/properties/vector/vec3property.h>
#include <openspace/properties/triggerproperty.h>
#include <openspace/rendering/framebufferrenderer.h>
#include <chrono>
#include <filesystem>
@@ -54,7 +55,6 @@ namespace scripting { struct LuaLibrary; }
class Camera;
class RaycasterManager;
class DeferredcasterManager;
class Renderer;
class Scene;
class SceneManager;
class ScreenLog;
@@ -63,12 +63,6 @@ struct ShutdownInformation;
class RenderEngine : public properties::PropertyOwner {
public:
enum class RendererImplementation {
Framebuffer = 0,
ABuffer,
Invalid
};
RenderEngine();
~RenderEngine();
@@ -80,9 +74,6 @@ public:
Scene* scene();
void updateScene();
const Renderer& renderer() const;
RendererImplementation rendererImplementation() const;
ghoul::opengl::OpenGLStateCache& openglStateCache();
void updateShaderPrograms();
@@ -120,24 +111,11 @@ public:
void removeRenderProgram(ghoul::opengl::ProgramObject* program);
/**
* Set raycasting uniforms on the program object, and setup raycasting.
*/
void preRaycast(ghoul::opengl::ProgramObject& programObject);
/**
* Tear down raycasting for the specified program object.
*/
void postRaycast(ghoul::opengl::ProgramObject& programObject);
/**
* Set the camera to use for rendering
*/
void setCamera(Camera* camera);
void setRendererFromString(const std::string& renderingMethod);
/**
* Lets the renderer update the data to be brought into the rendererer programs
* as a 'rendererData' variable in the dictionary.
@@ -176,9 +154,6 @@ public:
uint64_t frameNumber() const;
private:
void setRenderer(std::unique_ptr<Renderer> renderer);
RendererImplementation rendererFromString(const std::string& renderingMethod) const;
void renderScreenLog();
void renderVersionInformation();
void renderCameraInformation();
@@ -188,13 +163,12 @@ private:
Camera* _camera = nullptr;
Scene* _scene = nullptr;
std::unique_ptr<Renderer> _renderer;
RendererImplementation _rendererImplementation = RendererImplementation::Invalid;
FramebufferRenderer _renderer;
ghoul::Dictionary _rendererData;
ghoul::Dictionary _resolveData;
ScreenLog* _log = nullptr;
ghoul::opengl::OpenGLStateCache* _openglStateCache;
ghoul::opengl::OpenGLStateCache* _openglStateCache = nullptr;
properties::BoolProperty _showOverlayOnSlaves;
properties::BoolProperty _showLog;
@@ -233,8 +207,9 @@ private:
std::vector<ghoul::opengl::ProgramObject*> _programs;
std::shared_ptr<ghoul::fontrendering::Font> _fontFrameInfo;
std::shared_ptr<ghoul::fontrendering::Font> _fontInfo;
std::shared_ptr<ghoul::fontrendering::Font> _fontDate;
std::shared_ptr<ghoul::fontrendering::Font> _fontCameraInfo;
std::shared_ptr<ghoul::fontrendering::Font> _fontVersionInfo;
std::shared_ptr<ghoul::fontrendering::Font> _fontShutdown;
std::shared_ptr<ghoul::fontrendering::Font> _fontLog;
struct {

View File

@@ -238,7 +238,7 @@ private:
* Update dependencies.
*/
void updateNodeRegistry();
std::chrono::steady_clock::time_point currentTimeForInterpolation();
void sortTopologically();
std::unique_ptr<Camera> _camera;

View File

@@ -31,9 +31,10 @@
#include <openspace/scripting/lualibrary.h>
#include <ghoul/lua/luastate.h>
#include <ghoul/misc/boolean.h>
#include <filesystem>
#include <mutex>
#include <queue>
#include <optional>
#include <queue>
#include <functional>
namespace openspace { class SyncBuffer; }
@@ -82,7 +83,7 @@ public:
bool hasLibrary(const std::string& name);
bool runScript(const std::string& script, ScriptCallback callback = ScriptCallback());
bool runScriptFile(const std::string& filename);
bool runScriptFile(const std::filesystem::path& filename);
bool writeLog(const std::string& script);
@@ -125,7 +126,6 @@ private:
// Logging variables
bool _logFileExists = false;
bool _logScripts = true;
std::string _logType;
std::string _logFilename;
};

View File

@@ -36,7 +36,7 @@ template<typename P>
void ConcurrentJobManager<P>::enqueueJob(std::shared_ptr<Job<P>> job) {
threadPool.enqueue([this, job]() {
job->execute();
std::lock_guard<std::mutex> lock(_finishedJobsMutex);
std::lock_guard lock(_finishedJobsMutex);
_finishedJobs.push(job);
});
}
@@ -50,7 +50,7 @@ template<typename P>
std::shared_ptr<Job<P>> ConcurrentJobManager<P>::popFinishedJob() {
ghoul_assert(!_finishedJobs.empty(), "There is no finished job to pop!");
std::lock_guard<std::mutex> lock(_finishedJobsMutex);
std::lock_guard lock(_finishedJobsMutex);
return _finishedJobs.pop();
}

View File

@@ -26,18 +26,45 @@
#define __OPENSPACE_CORE___COORDINATECONVERSION___H__
#include <ghoul/glm.h>
#include <string>
namespace openspace {
/**
* Converts from ICRS coordinates to galactic cartesian coordinates.
* Converts from ICRS decimal degrees coordinates to galactic cartesian coordinates.
* \param ra Right ascension, given in decimal degrees
* \param dec Declination, given in decimal degrees
* \param distance The distance, or radius, to the position given in any unit.
* \return A position in galactic cartesian coordinates, given in the same unit as the
* distance parameter.
*/
glm::dvec3 icrsToGalacticCartesian(float ra, float dec, double distance);
glm::dvec3 icrsToGalacticCartesian(double ra, double dec, double distance);
/**
* Converts from ICRS (hms and dms) coordinates to decimal degrees.
* \param ra Right ascension, given as a string in format 'XhYmZs'
* \param dec Declination, given as a string in format 'XdYmZs'
* \return The decimal degrees coordinate in degrees
*/
glm::dvec2 icrsToDecimalDegrees(const std::string& ra, const std::string& dec);
/**
* Converts from galactic cartesian coordinates to ICRS decimal degrees coordinates
* and distance.
* \param x X coordinate
* \param y Y coordinate
* \param z Z coordinate
* \return A vector with the ra and dec decimal degrees in degrees and distance.
*/
glm::dvec3 galacticCartesianToIcrs(double x, double y, double z);
/**
* Converts from ICRS decimal degrees coordinates to ICRS hms and dms coordinates.
* \param ra Right ascension, given in decimal degrees
* \param dec Declination, given in decimal degrees
* \return A pair with the ra and dec strings in hms and dms format.
*/
std::pair<std::string, std::string> decimalDegreesToIcrs(double ra, double dec);
} // namespace openspace
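The conversions documented in this header follow the standard spherical-to-Cartesian relations. As a sketch (the rotation matrix \(A_G\) is the conventional ICRS-to-galactic constant matrix, an assumption here rather than something stated in this diff):

```latex
% hms/dms to decimal degrees (sign of \delta follows the sign of d)
\alpha = 15\left(h + \frac{m}{60} + \frac{s}{3600}\right), \qquad
\delta = \pm\left(d + \frac{m}{60} + \frac{s}{3600}\right)

% ICRS unit direction scaled by distance r, then rotated into the galactic frame
\mathbf{v}_{\mathrm{eq}} = r
\begin{pmatrix}
  \cos\delta \cos\alpha \\
  \cos\delta \sin\alpha \\
  \sin\delta
\end{pmatrix},
\qquad
\mathbf{v}_{\mathrm{gal}} = A_G \, \mathbf{v}_{\mathrm{eq}}
```

The inverse path (`galacticCartesianToIcrs`) applies \(A_G^{-1} = A_G^{\top}\) and converts back to \((\alpha, \delta, r)\) with `atan2`; the distance is preserved since the rotation is orthonormal.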

View File

@@ -32,16 +32,16 @@
namespace openspace {
/**
* Base class for keyframes
*/
* Base class for keyframes
*/
struct KeyframeBase {
size_t id;
double timestamp;
};
/**
* Templated class for keyframes containing data
*/
* Templated class for keyframes containing data
*/
template <typename T>
struct Keyframe : public KeyframeBase {
Keyframe(size_t i, double t, T d);
@@ -54,8 +54,8 @@ struct Keyframe : public KeyframeBase {
};
/**
* Templated class for timelines
*/
* Templated class for timelines
*/
template <typename T>
class Timeline {
public:
@@ -81,20 +81,30 @@ private:
};
/**
* Return true if the timestamp of a is smaller the timestamp of b.
*/
* Return true if the timestamp of a is smaller the timestamp of b.
*/
bool compareKeyframeTimes(const KeyframeBase& a, const KeyframeBase& b);
/**
* Return true if a is smaller than the timestamp of b.
*/
* Return true if a is smaller than the timestamp of b.
*/
bool compareTimeWithKeyframeTime(double a, const KeyframeBase& b);
/**
* Return true if the timestamp of a is smaller than b.
*/
* Return true if the timestamp of a is smaller than b.
*/
bool compareKeyframeTimeWithTime(const KeyframeBase& a, double b);
/**
* Return true if the timestamp of a is smaller than or equal to b.
* This is used only in the mode of saving render frames during session recording
* playback. This was necessary to correct a small timing issue caused by fixing
* the application time according to the playback framerate. In normal operation,
* the application time at the instant the keyframes are evaluated is always a
* little bit newer than the first keyframe in the timeline.
*/
bool compareKeyframeTimeWithTime_playbackWithFrames(const KeyframeBase& a, double b);
} // namespace openspace
#include "timeline.inl"

View File

@@ -121,12 +121,14 @@ public:
private:
void progressTime(double dt);
void applyKeyframeData(const TimeKeyframeData& keyframe);
void applyKeyframeData(const TimeKeyframeData& keyframe, double dt);
TimeKeyframeData interpolate(const Keyframe<TimeKeyframeData>& past,
const Keyframe<TimeKeyframeData>& future, double time);
void addDeltaTimesKeybindings();
void clearDeltaTimesKeybindings();
double currentApplicationTimeForInterpolation() const;
double previousApplicationTimeForInterpolation() const;
Timeline<TimeKeyframeData> _timeline;
SyncData<Time> _currentTime;
@@ -139,6 +141,7 @@ private:
bool _lastTimePaused = false;
double _lastDeltaTime = 0.0;
double _lastTargetDeltaTime = 0.0;
double _previousApplicationTime = 0.0;
bool _deltaTimeStepsChanged = false;
std::vector<double> _deltaTimeSteps;

File diff suppressed because it is too large


@@ -57,16 +57,17 @@ struct ShadowRenderingStruct {
class AtmosphereDeferredcaster : public Deferredcaster {
public:
AtmosphereDeferredcaster(float textureScale,
std::vector<ShadowConfiguration> shadowConfigArray, bool saveCalculatedTextures);
virtual ~AtmosphereDeferredcaster() = default;
void initialize();
void deinitialize();
void preRaycast(const RenderData& renderData, const DeferredcastData& deferredData,
void preRaycast(const RenderData& data, const DeferredcastData& deferredData,
ghoul::opengl::ProgramObject& program) override;
void postRaycast(const RenderData& renderData, const DeferredcastData& deferredData,
void postRaycast(const RenderData& data, const DeferredcastData& deferredData,
ghoul::opengl::ProgramObject& program) override;
std::filesystem::path deferredcastPath() const override;
std::filesystem::path deferredcastVSPath() const override;
std::filesystem::path deferredcastFSPath() const override;
std::filesystem::path helperPath() const override;
@@ -75,82 +76,64 @@ public:
void update(const UpdateData&) override;
void preCalculateAtmosphereParam();
void calculateAtmosphereParameters();
void setModelTransform(glm::dmat4 transform);
void setTime(double time);
void setAtmosphereRadius(float atmRadius);
void setPlanetRadius(float planetRadius);
void setPlanetAverageGroundReflectance(float averageGReflectance);
void setPlanetGroundRadianceEmission(float groundRadianceEmission);
void setRayleighHeightScale(float rayleighHeightScale);
void enableOzone(bool enable);
void setOzoneHeightScale(float ozoneHeightScale);
void setMieHeightScale(float mieHeightScale);
void setMiePhaseConstant(float miePhaseConstant);
void setSunRadianceIntensity(float sunRadiance);
void setRayleighScatteringCoefficients(glm::vec3 rayScattCoeff);
void setOzoneExtinctionCoefficients(glm::vec3 ozoneExtCoeff);
void setMieScatteringCoefficients(glm::vec3 mieScattCoeff);
void setMieExtinctionCoefficients(glm::vec3 mieExtCoeff);
void setEllipsoidRadii(glm::dvec3 radii);
void setShadowConfigArray(std::vector<ShadowConfiguration> shadowConfigArray);
void setHardShadows(bool enabled);
void enableSunFollowing(bool enable);
void setPrecalculationTextureScale(float preCalculatedTexturesScale);
void enablePrecalculationTexturesSaving();
void setParameters(float atmosphereRadius, float planetRadius,
float averageGroundReflectance, float groundRadianceEmission,
float rayleighHeightScale, bool enableOzone, float ozoneHeightScale,
float mieHeightScale, float miePhaseConstant, float sunRadiance,
glm::vec3 rayScatteringCoefficients, glm::vec3 ozoneExtinctionCoefficients,
glm::vec3 mieScatteringCoefficients, glm::vec3 mieExtinctionCoefficients,
bool sunFollowing);
void setHardShadows(bool enabled);
private:
void loadComputationPrograms();
void unloadComputationPrograms();
void createComputationTextures();
void deleteComputationTextures();
void deleteUnusedComputationTextures();
void executeCalculations(GLuint quadCalcVAO, GLenum drawBuffers[1],
GLsizei vertexSize);
void step3DTexture(ghoul::opengl::ProgramObject& shaderProg, int layer,
bool doCalculation);
void loadAtmosphereDataIntoShaderProgram(ghoul::opengl::ProgramObject& shaderProg);
void step3DTexture(ghoul::opengl::ProgramObject& prg, int layer);
void calculateTransmittance();
GLuint calculateDeltaE();
std::pair<GLuint, GLuint> calculateDeltaS();
void calculateIrradiance();
void calculateInscattering(GLuint deltaSRayleigh, GLuint deltaSMie);
void calculateDeltaJ(int scatteringOrder,
ghoul::opengl::ProgramObject& program, GLuint deltaJ, GLuint deltaE,
GLuint deltaSRayleigh, GLuint deltaSMie);
void calculateDeltaE(int scatteringOrder,
ghoul::opengl::ProgramObject& program, GLuint deltaE, GLuint deltaSRayleigh,
GLuint deltaSMie);
void calculateDeltaS(int scatteringOrder,
ghoul::opengl::ProgramObject& program, GLuint deltaSRayleigh, GLuint deltaJ);
void calculateIrradiance(int scatteringOrder,
ghoul::opengl::ProgramObject& program, GLuint deltaE);
void calculateInscattering(int scatteringOrder,
ghoul::opengl::ProgramObject& program, GLuint deltaSRayleigh);
std::unique_ptr<ghoul::opengl::ProgramObject> _transmittanceProgramObject;
std::unique_ptr<ghoul::opengl::ProgramObject> _irradianceProgramObject;
std::unique_ptr<ghoul::opengl::ProgramObject> _irradianceSupTermsProgramObject;
std::unique_ptr<ghoul::opengl::ProgramObject> _irradianceFinalProgramObject;
std::unique_ptr<ghoul::opengl::ProgramObject> _inScatteringProgramObject;
std::unique_ptr<ghoul::opengl::ProgramObject> _inScatteringSupTermsProgramObject;
std::unique_ptr<ghoul::opengl::ProgramObject> _deltaEProgramObject;
std::unique_ptr<ghoul::opengl::ProgramObject> _deltaSProgramObject;
std::unique_ptr<ghoul::opengl::ProgramObject> _deltaSSupTermsProgramObject;
std::unique_ptr<ghoul::opengl::ProgramObject> _deltaJProgramObject;
UniformCache(cullAtmosphere, Rg, Rt, groundRadianceEmission, HR, betaRayleigh, HM,
betaMieExtinction, mieG, sunRadiance, ozoneLayerEnabled, HO, betaOzoneExtinction,
SAMPLES_R, SAMPLES_MU, SAMPLES_MU_S, SAMPLES_NU, dInverseModelTransformMatrix,
dModelTransformMatrix, dSgctProjectionToModelTransformMatrix,
dSGCTViewToWorldMatrix, dCamPosObj, sunDirectionObj, hardShadows,
transmittanceTexture, irradianceTexture, inscatterTexture) _uniformCache;
GLuint _transmittanceTableTexture = 0;
GLuint _irradianceTableTexture = 0;
GLuint _inScatteringTableTexture = 0;
GLuint _deltaETableTexture = 0;
GLuint _deltaSRayleighTableTexture = 0;
GLuint _deltaSMieTableTexture = 0;
GLuint _deltaJTableTexture = 0;
SAMPLES_R, SAMPLES_MU, SAMPLES_MU_S, SAMPLES_NU, inverseModelTransformMatrix,
modelTransformMatrix, projectionToModelTransform, viewToWorldMatrix,
camPosObj, sunDirectionObj, hardShadows, transmittanceTexture, irradianceTexture,
inscatterTexture) _uniformCache;
ghoul::opengl::TextureUnit _transmittanceTableTextureUnit;
ghoul::opengl::TextureUnit _irradianceTableTextureUnit;
ghoul::opengl::TextureUnit _inScatteringTableTextureUnit;
GLuint _transmittanceTableTexture = 0;
GLuint _irradianceTableTexture = 0;
GLuint _inScatteringTableTexture = 0;
// Atmosphere Data
bool _atmosphereCalculated = false;
bool _ozoneEnabled = false;
bool _sunFollowingCameraEnabled = false;
float _atmosphereRadius = 0.f;
float _atmospherePlanetRadius = 0.f;
float _planetAverageGroundReflectance = 0.f;
float _planetGroundRadianceEmission = 0.f;
float _averageGroundReflectance = 0.f;
float _groundRadianceEmission = 0.f;
float _rayleighHeightScale = 0.f;
float _ozoneHeightScale = 0.f;
float _mieHeightScale = 0.f;
@@ -161,34 +144,33 @@ private:
glm::vec3 _ozoneExtinctionCoeff = glm::vec3(0.f);
glm::vec3 _mieScatteringCoeff = glm::vec3(0.f);
glm::vec3 _mieExtinctionCoeff = glm::vec3(0.f);
glm::dvec3 _ellipsoidRadii = glm::vec3(0.f);
// Atmosphere Texture Dimensions
glm::ivec2 _transmittanceTableSize = glm::ivec2(256, 64);
glm::ivec2 _irradianceTableSize = glm::ivec2(64, 16);
glm::ivec2 _deltaETableSize = glm::ivec2(64, 16);
int _r_samples = 32;
int _mu_samples = 128;
int _mu_s_samples = 32;
int _nu_samples = 8;
const glm::ivec2 _transmittanceTableSize;
const glm::ivec2 _irradianceTableSize;
const glm::ivec2 _deltaETableSize;
const int _muSSamples;
const int _nuSamples;
const int _muSamples;
const int _rSamples;
const glm::ivec3 _textureSize;
glm::dmat4 _modelTransform;
double _time = 0.0;
// Eclipse Shadows
std::vector<ShadowConfiguration> _shadowConfArray;
std::vector<ShadowRenderingStruct> _shadowDataArrayCache;
bool _hardShadowsEnabled = false;
// Atmosphere Debugging
bool _saveCalculationTextures = false;
const bool _saveCalculationTextures = false;
std::vector<ShadowRenderingStruct> _shadowDataArrayCache;
// Assuming < 1000 shadow casters, the longest uniform name that we are getting is
// shadowDataArray[999].casterPositionVec
// which needs to fit into the uniform buffer
char _uniformNameBuffer[40];
};
} // openspace
} // namespace openspace
#endif // __OPENSPACE_MODULE_ATMOSPHERE___ATMOSPHEREDEFERREDCASTER___H__


@@ -244,7 +244,7 @@ RenderableAtmosphere::RenderableAtmosphere(const ghoul::Dictionary& dictionary)
MieScatteringCoeffInfo,
glm::vec3(0.004f), glm::vec3(0.00001f), glm::vec3(1.f)
)
, _mieScatteringExtinctionPropCoefficient(
, _mieScatteringExtinctionPropCoeff(
MieScatteringExtinctionPropCoeffInfo,
0.9f, 0.01f, 1.f
)
@@ -328,16 +328,15 @@ RenderableAtmosphere::RenderableAtmosphere(const ghoul::Dictionary& dictionary)
_miePhaseConstant.onChange(updateWithCalculation);
addProperty(_miePhaseConstant);
_mieScatteringExtinctionPropCoefficient =
_mieScatteringExtinctionPropCoeff =
_mieScattExtPropCoefProp != 1.f ? _mieScattExtPropCoefProp :
_mieScatteringCoeff.value().x / _mieExtinctionCoeff.x;
_mieScatteringExtinctionPropCoefficient.onChange(updateWithCalculation);
addProperty(_mieScatteringExtinctionPropCoefficient);
_mieScatteringExtinctionPropCoeff.onChange(updateWithCalculation);
addProperty(_mieScatteringExtinctionPropCoeff);
if (p.debug.has_value()) {
_preCalculatedTexturesScale =
p.debug->preCalculatedTextureScale.value_or(_preCalculatedTexturesScale);
_textureScale = p.debug->preCalculatedTextureScale.value_or(_textureScale);
_saveCalculationsToTexture =
p.debug->saveCalculatedTextures.value_or(_saveCalculationsToTexture);
@@ -364,15 +363,13 @@ void RenderableAtmosphere::deinitializeGL() {
}
void RenderableAtmosphere::initializeGL() {
_deferredcaster = std::make_unique<AtmosphereDeferredcaster>();
_deferredcaster = std::make_unique<AtmosphereDeferredcaster>(
_textureScale,
_shadowEnabled ? std::move(_shadowConfArray) : std::vector<ShadowConfiguration>(),
_saveCalculationsToTexture
);
_shadowConfArray.clear();
updateAtmosphereParameters();
if (_shadowEnabled) {
_deferredcaster->setShadowConfigArray(_shadowConfArray);
// We no longer need it
_shadowConfArray.clear();
}
_deferredcaster->initialize();
global::deferredcasterManager->attachDeferredcaster(*_deferredcaster);
@@ -382,13 +379,11 @@ bool RenderableAtmosphere::isReady() const {
return true;
}
glm::dmat4 RenderableAtmosphere::computeModelTransformMatrix(
const TransformData& transformData)
{
glm::dmat4 RenderableAtmosphere::computeModelTransformMatrix(const TransformData& data) {
// Scale the planet to the appropriate size since the planet is a unit sphere
return glm::translate(glm::dmat4(1.0), transformData.translation) *
glm::dmat4(transformData.rotation) *
glm::scale(glm::dmat4(1.0), glm::dvec3(transformData.scale));
return glm::translate(glm::dmat4(1.0), data.translation) *
glm::dmat4(data.rotation) *
glm::scale(glm::dmat4(1.0), glm::dvec3(data.scale));
}
void RenderableAtmosphere::render(const RenderData& data, RendererTasks& renderTask) {
@@ -404,11 +399,10 @@ void RenderableAtmosphere::update(const UpdateData& data) {
_deferredCasterNeedsUpdate = false;
}
if (_deferredCasterNeedsCalculation) {
_deferredcaster->preCalculateAtmosphereParam();
_deferredcaster->calculateAtmosphereParameters();
_deferredCasterNeedsCalculation = false;
}
_deferredcaster->setTime(data.time.j2000Seconds());
glm::dmat4 modelTransform = computeModelTransformMatrix(data.modelTransform);
_deferredcaster->setModelTransform(modelTransform);
_deferredcaster->update(data);
@@ -416,34 +410,26 @@ void RenderableAtmosphere::update(const UpdateData& data) {
void RenderableAtmosphere::updateAtmosphereParameters() {
_mieExtinctionCoeff =
_mieScatteringCoeff.value() / _mieScatteringExtinctionPropCoefficient.value();
_mieScatteringCoeff.value() / _mieScatteringExtinctionPropCoeff.value();
_deferredcaster->setAtmosphereRadius(_planetRadius + _atmosphereHeight);
_deferredcaster->setPlanetRadius(_planetRadius);
_deferredcaster->setPlanetAverageGroundReflectance(_groundAverageReflectance);
_deferredcaster->setPlanetGroundRadianceEmission(_groundRadianceEmission);
_deferredcaster->setRayleighHeightScale(_rayleighHeightScale);
_deferredcaster->enableOzone(_ozoneEnabled);
_deferredcaster->setOzoneHeightScale(_ozoneHeightScale);
_deferredcaster->setMieHeightScale(_mieHeightScale);
_deferredcaster->setMiePhaseConstant(_miePhaseConstant);
_deferredcaster->setSunRadianceIntensity(_sunIntensity);
_deferredcaster->setRayleighScatteringCoefficients(_rayleighScatteringCoeff);
_deferredcaster->setOzoneExtinctionCoefficients(_ozoneCoeff);
_deferredcaster->setMieScatteringCoefficients(_mieScatteringCoeff);
_deferredcaster->setMieExtinctionCoefficients(_mieExtinctionCoeff);
_deferredcaster->enableSunFollowing(_sunFollowingCameraEnabled);
// TODO: Fix the ellipsoid nature of the renderable globe (JCC)
//_deferredcaster->setEllipsoidRadii(_ellipsoid.radii());
_deferredcaster->setPrecalculationTextureScale(_preCalculatedTexturesScale);
if (_saveCalculationsToTexture) {
_deferredcaster->enablePrecalculationTexturesSaving();
}
if (_shadowEnabled) {
_deferredcaster->setHardShadows(_hardShadowsEnabled);
}
_deferredcaster->setParameters(
_planetRadius + _atmosphereHeight,
_planetRadius,
_groundAverageReflectance,
_groundRadianceEmission,
_rayleighHeightScale,
_ozoneEnabled,
_ozoneHeightScale,
_mieHeightScale,
_miePhaseConstant,
_sunIntensity,
_rayleighScatteringCoeff,
_ozoneCoeff,
_mieScatteringCoeff,
_mieExtinctionCoeff,
_sunFollowingCameraEnabled
);
_deferredcaster->setHardShadows(_hardShadowsEnabled);
}
} // namespace openspace


@@ -72,7 +72,7 @@ public:
static documentation::Documentation Documentation();
private:
glm::dmat4 computeModelTransformMatrix(const openspace::TransformData& transformData);
glm::dmat4 computeModelTransformMatrix(const openspace::TransformData& data);
void updateAtmosphereParameters();
properties::FloatProperty _atmosphereHeight;
@@ -85,7 +85,7 @@ private:
properties::Vec3Property _ozoneCoeff;
properties::FloatProperty _mieHeightScale;
properties::Vec3Property _mieScatteringCoeff;
properties::FloatProperty _mieScatteringExtinctionPropCoefficient;
properties::FloatProperty _mieScatteringExtinctionPropCoeff;
properties::FloatProperty _miePhaseConstant;
properties::FloatProperty _sunIntensity;
properties::BoolProperty _sunFollowingCameraEnabled;
@@ -98,7 +98,7 @@ private:
// Atmosphere Debug
bool _saveCalculationsToTexture = false;
float _preCalculatedTexturesScale = 1.f;
float _textureScale = 1.f;
std::unique_ptr<AtmosphereDeferredcaster> _deferredcaster;


@@ -54,85 +54,11 @@
* THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
// Atmosphere Rendering Parameters
uniform float Rg;
uniform float Rt;
uniform float AverageGroundReflectance;
uniform float groundRadianceEmission;
uniform float HR;
uniform vec3 betaRayleigh;
uniform float HO;
uniform vec3 betaOzoneExtinction;
uniform float HM;
uniform vec3 betaMieScattering;
uniform vec3 betaMieExtinction;
uniform float mieG;
uniform float sunRadiance;
uniform bool ozoneLayerEnabled;
uniform ivec2 TRANSMITTANCE;
uniform ivec2 SKY;
uniform ivec2 OTHER_TEXTURES;
uniform int SAMPLES_R;
uniform int SAMPLES_MU;
uniform int SAMPLES_MU_S;
uniform int SAMPLES_NU;
const int INSCATTER_INTEGRAL_SAMPLES = 50;
const float M_PI = 3.141592657;
const float ATM_EPSILON = 1.0;
// Integration steps
const int TRANSMITTANCE_STEPS = 500;
const int INSCATTER_INTEGRAL_SAMPLES = 50;
const int IRRADIANCE_INTEGRAL_SAMPLES = 32;
const int INSCATTER_SPHERICAL_INTEGRAL_SAMPLES = 16;
const float M_PI = 3.141592657;
const float M_2PI = 2.0 * M_PI;
uniform sampler2D transmittanceTexture;
float Rg2 = Rg * Rg;
float Rt2 = Rt * Rt;
float H = sqrt(Rt2 - Rg2);
float H2 = Rt2 - Rg2;
float invSamplesMu = 1.0 / float(SAMPLES_MU);
float invSamplesR = 1.0 / float(SAMPLES_R);
float invSamplesMuS = 1.0 / float(SAMPLES_MU_S);
float invSamplesNu = 1.0 / float(SAMPLES_NU);
float RtMinusRg = float(Rt - Rg);
float invRtMinusRg = 1.0 / RtMinusRg;
float opticalDepth(float localH, float r, float mu, float d) {
float invH = 1.0 / localH;
float a = sqrt(0.5 * invH * r);
vec2 a01 = a * vec2(mu, mu + d / r);
vec2 a01s = sign(a01);
vec2 a01sq = a01 * a01;
float x = a01s.y > a01s.x ? exp(a01sq.x) : 0.0;
vec2 y = a01s / (2.3193 * abs(a01) + sqrt(1.52 * a01sq + 4.0)) * vec2(1.0, exp(-d * invH * (d / (2.0 * r) + mu)));
return sqrt(M_2PI * H * r) * exp((Rg-r)*invH) * (x + dot(y, vec2(1.0, -1.0)));
}
vec3 analyticTransmittance(float r, float mu, float d) {
vec3 ozone = vec3(0.0);
if (ozoneLayerEnabled) {
ozone = betaOzoneExtinction * (0.0000006) * opticalDepth(HO, r, mu, d);
}
return exp(-betaRayleigh * opticalDepth(HR, r, mu, d) - ozone -
betaMieExtinction * opticalDepth(HM, r, mu, d));
}
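The `opticalDepth()` approximation above stands in for a numerical integral of the exponential density profile exp(-(h - Rg)/H) along the view ray. A CPU-side C++ reference integration for sanity-checking it (a sketch, not part of the codebase; units in km as in the shader):

```cpp
#include <cassert>
#include <cmath>

// Trapezoidal reference for the optical depth along a ray that starts at
// radius r with view cosine mu and runs for distance d, integrating the
// density exp(-(height - Rg) / H)
double opticalDepthNumerical(double H, double Rg, double r, double mu, double d,
                             int steps = 500)
{
    double sum = 0.0;
    const double dt = d / steps;
    for (int i = 0; i <= steps; ++i) {
        const double t = i * dt;
        // Radius at parameter t, from the cosine law: r(t)^2 = r^2 + t^2 + 2 r t mu
        const double rt = std::sqrt(r * r + t * t + 2.0 * r * t * mu);
        const double w = (i == 0 || i == steps) ? 0.5 : 1.0;
        sum += w * std::exp(-(rt - Rg) / H) * dt;
    }
    return sum;
}
```

For a ray pointing straight up (mu = 1) the integral has the closed form H·exp(-(r - Rg)/H)·(1 - exp(-d/H)), which makes a convenient cross-check.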
vec3 irradiance(sampler2D sampler, float r, float muSun) {
float u_r = (r - Rg) * invRtMinusRg;
float u_muSun = (muSun + 0.2) / 1.2;
return texture(sampler, vec2(u_muSun, u_r)).rgb;
}
//================================================//
//=============== General Functions ==============//
//================================================//
// In the following shaders r (altitude) is the length of vector/position x in the
// atmosphere (or on the top of it when considering an observer in space), where the light
// is coming from the opposite direction of the view direction, here the vector v or
@@ -142,7 +68,7 @@ vec3 irradiance(sampler2D sampler, float r, float muSun) {
// or top of atmosphere
// r := || vec(x) || in [0, Rt]
// mu := cosine of the zenith angle of vec(v). Or mu = (vec(x) * vec(v))/r
float rayDistance(float r, float mu) {
float rayDistance(float r, float mu, float Rt, float Rg) {
// The light ray starting at the observer in/on the atmosphere can have two possible end
// points: the top of the atmosphere or the planet ground. So the shortest path is the
// one we are looking for, otherwise we may be passing through the ground
@@ -151,9 +77,8 @@ float rayDistance(float r, float mu) {
float atmRadiusEps2 = (Rt + ATM_EPSILON) * (Rt + ATM_EPSILON);
float mu2 = mu * mu;
float r2 = r * r;
float rg2 = Rg * Rg;
float rayDistanceAtmosphere = -r * mu + sqrt(r2 * (mu2 - 1.0) + atmRadiusEps2);
float delta = r2 * (mu2 - 1.0) + rg2;
float delta = r2 * (mu2 - 1.0) + Rg*Rg;
// Ray may be hitting ground
if (delta >= 0.0) {
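The endpoint selection in `rayDistance()` can be expressed as a CPU-side C++ sketch. The hunk above is truncated after the ground test, so the branch bodies below are an assumption reconstructed from the surrounding comments (nearest positive root of the ray/ground intersection, otherwise the atmosphere exit point):

```cpp
#include <cassert>
#include <cmath>

// Shortest distance from a point at radius r with view cosine mu to either the
// top of the atmosphere (Rt, padded by the shader's ATM_EPSILON) or the ground
// (Rg), whichever the ray hits first
double rayDistanceCpu(double r, double mu, double Rt, double Rg) {
    const double ATM_EPSILON = 1.0;
    const double atmRadiusEps = Rt + ATM_EPSILON;
    const double distAtmosphere =
        -r * mu + std::sqrt(r * r * (mu * mu - 1.0) + atmRadiusEps * atmRadiusEps);
    // Discriminant of the ray/ground-sphere intersection
    const double delta = r * r * (mu * mu - 1.0) + Rg * Rg;
    if (delta >= 0.0) {
        const double distGround = -r * mu - std::sqrt(delta);
        if (distGround >= 0.0) {
            return std::min(distAtmosphere, distGround);
        }
    }
    return distAtmosphere;
}
```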
@@ -173,19 +98,23 @@ float rayDistance(float r, float mu) {
// nu := cosine of the angle between vec(s) and vec(v)
// dhdH := it is a vec4. dhdH.x stores the dminT := Rt - r, dhdH.y stores the dH value
// (see paper), dhdH.z stores dminG := r - Rg and dhdH.w stores dh (see paper)
void unmappingMuMuSunNu(float r, vec4 dhdH, out float mu, out float muSun, out float nu) {
void unmappingMuMuSunNu(float r, vec4 dhdH, int SAMPLES_MU, float Rg, float Rt,
int SAMPLES_MU_S, int SAMPLES_NU,
out float mu, out float muSun, out float nu)
{
// Window coordinates of pixel (uncentering also)
vec2 fragment = gl_FragCoord.xy - vec2(0.5);
// Pre-calculations
float r2 = r * r;
float Rg2 = Rg * Rg;
float halfSAMPLE_MU = float(SAMPLES_MU) / 2.0;
// If the (vec(x) dot vec(v))/r is negative, i.e., the light ray is likely to hit
// the ground, we obtain mu from the geometry of the ground
if (fragment.y < halfSAMPLE_MU) {
float ud = 1.0 - (fragment.y / (halfSAMPLE_MU - 1.0));
float d = min(max(dhdH.z, ud * dhdH.w), dhdH.w * 0.999);
float d = min(max(dhdH.z, ud * dhdH.w), dhdH.w * 0.999);
// cosine law: Rg^2 = r^2 + d^2 - 2rdcos(pi-theta) where cosine(theta) = mu
mu = (Rg2 - r2 - d * d) / (2.0 * r * d);
// We can't handle a ray inside the planet, i.e., when r ~ Rg, so we check against it.
@@ -199,12 +128,12 @@ void unmappingMuMuSunNu(float r, vec4 dhdH, out float mu, out float muSun, out f
float d = (fragment.y - halfSAMPLE_MU) / (halfSAMPLE_MU - 1.0);
d = min(max(dhdH.x, d * dhdH.y), dhdH.y * 0.999);
// cosine law: Rt^2 = r^2 + d^2 - 2rdcos(pi-theta) where cosine(theta) = mu
mu = (Rt2 - r2 - d * d) / (2.0 * r * d);
mu = (Rt*Rt - r2 - d * d) / (2.0 * r * d);
}
float modValueMuSun = mod(fragment.x, float(SAMPLES_MU_S)) / (float(SAMPLES_MU_S) - 1.0);
// The following mapping is different from the paper. See Colliene for an details.
muSun = tan((2.0 * modValueMuSun - 1.0 + 0.26) * 1.1f) / tan(1.26 * 1.1);
// The following mapping is different from the paper. See Collienne for details.
muSun = tan((2.0 * modValueMuSun - 1.0 + 0.26) * 1.1) / tan(1.26 * 1.1);
nu = -1.0 + floor(fragment.x / float(SAMPLES_MU_S)) / (float(SAMPLES_NU) - 1.0) * 2.0;
}
@@ -213,14 +142,14 @@ void unmappingMuMuSunNu(float r, vec4 dhdH, out float mu, out float muSun, out f
// hits the ground or the top of atmosphere.
// r := height of starting point vect(x)
// mu := cosine of the zenith angle of vec(v). Or mu = (vec(x) * vec(v))/r
vec3 transmittance(float r, float mu) {
vec3 transmittance(sampler2D tex, float r, float mu, float Rg, float Rt) {
// Given the position x (here the altitude r) and the view angle v
// (here the cosine(v) = mu), we map them to the LUT coordinates
float u_r = sqrt((r - Rg) * invRtMinusRg);
// See Colliene to understand the mapping
float u_r = sqrt((r - Rg) / (Rt - Rg));
// See Collienne to understand the mapping
float u_mu = atan((mu + 0.15) / 1.15 * tan(1.5)) / 1.5;
return texture(transmittanceTexture, vec2(u_mu, u_r)).rgb;
return texture(tex, vec2(u_mu, u_r)).rgb;
}
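The two texture coordinates above form the non-linear Collienne-style parametrization of the transmittance LUT; both land in [0, 1] over the valid input ranges. A CPU-side C++ sketch of just the mapping (struct and function names hypothetical):

```cpp
#include <cassert>
#include <cmath>

struct TransmittanceUv {
    double u_mu;
    double u_r;
};

// (r, mu) -> LUT coordinates, matching the shader's transmittance() mapping:
// u_r is a square-root stretch of altitude, u_mu an arctangent stretch of the
// view-zenith cosine that spends more resolution near the horizon
TransmittanceUv transmittanceUv(double r, double mu, double Rg, double Rt) {
    TransmittanceUv uv;
    uv.u_r = std::sqrt((r - Rg) / (Rt - Rg));
    uv.u_mu = std::atan((mu + 0.15) / 1.15 * std::tan(1.5)) / 1.5;
    return uv;
}
```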
// Given a position r and direction mu, calculates the transmittance along the ray with
@@ -228,13 +157,13 @@ vec3 transmittance(float r, float mu) {
// T(a,b) = TableT(a,v)/TableT(b, v)
// r := height of starting point vect(x)
// mu := cosine of the zenith angle of vec(v). Or mu = (vec(x) * vec(v))/r
vec3 transmittance(float r, float mu, float d) {
vec3 transmittance(sampler2D tex, float r, float mu, float d, float Rg, float Rt) {
// Here we use the transmittance property: T(x,v) = T(x,d)*T(d,v) to, given a distance
// d, calculates that transmittance along that distance starting in x (height r):
// T(x,d) = T(x,v)/T(d,v).
//
// From cosine law: c^2 = a^2 + b^2 - 2*a*b*cos(ab)
float ri = sqrt(d * d + r * r + 2.0 * r * d * mu);
float ri = sqrt(d * d + r * r + 2.0 * r * d * mu);
// mu_i = (vec(d) dot vec(v)) / r_i
// = ((vec(x) + vec(d-x)) dot vec(v))/ r_i
// = (r*mu + d) / r_i
@@ -246,19 +175,21 @@ vec3 transmittance(float r, float mu, float d) {
// x --> x0, then x0-->x.
// Also, let's use the property: T(a,c) = T(a,b)*T(b,c)
// Because T(a,c) and T(b,c) are already in the table T, T(a,b) = T(a,c)/T(b,c).
vec3 res;
if (mu > 0.0) {
return min(transmittance(r, mu) / transmittance(ri, mui), 1.0);
res = transmittance(tex, r, mu, Rg, Rt) / transmittance(tex, ri, mui, Rg, Rt);
}
else {
return min(transmittance(ri, -mui) / transmittance(r, -mu), 1.0);
res = transmittance(tex, ri, -mui, Rg, Rt) / transmittance(tex, r, -mu, Rg, Rt);
}
return min(res, 1.0);
}
// Calculates Rayleigh phase function given the scattering cosine angle mu
// mu := cosine of the zenith angle of vec(v). Or mu = (vec(x) * vec(v))/r
float rayleighPhaseFunction(float mu) {
//return (3.0f / (16.0f * M_PI)) * (1.0f + mu * mu);
return 0.0596831036 * (1.0 + mu * mu);
// return (3.0 / (16.0 * M_PI)) * (1.0 + mu * mu);
return 0.0596831036 * (1.0 + mu * mu);
}
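The hard-coded constant is 3/(16π), and a correctly normalized phase function integrates to 1 over the unit sphere. A CPU-side C++ sketch verifying both facts (assumption: midpoint quadrature is accurate enough here):

```cpp
#include <cassert>
#include <cmath>

const double PI = 3.14159265358979323846;

// Rayleigh phase function, same closed form as the shader
double rayleighPhase(double mu) {
    return 3.0 / (16.0 * PI) * (1.0 + mu * mu);
}

// Midpoint-rule integral of the phase function over the unit sphere,
// with solid-angle element 2*pi*sin(theta) dtheta
double integrateOverSphere(int steps = 100000) {
    double sum = 0.0;
    const double dTheta = PI / steps;
    for (int i = 0; i < steps; ++i) {
        const double theta = (i + 0.5) * dTheta;
        sum += rayleighPhase(std::cos(theta)) * 2.0 * PI * std::sin(theta) * dTheta;
    }
    return sum;
}
```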
// Calculates Mie phase function given the scattering cosine angle mu
@@ -277,38 +208,30 @@ float miePhaseFunction(float mu, float mieG) {
// mu := cosine of the zenith angle of vec(v). Or mu = (vec(x) * vec(v))/r
// muSun := cosine of the zenith angle of vec(s). Or muSun = (vec(s) * vec(v))
// nu := cosine of the angle between vec(s) and vec(v)
vec4 texture4D(sampler3D table, float r, float mu, float muSun, float nu) {
vec4 texture4D(sampler3D table, float r, float mu, float muSun, float nu, float Rg,
int samplesMu, float Rt, int samplesR, int samplesMuS,
int samplesNu)
{
float r2 = r * r;
float Rg2 = Rg * Rg;
float Rt2 = Rt * Rt;
float rho = sqrt(r2 - Rg2);
float rmu = r * mu;
float delta = rmu * rmu - r2 + Rg2;
vec4 cst = rmu < 0.0 && delta > 0.0 ?
vec4(1.0, 0.0, 0.0, 0.5 - 0.5 * invSamplesMu) :
vec4(-1.0, H2, H, 0.5 + 0.5 * invSamplesMu);
vec4(1.0, 0.0, 0.0, 0.5 - 0.5 / float(samplesMu)) :
vec4(-1.0, Rt2 - Rg2, sqrt(Rt2 - Rg2), 0.5 + 0.5 / float(samplesMu));
float u_r = 0.5 * invSamplesR + rho / H * (1.0 - invSamplesR);
float u_mu = cst.w + (rmu * cst.x + sqrt(delta + cst.y)) / (rho + cst.z) * (0.5 - invSamplesMu);
float u_mu_s = 0.5 * invSamplesMuS +
(atan(max(muSun, -0.1975) * tan(1.386)) * 0.9090909090909090 + 0.74) * 0.5 * (1.0 - invSamplesMuS);
float lerp = (nu + 1.0) / 2.0 * (float(SAMPLES_NU) - 1.0);
float u_nu = floor(lerp);
lerp = lerp - u_nu;
float u_r = 0.5 / float(samplesR) + rho / sqrt(Rt2 - Rg2) * (1.0 - 1.0 / float(samplesR));
float u_mu = cst.w + (rmu * cst.x + sqrt(delta + cst.y)) / (rho + cst.z) * (0.5 - 1.0 / samplesMu);
float u_mu_s = 0.5 / float(samplesMuS) +
(atan(max(muSun, -0.1975) * tan(1.386)) * 0.9090909090909090 + 0.74) * 0.5 * (1.0 - 1.0 / float(samplesMuS));
float t = (nu + 1.0) / 2.0 * (float(samplesNu) - 1.0);
float u_nu = floor(t);
t = t - u_nu;
return texture(
table, vec3((u_nu + u_mu_s) * invSamplesNu, u_mu, u_r)) * (1.0 - lerp) +
texture(table, vec3((u_nu + u_mu_s + 1.0) * invSamplesNu, u_mu, u_r)) * lerp;
}
// Given the irradiance texture table, the cosine of zenith sun vector and the height of
// the observer (ray's stating point x), calculates the mapping for u_r and u_muSun and
// returns the value in the LUT
// lut := OpenGL texture2D sampler (the irradiance texture deltaE)
// muSun := cosine of the zenith angle of vec(s). Or muSun = (vec(s) * vec(v))
// r := height of starting point vect(x)
vec3 irradianceLUT(sampler2D lut, float muSun, float r) {
// See the Bruneton paper and Collienne to understand the mapping
float u_muSun = (muSun + 0.2) / 1.2;
float u_r = (r - Rg) * invRtMinusRg;
return texture(lut, vec2(u_muSun, u_r)).rgb;
vec4 v1 = texture(table, vec3((u_nu + u_mu_s) / float(samplesNu), u_mu, u_r));
vec4 v2 = texture(table, vec3((u_nu + u_mu_s + 1.0) / float(samplesNu), u_mu, u_r));
return mix(v1, v2, t);
}


@@ -62,24 +62,37 @@
in vec2 texCoord;
out vec4 renderTarget;
uniform int nAaSamples;
uniform int cullAtmosphere;
uniform float Rg;
uniform float Rt;
uniform float groundRadianceEmission;
uniform float HR;
uniform vec3 betaRayleigh;
uniform float HO;
uniform vec3 betaOzoneExtinction;
uniform float HM;
uniform vec3 betaMieExtinction;
uniform float mieG;
uniform float sunRadiance;
uniform bool ozoneLayerEnabled;
uniform int SAMPLES_R;
uniform int SAMPLES_MU;
uniform int SAMPLES_MU_S;
uniform int SAMPLES_NU;
uniform sampler2D transmittanceTexture;
uniform sampler2D irradianceTexture;
uniform sampler3D inscatterTexture;
uniform sampler2D mainPositionTexture;
uniform sampler2D mainNormalTexture;
uniform sampler2D mainColorTexture;
uniform dmat4 dInverseModelTransformMatrix;
uniform dmat4 dModelTransformMatrix;
uniform dmat4 dSGCTViewToWorldMatrix;
uniform dmat4 dSgctProjectionToModelTransformMatrix;
uniform dmat4 inverseModelTransformMatrix;
uniform dmat4 modelTransformMatrix;
uniform dmat4 viewToWorldMatrix;
uniform dmat4 projectionToModelTransformMatrix;
uniform vec4 viewport;
uniform vec2 resolution;
uniform dvec4 dCamPosObj;
uniform dvec3 camPosObj;
uniform dvec3 sunDirectionObj;
/*******************************************************************************
@@ -89,8 +102,10 @@ uniform dvec3 sunDirectionObj;
const uint numberOfShadows = 1;
struct ShadowRenderingStruct {
double xu, xp;
double rs, rc;
double xu;
double xp;
double rs;
double rc;
dvec3 sourceCasterVec;
dvec3 casterPositionVec;
bool isShadowing;
@@ -110,15 +125,15 @@ float calcShadow(ShadowRenderingStruct shadowInfoArray[numberOfShadows], dvec3 p
}
dvec3 pc = shadowInfoArray[0].casterPositionVec - position;
dvec3 sc_norm = shadowInfoArray[0].sourceCasterVec;
dvec3 pc_proj = dot(pc, sc_norm) * sc_norm;
dvec3 d = pc - pc_proj;
dvec3 scNorm = shadowInfoArray[0].sourceCasterVec;
dvec3 pcProj = dot(pc, scNorm) * scNorm;
dvec3 d = pc - pcProj;
float length_d = float(length(d));
double length_pc_proj = length(pc_proj);
double lengthPcProj = length(pcProj);
float r_p_pi = float(shadowInfoArray[0].rc * (length_pc_proj + shadowInfoArray[0].xp) / shadowInfoArray[0].xp);
float r_u_pi = float(shadowInfoArray[0].rc * (shadowInfoArray[0].xu - length_pc_proj) / shadowInfoArray[0].xu);
float r_p_pi = float(shadowInfoArray[0].rc * (lengthPcProj + shadowInfoArray[0].xp) / shadowInfoArray[0].xp);
float r_u_pi = float(shadowInfoArray[0].rc * (shadowInfoArray[0].xu - lengthPcProj) / shadowInfoArray[0].xu);
if (length_d < r_u_pi) {
// umbra
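The `r_p_pi` and `r_u_pi` values above are similar-triangle cross sections of the penumbra and umbra cones at the projected caster distance. Isolated as a CPU-side C++ sketch (function names hypothetical; `rc` is the caster radius, `xp`/`xu` the distances from the caster to the penumbra/umbra cone apexes):

```cpp
#include <cassert>
#include <cmath>

// Penumbra cone radius at distance lengthPcProj behind the caster; this cone
// widens with distance, its apex sitting xp in front of the caster
double penumbraRadius(double rc, double xp, double lengthPcProj) {
    return rc * (lengthPcProj + xp) / xp;
}

// Umbra cone radius at the same distance; this cone narrows toward its apex,
// which sits xu behind the caster
double umbraRadius(double rc, double xu, double lengthPcProj) {
    return rc * (xu - lengthPcProj) / xu;
}
```

A point is in full shadow when its distance from the shadow axis is below the umbra radius, and in partial shadow when it is below the penumbra radius.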
@@ -139,6 +154,33 @@ float calcShadow(ShadowRenderingStruct shadowInfoArray[numberOfShadows], dvec3 p
}
}
float opticalDepth(float localH, float r, float mu, float d, float Rg) {
float invH = 1.0 / localH;
float a = sqrt(0.5 * invH * r);
vec2 a01 = a * vec2(mu, mu + d / r);
vec2 a01s = sign(a01);
vec2 a01sq = a01 * a01;
float x = a01s.y > a01s.x ? exp(a01sq.x) : 0.0;
vec2 y = a01s / (2.3193 * abs(a01) + sqrt(1.52 * a01sq + 4.0)) *
vec2(1.0, exp(-d * invH * (d / (2.0 * r) + mu)));
return sqrt(2.0 * M_PI * sqrt(Rt*Rt - Rg*Rg) * r) * exp((Rg-r)*invH) * (x + dot(y, vec2(1.0, -1.0)));
}
vec3 analyticTransmittance(float r, float mu, float d) {
vec3 ozone = vec3(0.0);
if (ozoneLayerEnabled) {
ozone = betaOzoneExtinction * 0.0000006 * opticalDepth(HO, r, mu, d, Rg);
}
return exp(-betaRayleigh * opticalDepth(HR, r, mu, d, Rg) - ozone -
betaMieExtinction * opticalDepth(HM, r, mu, d, Rg));
}
vec3 irradiance(sampler2D s, float r, float muSun) {
float u_r = (r - Rg) / (Rt - Rg);
float u_muSun = (muSun + 0.2) / 1.2;
return texture(s, vec2(u_muSun, u_r)).rgb;
}
//////////////////////////////////////////////////////////////////////////////////////////
// ALL CALCULATIONS FOR ATMOSPHERE ARE KM AND IN WORLD SPACE SYSTEM //
//////////////////////////////////////////////////////////////////////////////////////////
@@ -170,10 +212,11 @@ bool atmosphereIntersection(Ray ray, double atmRadius, out double offset,
double l2 = dot(l, l);
double r2 = atmRadius * atmRadius; // avoiding surface acne
offset = 0.0;
maxLength = 0.0;
// Ray origin (eye position) is behind sphere
if ((s < 0.0) && (l2 > r2)) {
offset = 0.0;
maxLength = 0.0;
return false;
}
@@ -181,13 +224,9 @@ bool atmosphereIntersection(Ray ray, double atmRadius, out double offset,
// Ray misses atmosphere
if (m2 > r2) {
offset = 0.0;
maxLength = 0.0;
return false;
}
// We already know the ray hits the atmosphere
// If q = 0.0, there is only one intersection
double q = sqrt(r2 - m2);
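`atmosphereIntersection()` is the classic geometric ray/sphere test; the hunk ends just after `q` is computed, so the root selection below is an assumption reconstructed from the comments. A CPU-side C++ sketch with the sphere centered at the origin:

```cpp
#include <cassert>
#include <cmath>

// Geometric ray/sphere intersection: s projects the vector to the sphere
// center onto the (normalized) ray direction, m2 is the squared distance from
// the center to the ray. Writes the entry/exit distances on a hit.
bool raySphere(const double origin[3], const double dir[3], double radius,
               double& offset, double& maxLength)
{
    const double l[3] = { -origin[0], -origin[1], -origin[2] };
    const double s = l[0] * dir[0] + l[1] * dir[1] + l[2] * dir[2];
    const double l2 = l[0] * l[0] + l[1] * l[1] + l[2] * l[2];
    const double r2 = radius * radius;
    offset = 0.0;
    maxLength = 0.0;
    if (s < 0.0 && l2 > r2) {
        return false; // origin outside, sphere behind the ray
    }
    const double m2 = l2 - s * s;
    if (m2 > r2) {
        return false; // ray misses the sphere
    }
    const double q = std::sqrt(r2 - m2);
    if (l2 > r2) {
        offset = s - q; // origin outside: enter at the near root
    }
    maxLength = s + q;  // exit at the far root (offset stays 0 when inside)
    return true;
}
```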
@@ -215,13 +254,13 @@ Ray calculateRayRenderableGlobe(vec2 st) {
dvec4 clipCoords = dvec4(interpolatedNDCPos, 1.0, 1.0);
// Clip to Object Coords
dvec4 objectCoords = dSgctProjectionToModelTransformMatrix * clipCoords;
objectCoords /= objectCoords.w;
dvec4 objectCoords = projectionToModelTransformMatrix * clipCoords;
objectCoords.xyz /= objectCoords.w;
// Building Ray
// Ray in object space (in KM)
Ray ray;
ray.origin = dvec3(dCamPosObj * dvec4(0.001, 0.001, 0.001, 1.0));
ray.origin = camPosObj * 0.001;
ray.direction = normalize(objectCoords.xyz * dvec3(0.001) - ray.origin);
return ray;
}
@@ -244,16 +283,15 @@ Ray calculateRayRenderableGlobe(vec2 st) {
* attenuation := out of transmittance T(x,x0). This will be used later when calculating
* the reflectance R[L]
*/
vec3 inscatterRadiance(vec3 x, inout float t, out float irradianceFactor, vec3 v, vec3 s,
out float r, out float mu, out vec3 attenuation, vec3 fragPosObj,
out bool groundHit, double maxLength, double pixelDepth,
vec4 spaceColor, float sunIntensity)
vec3 inscatterRadiance(vec3 x, inout float t, inout float irradianceFactor, vec3 v, vec3 s,
float r, vec3 fragPosObj, double maxLength, double pixelDepth,
vec3 spaceColor, float sunIntensity,
out float mu, out vec3 attenuation, out bool groundHit)
{
const float INTERPOLATION_EPS = 0.004; // precision const from Bruneton
vec3 radiance;
r = length(x);
mu = dot(x, v) / r;
float r2 = r * r;
@@ -266,7 +304,11 @@ vec3 inscatterRadiance(vec3 x, inout float t, out float irradianceFactor, vec3 v
// I.e. the next line has the scattering light for the "infinite" ray passing through
// the atmosphere. If this ray hits something inside the atmosphere, we will subtract
// the attenuated scattering light from that path in the current path
vec4 inscatterRadiance = max(texture4D(inscatterTexture, r, mu, muSun, nu), 0.0);
vec4 inscatterRadiance = max(
texture4D(inscatterTexture, r, mu, muSun, nu, Rg, SAMPLES_MU, Rt, SAMPLES_R,
SAMPLES_MU_S, SAMPLES_NU),
0.0
);
// After removing the initial path from camera pos to top of atmosphere (for an
// observer in the space) we test if the light ray is hitting the atmosphere
@@ -284,13 +326,14 @@ vec3 inscatterRadiance(vec3 x, inout float t, out float irradianceFactor, vec3 v
// attenuation = analyticTransmittance(r, mu, t);
// JCC: change from analytical to LUT transmittance to avoid
// acne on the planet surface when looking from far away. (11/02/2017)
attenuation = transmittance(r, mu, t);
attenuation = transmittance(transmittanceTexture, r, mu, t, Rg, Rt);
// Here we use the idea of S[L](a->b) = S[L](b->a), and get the S[L](x0, v, s)
// Then we calculate S[L] = S[L]|x - T(x, x0)*S[L]|x0
// The "infinite" ray hit something inside the atmosphere, so we need to remove
// the unused contribution to the final radiance.
vec4 inscatterFromSurface = texture4D(inscatterTexture, r0, mu0, muSun0, nu);
vec4 inscatterFromSurface = texture4D(inscatterTexture, r0, mu0, muSun0, nu, Rg,
SAMPLES_MU, Rt, SAMPLES_R, SAMPLES_MU_S, SAMPLES_NU);
inscatterRadiance = max(inscatterRadiance - attenuation.rgbr * inscatterFromSurface, 0.0);
// We set the irradianceFactor to 1.0 so the reflected irradiance will be considered
@@ -303,27 +346,26 @@ vec3 inscatterRadiance(vec3 x, inout float t, out float irradianceFactor, vec3 v
}
// cos(PI-thetaH) = dist/r
// cos(thetaH) = - dist/r
// cos(thetaH) = -dist/r
// muHorizon = -sqrt(r^2-Rg^2)/r = -sqrt(1-(Rg/r)^2)
float muHorizon = -sqrt(1.0 - Rg2 / r2);
float muHorizon = -sqrt(1.0 - Rg*Rg / r2);
// In order to avoid imprecision problems near horizon, we interpolate between two
// In order to avoid precision problems near horizon, we interpolate between two
// points: above and below horizon
if (abs(mu - muHorizon) < INTERPOLATION_EPS) {
// We want an interpolation value close to 1/2, so the contribution of each radiance
// value is almost the same or it has a heavy weight if from above or
// below horizon
// value is almost the same or it has a heavy weight if from above or below horizon
float interpolationValue = (mu - muHorizon + INTERPOLATION_EPS) / (2.0 * INTERPOLATION_EPS);
// Above Horizon
mu = muHorizon - INTERPOLATION_EPS;
// r0 = sqrt(r * r + t * t + 2.0f * r * t * mu);
// r0 = sqrt(r * r + t * t + 2.0 * r * t * mu);
// From cosine law where t = distance between x and x0
// r0^2 = r^2 + t^2 - 2 * r * t * cos(PI-theta)
// r0 = sqrt(r2 + t2 + 2.0f * r * t * mu);
float halfCossineLaw1 = r2 + (t * t);
float halfCossineLaw2 = 2.0 * r * t;
r0 = sqrt(halfCossineLaw1 + halfCossineLaw2 * mu);
// r0 = sqrt(r2 + t2 + 2.0 * r * t * mu);
float halfCosineLaw1 = r2 + (t * t);
float halfCosineLaw2 = 2.0 * r * t;
r0 = sqrt(halfCosineLaw1 + halfCosineLaw2 * mu);
// From the dot product: cos(theta0) = (x0 dot v)/(||ro||*||v||)
// mu0 = ((x + t) dot v) / r0
@@ -331,20 +373,24 @@ vec3 inscatterRadiance(vec3 x, inout float t, out float irradianceFactor, vec3 v
// mu0 = (r*mu + t) / r0
mu0 = (r * mu + t) * (1.0 / r0);
vec4 inScatterAboveX = texture4D(inscatterTexture, r, mu, muSun, nu);
vec4 inScatterAboveXs = texture4D(inscatterTexture, r0, mu0, muSun0, nu);
vec4 inScatterAboveX = texture4D(inscatterTexture, r, mu, muSun, nu, Rg,
SAMPLES_MU, Rt, SAMPLES_R, SAMPLES_MU_S, SAMPLES_NU);
vec4 inScatterAboveXs = texture4D(inscatterTexture, r0, mu0, muSun0, nu, Rg,
SAMPLES_MU, Rt, SAMPLES_R, SAMPLES_MU_S, SAMPLES_NU);
// Attention for the attenuation.r value applied to the S_Mie
vec4 inScatterAbove = max(inScatterAboveX - attenuation.rgbr * inScatterAboveXs, 0.0);
// Below Horizon
mu = muHorizon + INTERPOLATION_EPS;
//r0 = sqrt(r2 + t2 + 2.0f * r * t * mu);
r0 = sqrt(halfCossineLaw1 + halfCossineLaw2 * mu);
//r0 = sqrt(r2 + t2 + 2.0 * r * t * mu);
r0 = sqrt(halfCosineLaw1 + halfCosineLaw2 * mu);
mu0 = (r * mu + t) * (1.0 / r0);
vec4 inScatterBelowX = texture4D(inscatterTexture, r, mu, muSun, nu);
vec4 inScatterBelowXs = texture4D(inscatterTexture, r0, mu0, muSun0, nu);
vec4 inScatterBelowX = texture4D(inscatterTexture, r, mu, muSun, nu, Rg,
SAMPLES_MU, Rt, SAMPLES_R, SAMPLES_MU_S, SAMPLES_NU);
vec4 inScatterBelowXs = texture4D(inscatterTexture, r0, mu0, muSun0, nu, Rg,
SAMPLES_MU, Rt, SAMPLES_R, SAMPLES_MU_S, SAMPLES_NU);
// Attention for the attenuation.r value applied to the S_Mie
vec4 inScatterBelow = max(inScatterBelowX - attenuation.rgbr * inScatterBelowXs, 0.0);
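The horizon interpolation above leans on the cosine law: with t the distance along the ray and mu the view-zenith cosine, the end point's radius is r0 = sqrt(r^2 + t^2 + 2*r*t*mu) and its cosine mu0 = (r*mu + t)/r0. A minimal host-side sketch of that geometry (Python; the kilometre values in the example are illustrative, not taken from the shader):

```python
import math

def end_point(r, t, mu):
    """Cosine law: r0^2 = r^2 + t^2 + 2*r*t*mu, where t is the distance
    travelled along the ray and mu = cos(view zenith angle).
    mu0 follows from the dot product ((x + t*v) . v) / r0."""
    r0 = math.sqrt(r * r + t * t + 2.0 * r * t * mu)
    mu0 = (r * mu + t) / r0
    return r0, mu0

# Example: observer at r = 6420 km looking straight down (mu = -1)
# travels 60 km and ends up at r0 = 6360 km, still with mu0 = -1
r0, mu0 = end_point(6420.0, 60.0, -1.0)
```

This is the same halfCosineLaw1/halfCosineLaw2 expression as in the diff, just without the shader's factoring of the two terms.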
@@ -369,13 +415,7 @@ vec3 inscatterRadiance(vec3 x, inout float t, out float irradianceFactor, vec3 v
// Finally we add the Lsun (all calculations are done with no Lsun so we can change it
// on the fly with no precomputations)
vec3 finalScatteringRadiance = radiance * sunIntensity;
if (groundHit) {
return finalScatteringRadiance;
}
else {
return spaceColor.rgb + finalScatteringRadiance;
}
return groundHit ? finalScatteringRadiance : spaceColor + finalScatteringRadiance;
}
/*
@@ -400,8 +440,6 @@ vec3 groundColor(vec3 x, float t, vec3 v, vec3 s, vec3 attenuationXtoX0, vec3 gr
vec3 normal, float irradianceFactor, float waterReflectance,
float sunIntensity)
{
vec3 reflectedRadiance = vec3(0.0);
// First we obtain the ray's end point on the surface
float r0 = length(x + t * v);
@@ -414,7 +452,7 @@ vec3 groundColor(vec3 x, float t, vec3 v, vec3 s, vec3 attenuationXtoX0, vec3 gr
// Is direct sunlight arriving at x0? If not, there is no direct light from the Sun (shadowed)
vec3 transmittanceL0 =
muSun < -sqrt(1.0 - (Rg2 / (r0 * r0))) ? vec3(0.0) : transmittance(r0, muSun);
muSun < -sqrt(1.0 - (Rg*Rg / (r0 * r0))) ? vec3(0.0) : transmittance(transmittanceTexture, r0, muSun, Rg, Rt);
// E[L*] at x0
vec3 irradianceReflected = irradiance(irradianceTexture, r0, muSun) * irradianceFactor;
@@ -432,16 +470,14 @@ vec3 groundColor(vec3 x, float t, vec3 v, vec3 s, vec3 attenuationXtoX0, vec3 gr
// Fresnell Schlick's approximation
float fresnel = 0.02 + 0.98 * pow(1.0 - dot(-v, h), 5.0);
// Walter BRDF approximation
float waterBrdf = fresnel * pow(max(dot(h, normal), 0.0), 150.0);
float waterBrdf = max(fresnel * pow(max(dot(h, normal), 0.0), 150.0), 0.0);
// Adding Fresnell and Water BRDFs approximation to the final surface color
// after adding the sunRadiance and the attenuation of the Sun through atmosphere
groundRadiance += waterReflectance * max(waterBrdf, 0.0) * transmittanceL0 * sunIntensity;
groundRadiance += waterReflectance * waterBrdf * transmittanceL0 * sunIntensity;
}
// Finally, we attenuate the surface Radiance from the point x0 to the camera location
reflectedRadiance = attenuationXtoX0 * groundRadiance;
// Returns reflectedRadiance = 0.0 if the ray doesn't hit the ground.
vec3 reflectedRadiance = attenuationXtoX0 * groundRadiance;
return reflectedRadiance;
}
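The water highlight above combines Schlick's Fresnel approximation (F0 = 0.02, a common value for water) with a Blinn-Phong-style lobe of exponent 150. The same expression as a minimal Python sketch, useful for sanity-checking the range of the term:

```python
def water_brdf(cos_vh, cos_hn):
    """Schlick Fresnel with F0 = 0.02 (water) times a Blinn-Phong-style
    specular lobe with exponent 150, clamped to be non-negative.
    cos_vh = dot(-v, h), cos_hn = dot(h, normal)."""
    fresnel = 0.02 + 0.98 * (1.0 - cos_vh) ** 5
    return max(fresnel * max(cos_hn, 0.0) ** 150.0, 0.0)
```

At grazing view angles (cos_vh near 0) the Fresnel term approaches 1, which is why the highlight strengthens toward the horizon.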
@@ -457,16 +493,23 @@ vec3 groundColor(vec3 x, float t, vec3 v, vec3 s, vec3 attenuationXtoX0, vec3 gr
* attenuation := transmittance T(x,x0)
*/
vec3 sunColor(vec3 v, vec3 s, float r, float mu, float irradianceFactor) {
vec3 tm = vec3(1.0);
if (r <= Rt) {
tm = mu < -sqrt(1.0 - Rg2 / (r * r)) ? vec3(0.0) : transmittance(r, mu);
}
// JCC: Change this function to a impostor texture with gaussian decay color weighted
// by the sunRadiance, transmittance and irradianceColor (11/03/2017)
float sunFinalColor = smoothstep(cos(M_PI / 500.0), cos(M_PI / 900.0), dot(v, s)) *
sunRadiance * (1.0 - irradianceFactor);
// v = normalize(vec3(inverseModelTransformMatrix * dvec4(sunWorld, 1.0)));
float angle = dot(v, s);
return tm * sunFinalColor;
// JCC: Change this function to an impostor texture with Gaussian decay color weighted
// by the sunRadiance, transmittance and irradianceColor (11/03/2017)
// @TODO (abock, 2021-07-01) This value is hard-coded to our sun+earth right now
// Convert 0.3 degrees -> radians
const float SunAngularSize = (0.3 * M_PI / 180.0);
const float FuzzyFactor = 0.5; // How fuzzy should the edges be
const float p1 = cos(SunAngularSize);
const float p2 = cos(SunAngularSize * FuzzyFactor);
float t = (angle - p1) / (p2 - p1);
float scale = clamp(t, 0.0, 1.0);
return scale * transmittance(transmittanceTexture, r, mu, Rg, Rt) * sunRadiance * (1.0 - irradianceFactor);
}
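The replacement sunColor above draws the solar disc as a clamped linear ramp between cos(SunAngularSize) and cos(SunAngularSize * FuzzyFactor), with the 0.3-degree angular size hard-coded for the Sun as seen from Earth (per the @TODO in the diff). The falloff alone, sketched host-side:

```python
import math

SUN_ANGULAR_SIZE = math.radians(0.3)  # hard-coded Sun+Earth value (per @TODO)
FUZZY_FACTOR = 0.5                    # how fuzzy the disc edge is

def sun_disc_scale(angle):
    """Clamped linear ramp between cos(full angular size) and
    cos(half the angular size); angle = dot(v, s)."""
    p1 = math.cos(SUN_ANGULAR_SIZE)
    p2 = math.cos(SUN_ANGULAR_SIZE * FUZZY_FACTOR)
    t = (angle - p1) / (p2 - p1)
    return min(max(t, 0.0), 1.0)
```

Looking straight at the Sun gives 1, anything more than 0.3 degrees off-axis gives 0, and the edge between 0.15 and 0.3 degrees fades linearly.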
void main() {
@@ -481,17 +524,13 @@ void main() {
st.x = st.x / (resolution.x / viewport[2]) + (viewport[0] / resolution.x);
st.y = st.y / (resolution.y / viewport[3]) + (viewport[1] / resolution.y);
// Color from G-Buffer
vec3 color = texture(mainColorTexture, st).rgb;
if (cullAtmosphere == 1) {
renderTarget = texture(mainColorTexture, st);
renderTarget.rgb = color;
return;
}
vec4 atmosphereFinalColor = vec4(0.0);
int nSamples = 1;
// Color from G-Buffer
vec4 color = texture(mainColorTexture, st);
// Get the ray from camera to atm in object space
Ray ray = calculateRayRenderableGlobe(texCoord);
@@ -499,7 +538,7 @@ void main() {
double maxLength = 0.0; // in KM
bool intersect = atmosphereIntersection(ray, Rt - (ATM_EPSILON * 0.001), offset, maxLength);
if (!intersect) {
renderTarget = color;
renderTarget.rgb = color;
return;
}
@@ -509,14 +548,11 @@ void main() {
// Space (View plus Camera Rig Coords) when using their positions later, one must
// convert them to the planet's coords
//
// Get data from G-Buffer
// Normal is stored in SGCT View Space and transformed to the current object space
// Normal is stored in view space and transformed to the current object space
vec4 normalViewSpaceAndWaterReflectance = texture(mainNormalTexture, st);
dvec4 normalViewSpace = vec4(normalViewSpaceAndWaterReflectance.xyz, 0.0);
dvec4 normalWorldSpace = dSGCTViewToWorldMatrix * normalViewSpace;
vec4 normal = vec4(dInverseModelTransformMatrix * normalWorldSpace);
dvec4 normalWorldSpace = viewToWorldMatrix * normalViewSpace;
vec4 normal = vec4(inverseModelTransformMatrix * normalWorldSpace);
normal.xyz = normalize(normal.xyz);
normal.w = normalViewSpaceAndWaterReflectance.w;
@@ -524,19 +560,20 @@ void main() {
vec4 position = texture(mainPositionTexture, st);
// OS Eye to World coords
dvec4 positionWorldCoords = dSGCTViewToWorldMatrix * position;
dvec4 positionWorldCoords = viewToWorldMatrix * position;
// World to Object (Normal and Position in meters)
dvec4 positionObjectsCoords = dInverseModelTransformMatrix * positionWorldCoords;
dvec3 positionObjectsCoords = (inverseModelTransformMatrix * positionWorldCoords).xyz;
// Distance of the pixel in the gBuffer to the observer
// JCC (12/12/2017): AMD distance function is buggy.
//double pixelDepth = distance(cameraPositionInObject.xyz, positionObjectsCoords.xyz);
double pixelDepth = length(dCamPosObj.xyz - positionObjectsCoords.xyz);
double pixelDepth = length(camPosObj - positionObjectsCoords);
// JCC (12/13/2017): Trick to remove floating-point error in texture.
// We see a squared noise on planet's surface when seeing the planet from far away
float dC = float(length(dCamPosObj.xyz));
// @TODO (abock, 2021-07-01) I don't think this does anything. Remove?
float dC = float(length(camPosObj));
const float x1 = 1e8;
if (dC > x1) {
pixelDepth += 1000.0;
@@ -552,22 +589,21 @@ void main() {
// All calculations are done in KM:
pixelDepth *= 0.001;
positionObjectsCoords.xyz *= 0.001;
positionObjectsCoords *= 0.001;
if (pixelDepth < offset) {
// ATM Occluded - Something in front of ATM
renderTarget = color;
renderTarget.rgb = color;
return;
}
// Following paper nomenclature
double t = offset;
vec3 attenuation;
// Moving observer from camera location to top atmosphere. If the observer is already
// inside the atm, offset = 0.0 and no changes at all
vec3 x = vec3(ray.origin + t * ray.direction);
float r = 0.0; // length(x);
float r = length(x);
vec3 v = vec3(ray.direction);
float mu = 0.0; // dot(x, v) / r;
vec3 s = vec3(sunDirectionObj);
@@ -578,30 +614,31 @@ void main() {
// comparison with the planet's ground makes sense:
pixelDepth -= offset;
dvec3 onATMPos = (dModelTransformMatrix * dvec4(x * 1000.0, 1.0)).xyz;
dvec3 onATMPos = (modelTransformMatrix * dvec4(x * 1000.0, 1.0)).xyz;
float eclipseShadowATM = calcShadow(shadowDataArray, onATMPos, false);
float sunIntensityInscatter = sunRadiance * eclipseShadowATM;
float irradianceFactor = 0.0;
bool groundHit = false;
vec3 inscatterColor = inscatterRadiance(x, tF, irradianceFactor, v, s, r, mu,
attenuation, vec3(positionObjectsCoords.xyz), groundHit, maxLength, pixelDepth,
color, sunIntensityInscatter);
vec3 attenuation;
vec3 inscatterColor = inscatterRadiance(x, tF, irradianceFactor, v, s, r,
vec3(positionObjectsCoords), maxLength, pixelDepth, color, sunIntensityInscatter, mu,
attenuation, groundHit);
vec3 atmColor = vec3(0.0);
if (groundHit) {
float eclipseShadowPlanet = calcShadow(shadowDataArray, positionWorldCoords.xyz, true);
float sunIntensityGround = sunRadiance * eclipseShadowPlanet;
atmColor = groundColor(x, tF, v, s, attenuation, color.rgb, normal.xyz,
irradianceFactor, normal.w, sunIntensityGround);
atmColor = groundColor(x, tF, v, s, attenuation, color, normal.xyz, irradianceFactor,
normal.w, sunIntensityGround);
}
else {
// In order to get better performance, we are not tracing multiple rays per pixel
// when the ray doesn't intersect the ground
atmColor = sunColor(v, s, r, mu, irradianceFactor);
atmColor = sunColor(v, s, r, mu, irradianceFactor);
}
// Final Color of ATM plus terrain:
vec4 finalRadiance = vec4(inscatterColor + atmColor, 1.0);
renderTarget = finalRadiance;
renderTarget = vec4(inscatterColor + atmColor, 1.0);
}

View File

@@ -24,8 +24,8 @@
#version __CONTEXT__
layout(location = 0) in vec3 in_position;
layout(location = 0) in vec2 in_position;
void main() {
gl_Position = vec4(in_position, 1.0);
gl_Position = vec4(in_position, 0.0, 1.0);
}

View File

@@ -24,8 +24,6 @@
#version __CONTEXT__
#include "atmosphere_common.glsl"
out vec4 renderTableColor;
void main() {

View File

@@ -28,19 +28,45 @@
out vec4 renderTarget;
uniform float Rg;
uniform float Rt;
uniform float AverageGroundReflectance;
uniform float HR;
uniform vec3 betaRayleigh;
uniform float HM;
uniform vec3 betaMieScattering;
uniform float mieG;
uniform int SAMPLES_R;
uniform int SAMPLES_MU;
uniform int SAMPLES_MU_S;
uniform int SAMPLES_NU;
uniform sampler2D transmittanceTexture;
uniform float r;
uniform vec4 dhdH;
uniform sampler2D deltaETexture;
uniform sampler3D deltaSRTexture;
uniform sampler3D deltaSMTexture;
uniform int firstIteration;
uniform int firstIteraction;
const int INSCATTER_SPHERICAL_INTEGRAL_SAMPLES = 16;
// -- Spherical coordinate steps. phi in [0, 2PI] and theta in [0, PI]
const float stepPhi = (2.0 * M_PI) / float(INSCATTER_SPHERICAL_INTEGRAL_SAMPLES);
const float stepTheta = M_PI / float(INSCATTER_SPHERICAL_INTEGRAL_SAMPLES);
// Given the irradiance texture table, the cosine of the Sun's zenith angle and the
// height of the observer (the ray's starting point x), calculates the mapping for u_r
// and u_muSun and returns the value in the LUT
// lut := OpenGL texture2D sampler (the irradiance texture deltaE)
// muSun := cosine of the zenith angle of vec(s). Or muSun = (vec(s) * vec(v))
// r := height of starting point vec(x)
vec3 irradianceLUT(sampler2D lut, float muSun, float r) {
// See the Bruneton paper and Collienne to understand the mapping
float u_muSun = (muSun + 0.2) / 1.2;
float u_r = (r - Rg) / (Rt - Rg);
return texture(lut, vec2(u_muSun, u_r)).rgb;
}
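The irradianceLUT mapping above sends muSun in [-0.2, 1.0] and r in [Rg, Rt] onto the unit square of the 2D irradiance texture. The mapping as a pure function (Python sketch; the Earth-like kilometre values used in testing it are illustrative):

```python
def irradiance_uv(mu_sun, r, Rg, Rt):
    """Bruneton/Collienne mapping from (muSun, r) to 2D texture
    coordinates: muSun in [-0.2, 1.0] -> u in [0, 1],
    r in [Rg, Rt] -> v in [0, 1]."""
    u = (mu_sun + 0.2) / 1.2
    v = (r - Rg) / (Rt - Rg)
    return u, v
```

The -0.2 offset gives the table a margin below the horizon, so twilight irradiance (Sun slightly below the horizon) still lands inside the texture.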
vec3 inscatter(float r, float mu, float muSun, float nu) {
// Be sure to not get a cosine or height out of bounds
r = clamp(r, Rg, Rt);
@@ -59,7 +85,7 @@ vec3 inscatter(float r, float mu, float muSun, float nu) {
float muSun2 = muSun * muSun;
float sinThetaSinSigma = sqrt(1.0 - mu2) * sqrt(1.0 - muSun2);
// cos(sigma + theta) = cos(theta)cos(sigma)-sin(theta)sin(sigma)
// cos(ni) = nu = mu * muSun - sqrt(1.0f - mu*mu)*sqrt(1.0 - muSun*muSun) // sin(theta) = sqrt(1.0 - mu*mu)
// cos(ni) = nu = mu * muSun - sqrt(1.0 - mu*mu)*sqrt(1.0 - muSun*muSun) // sin(theta) = sqrt(1.0 - mu*mu)
// Now we make sure the angle between vec(s) and vec(v) is in the right range:
nu = clamp(nu, muSun * mu - sinThetaSinSigma, muSun * mu + sinThetaSinSigma);
@@ -69,7 +95,7 @@ vec3 inscatter(float r, float mu, float muSun, float nu) {
// -cos(theta) = sqrt(r*r-Rg*Rg)/r
float Rg2 = Rg * Rg;
float r2 = r * r;
float cosHorizon = -sqrt(r2 - Rg2)/r;
float cosHorizon = -sqrt(r2 - Rg2) / r;
// Now we get vec(v) and vec(s) from mu, muSun and nu:
// Assuming:
@@ -97,6 +123,7 @@ vec3 inscatter(float r, float mu, float muSun, float nu) {
// In order to integrate over 4PI, we scan the sphere using the spherical coordinates
// previously defined
vec3 radianceJAcc = vec3(0.0);
for (int theta_i = 0; theta_i < INSCATTER_SPHERICAL_INTEGRAL_SAMPLES; theta_i++) {
float theta = (float(theta_i) + 0.5) * stepTheta;
float cosineTheta = cos(theta);
@@ -131,7 +158,7 @@ vec3 inscatter(float r, float mu, float muSun, float nu) {
// float muGround = (r2 - distanceToGround*distanceToGround - Rg2)/(2*distanceToGround*Rg);
// Access the Transmittance LUT in order to calculate the transmittance from the
// ground point Rg, through the atmosphere, at a distance: distanceToGround
groundTransmittance = transmittance(Rg, muGround, distanceToGround);
groundTransmittance = transmittance(transmittanceTexture, Rg, muGround, distanceToGround, Rg, Rt);
}
for (int phi_i = 0; phi_i < INSCATTER_SPHERICAL_INTEGRAL_SAMPLES; ++phi_i) {
@@ -161,43 +188,48 @@ vec3 inscatter(float r, float mu, float muSun, float nu) {
// We calculate the Rayleigh and Mie phase function for the new scattering angle:
// cos(angle between vec(s) and vec(w)), ||s|| = ||w|| = 1
float nuSW = dot(s, w);
// The first iteraction is different from the others. In the first iteraction all
// The first iteration is different from the others. In the first iteration all
// the light InScattered is coming from the initial pre-computed single InScattered
// light. We stored these values in the deltaS textures (Ray and Mie), and in order
// to avoid problems with the high angle dependency in the phase functions, we don't
// include the phase functions on those tables (that's why we calculate them now).
if (firstIteraction == 1) {
if (firstIteration == 1) {
float phaseRaySW = rayleighPhaseFunction(nuSW);
float phaseMieSW = miePhaseFunction(nuSW, mieG);
// We can now access the values for the single InScattering in the textures deltaS textures.
vec3 singleRay = texture4D(deltaSRTexture, r, w.z, muSun, nuSW).rgb;
vec3 singleMie = texture4D(deltaSMTexture, r, w.z, muSun, nuSW).rgb;
vec3 singleRay = texture4D(deltaSRTexture, r, w.z, muSun, nuSW, Rg, SAMPLES_MU,
Rt, SAMPLES_R, SAMPLES_MU_S, SAMPLES_NU).rgb;
vec3 singleMie = texture4D(deltaSMTexture, r, w.z, muSun, nuSW, Rg, SAMPLES_MU,
Rt, SAMPLES_R, SAMPLES_MU_S, SAMPLES_NU).rgb;
// Initial InScattering including the phase functions
radianceJ1 += singleRay * phaseRaySW + singleMie * phaseMieSW;
}
else {
// On line 9 of the algorithm, the texture table deltaSR is updated, so when we
// are not in the first iteraction, we are getting the updated result of deltaSR
// are not in the first iteration, we are getting the updated result of deltaSR
// (not the single inscattered light but the accumulated (higher order)
// inscattered light.
// w.z is the cosine(theta) = mu for vec(w)
radianceJ1 += texture4D(deltaSRTexture, r, w.z, muSun, nuSW).rgb;
radianceJ1 += texture4D(deltaSRTexture, r, w.z, muSun, nuSW, Rg, SAMPLES_MU, Rt,
SAMPLES_R, SAMPLES_MU_S, SAMPLES_NU).rgb;
}
// Finally, we add the atmospheric scale height (see: Radiative Transfer in the
// Atmosphere and Ocean by Thomas and Stamnes, pp. 9-10)
return radianceJ1 * (betaRayleigh * exp(-(r - Rg) / HR) * phaseRayleighWV +
radianceJAcc += radianceJ1 * (betaRayleigh * exp(-(r - Rg) / HR) * phaseRayleighWV +
betaMieScattering * exp(-(r - Rg) / HM) * phaseMieWV) * dw;
}
}
return radianceJAcc;
}
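The nested theta/phi loops above integrate the inscattered radiance over the whole sphere with the solid-angle element dw = sin(theta) * stepTheta * stepPhi. At the shader's 16x16 sampling these elements should sum to roughly 4*pi, which a quick host-side check (Python, same step constants as the shader) confirms:

```python
import math

N = 16  # INSCATTER_SPHERICAL_INTEGRAL_SAMPLES
step_phi = 2.0 * math.pi / N
step_theta = math.pi / N

total_solid_angle = 0.0
for theta_i in range(N):
    theta = (theta_i + 0.5) * step_theta  # midpoint sampling, as in the shader
    for phi_i in range(N):
        # dw = sin(theta) * dtheta * dphi
        total_solid_angle += math.sin(theta) * step_theta * step_phi
```

The small residual error comes from the midpoint-rule discretization; 16 samples per axis is the precision/cost trade-off the precomputation accepts.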
void main() {
// InScattering Radiance to be calculated at different points in the ray path
// Unmapping the variables from texture texels coordinates to mapped coordinates
float mu, muSun, nu;
unmappingMuMuSunNu(r, dhdH, mu, muSun, nu);
unmappingMuMuSunNu(r, dhdH, SAMPLES_MU, Rg, Rt, SAMPLES_MU_S, SAMPLES_NU, mu, muSun, nu);
// Calculate the light inscattered in direction -vec(v) for the point at height r
// (vec(y)), following Bruneton and Neyret's paper

View File

@@ -28,6 +28,10 @@
out vec4 renderTarget;
uniform int SAMPLES_R;
uniform int SAMPLES_MU;
uniform int SAMPLES_MU_S;
uniform int SAMPLES_NU;
uniform int layer;
uniform sampler3D deltaSRTexture;
uniform sampler3D deltaSMTexture;

View File

@@ -28,6 +28,10 @@
out vec4 renderTarget;
uniform int SAMPLES_R;
uniform int SAMPLES_MU;
uniform int SAMPLES_MU_S;
uniform int SAMPLES_NU;
uniform int layer;
uniform sampler3D deltaSTexture;
@@ -35,10 +39,9 @@ void main() {
vec2 p = gl_FragCoord.xy - vec2(0.5);
float nu = -1.0 + floor(p.x / float(SAMPLES_MU_S)) / (float(SAMPLES_NU) - 1.0) * 2.0;
vec3 uvw = vec3(
gl_FragCoord.xy,
float(layer) + 0.5) / vec3(ivec3(SAMPLES_MU_S * SAMPLES_NU, SAMPLES_MU, SAMPLES_R)
);
vec3 uvw =
vec3(gl_FragCoord.xy, float(layer) + 0.5) /
vec3(ivec3(SAMPLES_MU_S * SAMPLES_NU, SAMPLES_MU, SAMPLES_R));
// See Bruneton and Neyret paper, "Angular Precision" paragraph to understanding why we
// are dividing the S[L*] by the Rayleigh phase function.

View File

@@ -29,6 +29,18 @@
layout(location = 0) out vec4 renderTarget1;
layout(location = 1) out vec4 renderTarget2;
uniform float Rg;
uniform float Rt;
uniform float HR;
uniform vec3 betaRayleigh;
uniform float HO;
uniform float HM;
uniform vec3 betaMieScattering;
uniform bool ozoneLayerEnabled;
uniform int SAMPLES_MU;
uniform int SAMPLES_MU_S;
uniform int SAMPLES_NU;
uniform sampler2D transmittanceTexture;
uniform float r;
uniform vec4 dhdH;
@@ -60,7 +72,9 @@ void integrand(float r, float mu, float muSun, float nu, float y, out vec3 S_R,
if (muSun_i >= -sqrt(1.0 - Rg * Rg / (ri * ri))) {
// It's the transmittance from the point y (ri) to the top of atmosphere in direction
// of the sun (muSun_i) and the transmittance from the observer at x (r) to y (ri).
vec3 transmittanceY = transmittance(r, mu, y) * transmittance(ri, muSun_i);
vec3 transmittanceY =
transmittance(transmittanceTexture, r, mu, y, Rg, Rt) *
transmittance(transmittanceTexture, ri, muSun_i, Rg, Rt);
// exp(-h/H)*T(x,v)
if (ozoneLayerEnabled) {
S_R = (exp(-(ri - Rg) / HO) + exp(-(ri - Rg) / HR)) * transmittanceY;
@@ -83,7 +97,7 @@ void inscatter(float r, float mu, float muSun, float nu, out vec3 S_R, out vec3
S_R = vec3(0.0);
S_M = vec3(0.0);
float rayDist = rayDistance(r, mu);
float rayDist = rayDistance(r, mu, Rt, Rg);
float dy = rayDist / float(INSCATTER_INTEGRAL_SAMPLES);
vec3 S_Ri;
vec3 S_Mi;
@@ -103,13 +117,10 @@ void inscatter(float r, float mu, float muSun, float nu, out vec3 S_R, out vec3
}
void main() {
vec3 S_R; // First Order Rayleigh InScattering
vec3 S_M; // First Order Mie InScattering
// From the layer interpolation (see C++ code for layer to r) and the textures
// parameters (uv), we unmapping mu, muSun and nu.
float mu, muSun, nu;
unmappingMuMuSunNu(r, dhdH, mu, muSun, nu);
unmappingMuMuSunNu(r, dhdH, SAMPLES_MU, Rg, Rt, SAMPLES_MU_S, SAMPLES_NU, mu, muSun, nu);
// Here we calculate the single inScattered light. Because this is a single
// inscattering, the light that arrives at a point y in the path from the eye to the
@@ -122,6 +133,8 @@ void main() {
// S[L0] = P_R*S_R[L0] + P_M*S_M[L0]
// In order to save memory, we just store the red component of S_M[L0], and later we use
// the proportionality rule to calculate the other components.
vec3 S_R; // First Order Rayleigh InScattering
vec3 S_M; // First Order Mie InScattering
inscatter(r, mu, muSun, nu, S_R, S_M);
renderTarget1 = vec4(S_R, 1.0);
renderTarget2 = vec4(S_M, 1.0);

View File

@@ -28,6 +28,13 @@
out vec4 renderTarget;
uniform float Rg;
uniform float Rt;
uniform int SAMPLES_R;
uniform int SAMPLES_MU;
uniform int SAMPLES_MU_S;
uniform int SAMPLES_NU;
uniform sampler2D transmittanceTexture;
uniform float r;
uniform vec4 dhdH;
uniform sampler3D deltaJTexture;
@@ -35,7 +42,7 @@ uniform sampler3D deltaJTexture;
// The integrand here is the f(y) of the trapezoidal rule:
vec3 integrand(float r, float mu, float muSun, float nu, float dist) {
// We can calculate r_i by the cosine law: r_i^2=dist^2 + r^2 - 2*r*dist*cos(PI-theta)
float r_i = sqrt(r * r + dist * dist + 2.0f * r * dist * mu);
float r_i = sqrt(r * r + dist * dist + 2.0 * r * dist * mu);
// mu_i can be found using the dot product:
// vec(y_i) dot vec(dist) = cos(theta_i) * ||vec(y_i)|| * ||vec(dist)||
// But vec(y_i) = vec(x) + vec(dist), also: vec(x) dot vec(dist) = cos(theta) = mu
@@ -46,12 +53,15 @@ vec3 integrand(float r, float mu, float muSun, float nu, float dist) {
// But vec(y_i) = vec(x) + vec(dist), and vec(x) dot vec(s) = muSun, cos(sigma_i + theta_i) = nu
float muSun_i = (r * muSun + dist * nu) / r_i;
// The irradiance attenuated from point r until y (y-x = dist)
return transmittance(r, mu, dist) * texture4D(deltaJTexture, r_i, mu_i, muSun_i, nu).rgb;
return
transmittance(transmittanceTexture, r, mu, dist, Rg, Rt) *
texture4D(deltaJTexture, r_i, mu_i, muSun_i, nu, Rg, SAMPLES_MU, Rt, SAMPLES_R,
SAMPLES_MU_S, SAMPLES_NU).rgb;
}
vec3 inscatter(float r, float mu, float muSun, float nu) {
vec3 inScatteringRadiance = vec3(0.0);
float dy = rayDistance(r, mu) / float(INSCATTER_INTEGRAL_SAMPLES);
float dy = rayDistance(r, mu, Rt, Rg) / float(INSCATTER_INTEGRAL_SAMPLES);
vec3 inScatteringRadiance_i = integrand(r, mu, muSun, nu, 0.0);
// In order to solve the integral from equation (11) we use the trapezoidal rule:
@@ -71,7 +81,7 @@ void main() {
float muSun = 0.0;
float nu = 0.0;
// Unmapping the variables from texture texels coordinates to mapped coordinates
unmappingMuMuSunNu(r, dhdH, mu, muSun, nu);
unmappingMuMuSunNu(r, dhdH, SAMPLES_MU, Rg, Rt, SAMPLES_MU_S, SAMPLES_NU, mu, muSun, nu);
// Write to texture deltaSR
renderTarget = vec4(inscatter(r, mu, muSun, nu), 1.0);

View File

@@ -28,16 +28,24 @@
out vec4 renderTableColor;
void main() {
// See Bruneton and Colliene to understand the mapping
float muSun = -0.2 + (gl_FragCoord.x - 0.5) / (float(OTHER_TEXTURES.x) - 1.0) * 1.2;
float r = Rg + (gl_FragCoord.y - 0.5) / (float(OTHER_TEXTURES.y) ) * RtMinusRg;
uniform float Rg;
uniform float Rt;
uniform ivec2 OTHER_TEXTURES;
uniform sampler2D transmittanceTexture;
// We are calculating the Irradiance for L0, i.e., only the radiance coming from Sun
// direction is accounted:
void main() {
// See Bruneton and Collienne to understand the mapping
float muSun = -0.2 + (gl_FragCoord.x - 0.5) / (float(OTHER_TEXTURES.x) - 1.0) * 1.2;
float r = Rg + (gl_FragCoord.y - 0.5) / (float(OTHER_TEXTURES.y)) * (Rt - Rg);
// We are calculating the Irradiance for L0, i.e., only the radiance coming from the Sun
// direction is accounted for:
// E[L0](x,s) = L0*dot(w,n) or 0 (if v!=s or the sun is occluded).
// Because we consider the Planet as a perfect sphere and we are considering only single
// Because we consider the planet as a perfect sphere and we are considering only single
// scattering here, the dot product dot(w,n) is equal to dot(s,n) that is equal to
// dot(s, r/||r||) = muSun.
renderTableColor = vec4(transmittance(r, muSun) * max(muSun, 0.0), 0.0);
renderTableColor = vec4(
transmittance(transmittanceTexture, r, muSun, Rg, Rt) * max(muSun, 0.0),
0.0
);
}

View File

@@ -24,10 +24,9 @@
#version __CONTEXT__
#include "atmosphere_common.glsl"
out vec4 renderTableColor;
uniform ivec2 OTHER_TEXTURES;
uniform sampler2D deltaETexture;
void main() {

View File

@@ -28,22 +28,32 @@
out vec4 renderTableColor;
uniform int firstIteraction;
uniform float Rg;
uniform float Rt;
uniform float mieG;
uniform ivec2 SKY;
uniform int SAMPLES_R;
uniform int SAMPLES_MU;
uniform int SAMPLES_MU_S;
uniform int SAMPLES_NU;
uniform int firstIteration;
uniform sampler3D deltaSRTexture;
uniform sampler3D deltaSMTexture;
const int IRRADIANCE_INTEGRAL_SAMPLES = 32;
// Spherical coordinate steps. phi in [0, 2PI] and theta in [0, PI/2]
const float stepPhi = (2.0 * M_PI) / float(IRRADIANCE_INTEGRAL_SAMPLES);
const float stepTheta = M_PI / (2.0 * float(IRRADIANCE_INTEGRAL_SAMPLES));
void main() {
// See Bruneton and Colliene to understand the mapping.
// See Bruneton and Collienne to understand the mapping.
float muSun = -0.2 + (gl_FragCoord.x - 0.5) / (float(SKY.x) - 1.0) * 1.2;
float r = Rg + (gl_FragCoord.y - 0.5) / (float(SKY.y) - 1.0) * RtMinusRg;
float r = Rg + (gl_FragCoord.y - 0.5) / (float(SKY.y) - 1.0) * (Rt - Rg);
// We know that muSun = cos(sigma) = s.z/||s||
// But, ||s|| = 1, so s.z = muSun. Also,
// ||s|| = 1, so s.x = sin(sigma) = sqrt(1-muSun^2) and s.y = 0.0f
// ||s|| = 1, so s.x = sin(sigma) = sqrt(1-muSun^2) and s.y = 0.0
vec3 s = vec3(max(sqrt(1.0 - muSun * muSun), 0.0), 0.0, muSun);
// In order to solve the integral from equation (15) we use the trapezoidal rule:
@@ -60,30 +70,32 @@ void main() {
vec3 w = vec3(cos(phi) * sin(theta), sin(phi) * sin(theta), cos(theta));
float nu = dot(s, w);
// The first iteraction is different from the others, that's because in the first
// iteraction all the light arriving are coming from the initial pre-computed
// single scattered light. We stored these values in the deltaS textures (Ray and
// Mie), and in order to avoid problems with the high angle dependency in the phase
// functions, we don't include the phase functions on those tables (that's why we
// calculate them now)
if (firstIteraction == 1) {
// The first iteration is different from the others as in the first iteration all
// the light arriving is coming from the initial pre-computed single scattered
// light. We stored these values in the deltaS textures (Ray and Mie), and in order
// to avoid problems with the high angle dependency in the phase functions, we don't
// include the phase functions on those tables (that's why we calculate them now)
if (firstIteration == 1) {
float phaseRay = rayleighPhaseFunction(nu);
float phaseMie = miePhaseFunction(nu, mieG);
vec3 singleRay = texture4D(deltaSRTexture, r, w.z, muSun, nu).rgb;
vec3 singleMie = texture4D(deltaSMTexture, r, w.z, muSun, nu).rgb;
vec3 singleRay = texture4D(deltaSRTexture, r, w.z, muSun, nu, Rg, SAMPLES_MU, Rt,
SAMPLES_R, SAMPLES_MU_S, SAMPLES_NU).rgb;
vec3 singleMie = texture4D(deltaSMTexture, r, w.z, muSun, nu, Rg, SAMPLES_MU, Rt,
SAMPLES_R, SAMPLES_MU_S, SAMPLES_NU).rgb;
// w.z is the cosine(theta) = mu for vec(w) and also vec(w) dot vec(n(xo))
irradianceE += (singleRay * phaseRay + singleMie * phaseMie) * w.z * dw;
}
else {
// On line 10 of the algorithm, the texture table deltaE is updated, so when we
// are not in the first iteraction, we are getting the updated result of deltaE
// are not in the first iteration, we are getting the updated result of deltaE
// (not the single irradiance light but the accumulated (higher order) irradiance
// light. w.z is the cosine(theta) = mu for vec(w) and also vec(w) dot vec(n(xo))
irradianceE += texture4D(deltaSRTexture, r, w.z, muSun, nu).rgb * w.z * dw;
irradianceE += texture4D(deltaSRTexture, r, w.z, muSun, nu, Rg, SAMPLES_MU, Rt,
SAMPLES_R, SAMPLES_MU_S, SAMPLES_NU).rgb * w.z * dw;
}
}
}
// Write the higher oder irradiance to texture deltaE
// Write the higher order irradiance to texture deltaE
renderTableColor = vec4(irradianceE, 0.0);
}

View File

@@ -28,6 +28,19 @@
out vec4 renderTableColor;
uniform float Rg;
uniform float Rt;
uniform float HR;
uniform vec3 betaRayleigh;
uniform float HO;
uniform vec3 betaOzoneExtinction;
uniform float HM;
uniform vec3 betaMieExtinction;
uniform bool ozoneLayerEnabled;
uniform ivec2 TRANSMITTANCE;
const int TRANSMITTANCE_STEPS = 500;
// Optical depth by integration, from ray starting at point vec(x), i.e., height r and
// angle mu (cosine of vec(v)) until top of atmosphere or planet's ground.
// r := height of starting point vec(x)
@@ -42,14 +55,14 @@ float opticalDepth(float r, float mu, float H) {
// direction and starting and ending points.
// cosine law for triangles: y_i^2 = a^2 + b^2 - 2abcos(alpha)
float cosZenithHorizon = -sqrt(1.0 - (Rg * Rg / r2));
float cosZenithHorizon = -sqrt(1.0 - ((Rg * Rg) / r2));
if (mu < cosZenithHorizon) {
return 1e9;
}
// Integrating using the Trapezoidal rule:
// Integral(f(y)dy)(from a to b) = ((b-a)/2n_steps)*(Sum(f(y_i+1)+f(y_i)))
float b_a = rayDistance(r, mu);
float b_a = rayDistance(r, mu, Rt, Rg);
float deltaStep = b_a / float(TRANSMITTANCE_STEPS);
// cosine law
float y_i = exp(-(r - Rg) / H);
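The hunk above documents the integration scheme: the optical depth is accumulated with the composite trapezoidal rule, Integral(f(y)dy)(from a to b) = ((b-a)/2n_steps)*(Sum(f(y_i+1)+f(y_i))), over TRANSMITTANCE_STEPS samples of the exponential density profile exp(-(r - Rg)/H). A minimal standalone C++ sketch of that rule (a vertical ray is assumed here for simplicity; the shader instead applies the cosine law to recover the height at each step):

```cpp
#include <cassert>
#include <cmath>

// Trapezoidal integration of an exponential density profile along a ray,
// mirroring the shader's opticalDepth(). 'rayLength' plays the role of
// b_a = rayDistance(r, mu, Rt, Rg).
double opticalDepth(double r, double Rg, double H, double rayLength, int steps) {
    const double dx = rayLength / steps;   // (b - a) / n_steps
    double sum = 0.0;
    double prev = std::exp(-(r - Rg) / H); // f(y_0)
    for (int i = 1; i <= steps; ++i) {
        // Vertical ray: the height grows by dx per step. The real shader
        // uses the cosine law to find the height along an arbitrary ray.
        double h = (r + i * dx) - Rg;
        double cur = std::exp(-h / H);
        sum += 0.5 * (prev + cur) * dx;    // trapezoid: (f(y_i) + f(y_{i+1})) / 2 * dx
        prev = cur;
    }
    return sum;
}
```

For a vertical ray the exact integral is H(1 - exp(-L/H)), so with H = 8 and L = 800 the result should be very close to 8.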
@@ -72,11 +85,11 @@ void main() {
// In the paper u_r^2 = (r^2-Rg^2)/(Rt^2-Rg^2)
// So, extracting r from u_r in the above equation:
float r = Rg + (u_r * u_r) * RtMinusRg;
float r = Rg + (u_r * u_r) * (Rt - Rg);
// In the paper the Bruneton suggest mu = dot(v,x)/||x|| with ||v|| = 1.0
// Later he proposes u_mu = (1-exp(-3mu-0.6))/(1-exp(-3.6))
// But the below one is better. See Colliene.
// But the below one is better. See Collienne.
// One must remember that mu is defined from 0 to PI/2 + epsilon
float muSun = -0.15 + tan(1.5 * u_mu) / tan(1.5) * 1.15;
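This hunk unmaps normalized lookup-table coordinates back to physical parameters: r = Rg + u_r^2 (Rt - Rg), and the Collienne-style angle mapping muSun = -0.15 + tan(1.5 u_mu)/tan(1.5) * 1.15. A standalone sketch of the two remappings (the Earth-like Rg/Rt values in the test are assumptions, not taken from the diff):

```cpp
#include <cassert>
#include <cmath>

struct Params { double r; double muSun; };

// Unmap texture coordinates back to physical parameters, as in the shader's
// main(): r in [Rg, Rt], muSun in [-0.15, 1.0]
Params unmapTransmittanceCoords(double u_r, double u_mu, double Rg, double Rt) {
    // In the paper u_r^2 = (r^2 - Rg^2)/(Rt^2 - Rg^2); this variant uses the
    // simpler quadratic remapping r = Rg + u_r^2 (Rt - Rg)
    double r = Rg + (u_r * u_r) * (Rt - Rg);
    // Collienne-style angular remapping; samples more densely near the horizon
    double muSun = -0.15 + std::tan(1.5 * u_mu) / std::tan(1.5) * 1.15;
    return { r, muSun };
}
```

The endpoints map as expected: u_r = 0 gives r = Rg, u_r = 1 gives r = Rt, and u_mu sweeps muSun from -0.15 to 1.0.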

View File

@@ -219,6 +219,7 @@ std::vector<documentation::Documentation> BaseModule::documentations() const {
ScreenSpaceImageLocal::Documentation(),
ScreenSpaceImageOnline::Documentation(),
ConstantRotation::Documentation(),
FixedRotation::Documentation(),
LuaRotation::Documentation(),
StaticRotation::Documentation(),

View File

@@ -230,11 +230,11 @@ void RenderableSphericalGrid::update(const UpdateData&) {
normal = glm::normalize(normal);
}
glm::vec4 tmp(x, y, z, 1);
glm::vec4 tmp(x, y, z, 1.f);
glm::mat4 rot = glm::rotate(
glm::mat4(1),
glm::mat4(1.f),
glm::half_pi<float>(),
glm::vec3(1, 0, 0)
glm::vec3(1.f, 0.f, 0.f)
);
tmp = glm::vec4(glm::dmat4(rot) * glm::dvec4(tmp));

View File

@@ -118,14 +118,7 @@ void RenderableCartesianAxes::initializeGL() {
);
glGenVertexArrays(1, &_vaoId);
glGenBuffers(1, &_vBufferId);
glGenBuffers(1, &_iBufferId);
glBindVertexArray(_vaoId);
glBindBuffer(GL_ARRAY_BUFFER, _vBufferId);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _iBufferId);
glEnableVertexAttribArray(0);
glBindVertexArray(0);
std::vector<Vertex> vertices({
Vertex{0.f, 0.f, 0.f},
@@ -140,7 +133,7 @@ void RenderableCartesianAxes::initializeGL() {
0, 3
};
glBindVertexArray(_vaoId);
glGenBuffers(1, &_vBufferId);
glBindBuffer(GL_ARRAY_BUFFER, _vBufferId);
glBufferData(
GL_ARRAY_BUFFER,
@@ -149,8 +142,10 @@ void RenderableCartesianAxes::initializeGL() {
GL_STATIC_DRAW
);
glEnableVertexAttribArray(0);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), nullptr);
glGenBuffers(1, &_iBufferId);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _iBufferId);
glBufferData(
GL_ELEMENT_ARRAY_BUFFER,
@@ -158,6 +153,7 @@ void RenderableCartesianAxes::initializeGL() {
indices.data(),
GL_STATIC_DRAW
);
glBindVertexArray(0);
}
void RenderableCartesianAxes::deinitializeGL() {
@@ -201,9 +197,9 @@ void RenderableCartesianAxes::render(const RenderData& data, RendererTasks&){
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glEnablei(GL_BLEND, 0);
glEnable(GL_LINE_SMOOTH);
glLineWidth(3.0);
glBindVertexArray(_vaoId);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, _iBufferId);
glDrawElements(GL_LINES, NVertexIndices, GL_UNSIGNED_INT, nullptr);
glBindVertexArray(0);

View File

@@ -224,21 +224,21 @@ RenderableLabels::RenderableLabels(const ghoul::Dictionary& dictionary)
: Renderable(dictionary)
, _blendMode(BlendModeInfo, properties::OptionProperty::DisplayType::Dropdown)
, _color(ColorInfo, glm::vec3(1.f), glm::vec3(0.f), glm::vec3(1.f))
, _size(SizeInfo, 8.f, 0.5f, 30.f)
, _fontSize(FontSizeInfo, 50.f, 1.f, 100.f)
, _size(SizeInfo, 8.f, 0.5f, 30.f)
, _minMaxSize(MinMaxSizeInfo, glm::ivec2(8, 20), glm::ivec2(0), glm::ivec2(100))
, _enableFadingEffect(EnableFadingEffectInfo, false)
, _text(TextInfo, "")
, _fadeDistances(FadeDistancesInfo, glm::vec2(1.f), glm::vec2(0.f), glm::vec2(100.f))
, _enableFadingEffect(EnableFadingEffectInfo, false)
, _fadeWidths(FadeWidthsInfo, glm::vec2(1.f), glm::vec2(0.f), glm::vec2(100.f))
, _orientationOption(
OrientationOptionInfo,
properties::OptionProperty::DisplayType::Dropdown
)
, _fadeDistances(FadeDistancesInfo, glm::vec2(1.f), glm::vec2(0.f), glm::vec2(100.f))
, _fadeUnitOption(
FadeUnitOptionInfo,
properties::OptionProperty::DisplayType::Dropdown
)
, _orientationOption(
OrientationOptionInfo,
properties::OptionProperty::DisplayType::Dropdown
)
{
const Parameters p = codegen::bake<Parameters>(dictionary);

View File

@@ -270,21 +270,7 @@ void RenderablePlane::render(const RenderData& data, RendererTasks&) {
_shader->setUniform("multiplyColor", _multiplyColor);
bool usingFramebufferRenderer = global::renderEngine->rendererImplementation() ==
RenderEngine::RendererImplementation::Framebuffer;
bool usingABufferRenderer = global::renderEngine->rendererImplementation() ==
RenderEngine::RendererImplementation::ABuffer;
if (usingABufferRenderer) {
_shader->setUniform(
"additiveBlending",
_blendMode == static_cast<int>(BlendMode::Additive)
);
}
bool additiveBlending =
(_blendMode == static_cast<int>(BlendMode::Additive)) && usingFramebufferRenderer;
bool additiveBlending = (_blendMode == static_cast<int>(BlendMode::Additive));
if (additiveBlending) {
glDepthMask(false);
glBlendFunc(GL_SRC_ALPHA, GL_ONE);

View File

@@ -196,7 +196,7 @@ void RenderablePlaneImageLocal::loadTexture() {
LDEBUGC(
"RenderablePlaneImageLocal",
fmt::format("Loaded texture from '{}'", absPath(path))
fmt::format("Loaded texture from {}", absPath(path))
);
texture->uploadTexture();
texture->setFilter(ghoul::opengl::Texture::FilterMode::LinearMipMap);

View File

@@ -96,53 +96,55 @@ void RenderablePlaneImageOnline::bindTexture() {
}
void RenderablePlaneImageOnline::update(const UpdateData&) {
if (_textureIsDirty) {
if (!_imageFuture.valid()) {
std::future<DownloadManager::MemoryFile> future = downloadImageToMemory(
_texturePath
if (!_textureIsDirty) {
return;
}
if (!_imageFuture.valid()) {
std::future<DownloadManager::MemoryFile> future = downloadImageToMemory(
_texturePath
);
if (future.valid()) {
_imageFuture = std::move(future);
}
}
if (_imageFuture.valid() && DownloadManager::futureReady(_imageFuture)) {
DownloadManager::MemoryFile imageFile = _imageFuture.get();
if (imageFile.corrupted) {
LERRORC(
"ScreenSpaceImageOnline",
fmt::format("Error loading image from URL '{}'", _texturePath)
);
if (future.valid()) {
_imageFuture = std::move(future);
}
return;
}
if (_imageFuture.valid() && DownloadManager::futureReady(_imageFuture)) {
DownloadManager::MemoryFile imageFile = _imageFuture.get();
if (imageFile.corrupted) {
LERRORC(
"ScreenSpaceImageOnline",
fmt::format("Error loading image from URL '{}'", _texturePath)
try {
std::unique_ptr<ghoul::opengl::Texture> texture =
ghoul::io::TextureReader::ref().loadTexture(
reinterpret_cast<void*>(imageFile.buffer),
imageFile.size,
imageFile.format
);
return;
}
try {
std::unique_ptr<ghoul::opengl::Texture> texture =
ghoul::io::TextureReader::ref().loadTexture(
reinterpret_cast<void*>(imageFile.buffer),
imageFile.size,
imageFile.format
);
if (texture) {
// Images don't need to start on 4-byte boundaries, for example if the
// image is only RGB
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
if (texture) {
// Images don't need to start on 4-byte boundaries, for example if the
// image is only RGB
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
texture->uploadTexture();
texture->setFilter(ghoul::opengl::Texture::FilterMode::LinearMipMap);
texture->purgeFromRAM();
texture->uploadTexture();
texture->setFilter(ghoul::opengl::Texture::FilterMode::LinearMipMap);
texture->purgeFromRAM();
_texture = std::move(texture);
_textureIsDirty = false;
}
}
catch (const ghoul::io::TextureReader::InvalidLoadException& e) {
_texture = std::move(texture);
_textureIsDirty = false;
LERRORC(e.component, e.message);
}
}
catch (const ghoul::io::TextureReader::InvalidLoadException& e) {
_textureIsDirty = false;
LERRORC(e.component, e.message);
}
}
}
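The update() rewrite above is a guard-clause refactor: the enclosing `if (_textureIsDirty)` becomes an early `return`, so the main path loses one level of nesting without changing behavior. A schematic sketch of the same transformation:

```cpp
#include <cassert>

// Before: the whole body sat inside `if (dirty) { ... }`.
// After: bail out early, keeping the main path at the top level, as in the
// reworked RenderablePlaneImageOnline::update()
int processIfDirty(bool dirty, int value) {
    if (!dirty) {
        return value;  // nothing to do, early return
    }
    // ... main update path runs un-nested from here on
    return value * 2;
}
```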

View File

@@ -215,7 +215,10 @@ void RenderablePrism::updateVertexData() {
for (int i = 0; i < 2; ++i) {
float h = i * _length; // z value, 0 to _length
for (int j = 0, k = 0; j < _nShapeSegments && k < unitVertices.size(); ++j, k += 2) {
for (int j = 0, k = 0;
j < _nShapeSegments && k < static_cast<int>(unitVertices.size());
++j, k += 2)
{
float ux = unitVertices[k];
float uy = unitVertices[k + 1];
@@ -239,7 +242,10 @@ void RenderablePrism::updateVertexData() {
_vertexArray.push_back(_length);
}
else {
for (int j = 0, k = 0; j < _nLines && k < unitVerticesLines.size(); ++j, k += 2) {
for (int j = 0, k = 0;
j < _nLines && k < static_cast<int>(unitVerticesLines.size());
++j, k += 2)
{
float ux = unitVerticesLines[k];
float uy = unitVerticesLines[k + 1];
@@ -268,8 +274,8 @@ void RenderablePrism::updateVertexData() {
_indexArray.push_back(255);
// Indices for Top shape
for (uint8_t i = _nShapeSegments; i < 2 * _nShapeSegments; ++i) {
_indexArray.push_back(i);
for (int i = _nShapeSegments; i < 2 * _nShapeSegments; ++i) {
_indexArray.push_back(static_cast<uint8_t>(i));
}
// Indices for connecting lines
@@ -277,8 +283,8 @@ void RenderablePrism::updateVertexData() {
// Reset
_indexArray.push_back(255);
_indexArray.push_back(2 * _nShapeSegments + k);
_indexArray.push_back(2 * _nShapeSegments + k + 1);
_indexArray.push_back(static_cast<uint8_t>(2 * _nShapeSegments + k));
_indexArray.push_back(static_cast<uint8_t>(2 * _nShapeSegments + k + 1));
}
}
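The casts above keep the loop counters as plain int (avoiding signed/unsigned comparison warnings) and narrow to uint8_t only at the push_back, since the index buffer stores byte indices with 255 reserved as the restart marker. A sketch of building such an index list for the top shape (nShapeSegments value in the test is an assumption):

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Build a uint8_t index list for the top shape of a prism, with 255 as the
// restart marker, as in RenderablePrism::updateVertexData(). The counter
// stays int; the narrowing happens explicitly at the push_back.
std::vector<uint8_t> topShapeIndices(int nShapeSegments) {
    std::vector<uint8_t> indices;
    indices.push_back(255); // reset: start a new line loop
    for (int i = nShapeSegments; i < 2 * nShapeSegments; ++i) {
        indices.push_back(static_cast<uint8_t>(i));
    }
    return indices;
}
```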

View File

@@ -398,24 +398,14 @@ void RenderableSphere::render(const RenderData& data, RendererTasks&) {
glDisable(GL_CULL_FACE);
}
bool usingFramebufferRenderer = global::renderEngine->rendererImplementation() ==
RenderEngine::RendererImplementation::Framebuffer;
bool usingABufferRenderer = global::renderEngine->rendererImplementation() ==
RenderEngine::RendererImplementation::ABuffer;
if (usingABufferRenderer && _useAdditiveBlending) {
_shader->setUniform("additiveBlending", true);
}
if (usingFramebufferRenderer && _useAdditiveBlending) {
if (_useAdditiveBlending) {
glBlendFunc(GL_SRC_ALPHA, GL_ONE);
glDepthMask(false);
}
_sphere->render();
if (usingFramebufferRenderer && _useAdditiveBlending) {
if (_useAdditiveBlending) {
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDepthMask(true);
}
@@ -452,7 +442,7 @@ void RenderableSphere::loadTexture() {
if (texture) {
LDEBUGC(
"RenderableSphere",
fmt::format("Loaded texture from '{}'", absPath(_texturePath))
fmt::format("Loaded texture from {}", absPath(_texturePath))
);
texture->uploadTexture();
texture->setFilter(ghoul::opengl::Texture::FilterMode::LinearMipMap);

View File

@@ -430,14 +430,8 @@ void RenderableTrail::render(const RenderData& data, RendererTasks&) {
/*glm::ivec2 resolution = global::renderEngine.renderingResolution();
_programObject->setUniform(_uniformCache.resolution, resolution);*/
const bool usingFramebufferRenderer =
global::renderEngine->rendererImplementation() ==
RenderEngine::RendererImplementation::Framebuffer;
if (usingFramebufferRenderer) {
glDepthMask(false);
glBlendFunc(GL_SRC_ALPHA, GL_ONE);
}
glDepthMask(false);
glBlendFunc(GL_SRC_ALPHA, GL_ONE);
const bool renderLines = (_appearance.renderingModes == RenderingModeLines) ||
(_appearance.renderingModes == RenderingModeLinesPoints);
@@ -508,10 +502,8 @@ void RenderableTrail::render(const RenderData& data, RendererTasks&) {
glBindVertexArray(0);
if (usingFramebufferRenderer) {
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDepthMask(true);
}
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDepthMask(true);
_programObject->deactivate();
}

View File

@@ -158,10 +158,12 @@ RenderableTrailOrbit::RenderableTrailOrbit(const ghoul::Dictionary& dictionary)
using namespace std::chrono;
_period = p.period * duration_cast<seconds>(hours(24)).count();
_period.onChange([&] { _needsFullSweep = true; _indexBufferDirty = true; });
_period.setExponent(5.f);
addProperty(_period);
_resolution = p.resolution;
_resolution.onChange([&] { _needsFullSweep = true; _indexBufferDirty = true; });
_resolution.setExponent(3.5f);
addProperty(_resolution);
// We store the vertices with (excluding the wrapping) descending temporal order
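Wait, this anchor is unused.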

View File

@@ -80,7 +80,7 @@ ScreenSpaceFramebuffer::ScreenSpaceFramebuffer(const ghoul::Dictionary& dictiona
glm::vec2 resolution = global::windowDelegate->currentDrawBufferResolution();
addProperty(_size);
_size.set(glm::vec4(0, 0, resolution.x,resolution.y));
_size.set(glm::vec4(0.f, 0.f, resolution.x, resolution.y));
}
ScreenSpaceFramebuffer::~ScreenSpaceFramebuffer() {} // NOLINT

View File

@@ -79,7 +79,7 @@ LuaRotation::LuaRotation(const ghoul::Dictionary& dictionary) : LuaRotation() {
}
glm::dmat3 LuaRotation::matrix(const UpdateData& data) const {
ghoul::lua::runScriptFile(_state, _luaScriptFile);
ghoul::lua::runScriptFile(_state, _luaScriptFile.value());
// Get the scaling function
lua_getglobal(_state, "rotation");
@@ -87,7 +87,9 @@ glm::dmat3 LuaRotation::matrix(const UpdateData& data) const {
if (!isFunction) {
LERRORC(
"LuaRotation",
fmt::format("Script '{}' does nto have a function 'rotation'", _luaScriptFile)
fmt::format(
"Script '{}' does not have a function 'rotation'", _luaScriptFile.value()
)
);
return glm::dmat3(1.0);
}

View File

@@ -77,7 +77,7 @@ LuaScale::LuaScale(const ghoul::Dictionary& dictionary) : LuaScale() {
}
glm::dvec3 LuaScale::scaleValue(const UpdateData& data) const {
ghoul::lua::runScriptFile(_state, _luaScriptFile);
ghoul::lua::runScriptFile(_state, _luaScriptFile.value());
// Get the scaling function
lua_getglobal(_state, "scale");
@@ -85,7 +85,9 @@ glm::dvec3 LuaScale::scaleValue(const UpdateData& data) const {
if (!isFunction) {
LERRORC(
"LuaScale",
fmt::format("Script '{}' does not have a function 'scale'", _luaScriptFile)
fmt::format(
"Script '{}' does not have a function 'scale'", _luaScriptFile.value()
)
);
return glm::dvec3(1.0);
}

View File

@@ -33,17 +33,20 @@ uniform vec3 yColor;
uniform vec3 zColor;
Fragment getFragment() {
Fragment frag;
Fragment frag;
vec3 colorComponents = step(0.01, vs_positionModelSpace);
// We compare against a small value as the first vertex doesn't have a positional
// information (or rather it is 0) and we don't want to miss out on the color close to
// the origin
vec3 colorComponents = step(2e-32, vs_positionModelSpace);
frag.color.rgb = colorComponents.x * xColor +
colorComponents.y * yColor +
colorComponents.z * zColor;
frag.color.a = 1.0;
frag.color.rgb = colorComponents.x * xColor +
colorComponents.y * yColor +
colorComponents.z * zColor;
frag.color.a = 1.0;
frag.depth = vs_screenSpaceDepth;
frag.gPosition = vs_positionViewSpace;
frag.gNormal = vec4(0.0, 0.0, 0.0, 1.0);
return frag;
frag.depth = vs_screenSpaceDepth;
frag.gPosition = vs_positionViewSpace;
frag.gNormal = vec4(0.0, 0.0, 0.0, 1.0);
return frag;
}
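The rewritten fragment shader above replaces the 0.01 step threshold with a tiny epsilon (2e-32), so that vertices arbitrarily close to the origin still pick up their axis color, as the new comment explains. A scalar C++ emulation of the componentwise step() masking:

```cpp
#include <array>
#include <cassert>

// Emulate GLSL step(edge, x): 0.0 if x < edge, else 1.0
double step(double edge, double x) { return x < edge ? 0.0 : 1.0; }

// Select an axis color by masking with the model-space position components,
// as in the axes fragment shader. The tiny epsilon keeps points near the
// origin colored, which a 0.01 threshold would miss.
std::array<double, 3> axisColor(std::array<double, 3> pos,
                                std::array<double, 3> xColor,
                                std::array<double, 3> yColor,
                                std::array<double, 3> zColor)
{
    const double cx = step(2e-32, pos[0]);
    const double cy = step(2e-32, pos[1]);
    const double cz = step(2e-32, pos[2]);
    std::array<double, 3> c{};
    for (int i = 0; i < 3; ++i) {
        c[i] = cx * xColor[i] + cy * yColor[i] + cz * zColor[i];
    }
    return c;
}
```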

View File

@@ -34,13 +34,13 @@ uniform mat4 modelViewTransform;
uniform mat4 projectionTransform;
void main() {
vec4 positionViewSpace = modelViewTransform * vec4(in_position, 1.0);
vec4 positionClipSpace = projectionTransform * positionViewSpace;
vec4 positionScreenSpace = positionClipSpace;
positionScreenSpace.z = 0.0;
vs_positionModelSpace = in_position;
vs_screenSpaceDepth = positionScreenSpace.w;
vs_positionViewSpace = positionViewSpace;
vec4 positionViewSpace = modelViewTransform * vec4(in_position, 1.0);
vec4 positionClipSpace = projectionTransform * positionViewSpace;
vec4 positionScreenSpace = positionClipSpace;
positionScreenSpace.z = 0.0;
vs_positionModelSpace = in_position;
vs_screenSpaceDepth = positionScreenSpace.w;
vs_positionViewSpace = positionViewSpace;
gl_Position = positionScreenSpace;
gl_Position = positionScreenSpace;
}

View File

@@ -81,7 +81,7 @@ LuaTranslation::LuaTranslation(const ghoul::Dictionary& dictionary) : LuaTransla
}
glm::dvec3 LuaTranslation::position(const UpdateData& data) const {
ghoul::lua::runScriptFile(_state, _luaScriptFile);
ghoul::lua::runScriptFile(_state, _luaScriptFile.value());
// Get the scaling function
lua_getglobal(_state, "translation");
@@ -91,7 +91,7 @@ glm::dvec3 LuaTranslation::position(const UpdateData& data) const {
"LuaScale",
fmt::format(
"Script '{}' does not have a function 'translation'",
_luaScriptFile
_luaScriptFile.value()
)
);
return glm::dvec3(0.0);
@@ -119,7 +119,7 @@ glm::dvec3 LuaTranslation::position(const UpdateData& data) const {
double values[3];
for (int i = 1; i <= 3; ++i) {
values[i] = ghoul::lua::value<double>(_state, i);
values[i - 1] = ghoul::lua::value<double>(_state, i);
}
return glm::make_vec3(values);
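The one-character fix above addresses an off-by-one: Lua stack indices are 1-based while the C array is 0-based, so writing `values[i]` for i = 1..3 skipped `values[0]` and wrote one element past the end. A minimal sketch of the corrected mapping, with a plain array standing in for the Lua stack (no Lua dependency):

```cpp
#include <cassert>

// Copy three 1-based "stack" slots into a 0-based C array, as in the fixed
// LuaTranslation::position(). stack[0] is unused, mirroring Lua's 1-based
// indexing where value<double>(_state, i) reads slot i.
void copyStackValues(const double stack[4], double values[3]) {
    for (int i = 1; i <= 3; ++i) {
        values[i - 1] = stack[i]; // was values[i]: out-of-bounds write at i == 3
    }
}
```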

View File

@@ -49,14 +49,11 @@ documentation::Documentation StaticTranslation::Documentation() {
}
StaticTranslation::StaticTranslation()
: _position(
PositionInfo,
glm::dvec3(0.0),
glm::dvec3(-std::numeric_limits<double>::max()),
glm::dvec3(std::numeric_limits<double>::max())
)
: _position(PositionInfo, glm::dvec3(0.0), glm::dvec3(-1e35), glm::dvec3(1e35))
{
_position.setExponent(20.f);
// @TODO (2021-06-24, emmbr) The exponential sliders do not handle ranges with
// negative values very well. When they do, this line can be uncommented
//_position.setExponent(20.f);
addProperty(_position);
_position.onChange([this]() {

View File

@@ -970,7 +970,7 @@ void RenderableBillboardsCloud::update(const UpdateData&) {
_spriteTexture = DigitalUniverseModule::TextureManager.request(
std::to_string(hash),
[path = _spriteTexturePath]() -> std::unique_ptr<ghoul::opengl::Texture> {
LINFO(fmt::format("Loaded texture from '{}'", absPath(path)));
LINFO(fmt::format("Loaded texture from {}", absPath(path)));
std::unique_ptr<ghoul::opengl::Texture> t =
ghoul::io::TextureReader::ref().loadTexture(absPath(path).string());
t->uploadTexture();

View File

@@ -484,7 +484,7 @@ void RenderableDUMeshes::update(const UpdateData&) {
bool RenderableDUMeshes::loadData() {
bool success = false;
if (_hasSpeckFile) {
LINFO(fmt::format("Loading Speck file '{}'", _speckFile));
LINFO(fmt::format("Loading Speck file {}", std::filesystem::path(_speckFile)));
success = readSpeckFile();
if (!success) {
return false;
@@ -502,7 +502,9 @@ bool RenderableDUMeshes::loadData() {
bool RenderableDUMeshes::readSpeckFile() {
std::ifstream file(_speckFile);
if (!file.good()) {
LERROR(fmt::format("Failed to open Speck file '{}'", _speckFile));
LERROR(fmt::format(
"Failed to open Speck file {}", std::filesystem::path(_speckFile)
));
return false;
}
@@ -513,7 +515,6 @@ bool RenderableDUMeshes::readSpeckFile() {
// (signaled by the keywords 'datavar', 'texturevar', and 'texture')
std::string line;
while (true) {
std::streampos position = file.tellg();
std::getline(file, line);
if (file.eof()) {
@@ -532,16 +533,9 @@ bool RenderableDUMeshes::readSpeckFile() {
std::size_t found = line.find("mesh");
if (found == std::string::npos) {
//if (line.substr(0, 4) != "mesh") {
// we read a line that doesn't belong to the header, so we have to jump back
// before the beginning of the current line
//file.seekg(position);
//break;
continue;
}
else {
//if (line.substr(0, 4) == "mesh") {
// mesh lines are structured as follows:
// mesh -t texnum -c colorindex -s style {
// where texnum is the index of the texture;

View File

@@ -272,7 +272,7 @@ RenderablePlanesCloud::RenderablePlanesCloud(const ghoul::Dictionary& dictionary
addProperty(_opacity);
if (p.file.has_value()) {
_speckFile = absPath(*p.file).string();
_speckFile = absPath(*p.file);
_hasSpeckFile = true;
_drawElements.onChange([&]() { _hasSpeckFile = !_hasSpeckFile; });
addProperty(_drawElements);
@@ -320,7 +320,7 @@ RenderablePlanesCloud::RenderablePlanesCloud(const ghoul::Dictionary& dictionary
_scaleFactor.onChange([&]() { _dataIsDirty = true; });
if (p.labelFile.has_value()) {
_labelFile = absPath(*p.labelFile).string();
_labelFile = absPath(*p.labelFile);
_hasLabel = true;
_textColor = p.textColor.value_or(_textColor);
@@ -368,7 +368,7 @@ RenderablePlanesCloud::RenderablePlanesCloud(const ghoul::Dictionary& dictionary
}
}
_texturesPath = absPath(p.texturePath).string();
_texturesPath = absPath(p.texturePath);
_luminosityVar = p.luminosity.value_or(_luminosityVar);
_sluminosity = p.scaleLuminosity.value_or(_sluminosity);
@@ -403,7 +403,7 @@ void RenderablePlanesCloud::initialize() {
}
if (!_labelFile.empty()) {
LINFO(fmt::format("Loading Label file '{}'", _labelFile));
LINFO(fmt::format("Loading Label file {}", _labelFile));
_labelset = speck::label::loadFileWithCache(_labelFile);
for (speck::Labelset::Entry& e : _labelset.entries) {
e.position = glm::vec3(_transformationMatrix * glm::dvec4(e.position, 1.0));
@@ -612,7 +612,7 @@ void RenderablePlanesCloud::update(const UpdateData&) {
void RenderablePlanesCloud::loadTextures() {
for (const speck::Dataset::Texture& tex : _dataset.textures) {
std::filesystem::path fullPath = absPath(_texturesPath + '/' + tex.file);
std::filesystem::path fullPath = absPath(_texturesPath.string() + '/' + tex.file);
std::filesystem::path pngPath = fullPath;
pngPath.replace_extension(".png");
@@ -634,7 +634,7 @@ void RenderablePlanesCloud::loadTextures() {
ghoul::io::TextureReader::ref().loadTexture(path.string());
if (t) {
LINFOC("RenderablePlanesCloud", fmt::format("Loaded texture '{}'", path));
LINFOC("RenderablePlanesCloud", fmt::format("Loaded texture {}", path));
t->uploadTexture();
t->setFilter(ghoul::opengl::Texture::FilterMode::LinearMipMap);
t->purgeFromRAM();

View File

@@ -34,10 +34,9 @@
#include <openspace/properties/scalar/floatproperty.h>
#include <openspace/properties/vector/vec2property.h>
#include <openspace/properties/vector/vec3property.h>
#include <ghoul/opengl/ghoul_gl.h>
#include <ghoul/opengl/uniformcache.h>
#include <filesystem>
#include <functional>
#include <unordered_map>
@@ -129,9 +128,9 @@ private:
std::unordered_map<int, std::string> _textureFileMap;
std::unordered_map<int, PlaneAggregate> _planesMap;
std::string _speckFile;
std::string _labelFile;
std::string _texturesPath;
std::filesystem::path _speckFile;
std::filesystem::path _labelFile;
std::filesystem::path _texturesPath;
std::string _luminosityVar;
Unit _unit = Parsec;

View File

@@ -135,7 +135,7 @@ RenderablePoints::RenderablePoints(const ghoul::Dictionary& dictionary)
addProperty(_opacity);
registerUpdateRenderBinFromOpacity();
_speckFile = absPath(p.file).string();
_speckFile = absPath(p.file);
if (p.unit.has_value()) {
switch (*p.unit) {
@@ -185,7 +185,7 @@ RenderablePoints::RenderablePoints(const ghoul::Dictionary& dictionary)
}
if (p.colorMap.has_value()) {
_colorMapFile = absPath(*p.colorMap).string();
_colorMapFile = absPath(*p.colorMap);
_hasColorMapFile = true;
}
@@ -347,9 +347,9 @@ void RenderablePoints::update(const UpdateData&) {
absPath(_spriteTexturePath).string()
);
if (_spriteTexture) {
LDEBUG(fmt::format(
"Loaded texture from '{}'",absPath(_spriteTexturePath)
));
LDEBUG(
fmt::format("Loaded texture from {}", absPath(_spriteTexturePath))
);
_spriteTexture->uploadTexture();
}
_spriteTexture->setFilter(
@@ -369,7 +369,7 @@ void RenderablePoints::readColorMapFile() {
std::ifstream file(_colorMapFile);
if (!file.good()) {
throw ghoul::RuntimeError(fmt::format(
"Failed to open Color Map file '{}'", _colorMapFile
"Failed to open Color Map file {}", _colorMapFile
));
}
@@ -396,7 +396,7 @@ void RenderablePoints::readColorMapFile() {
}
else if (file.eof()) {
throw ghoul::RuntimeError(fmt::format(
"Failed to load colors from Color Map file '{}'", _colorMapFile
"Failed to load colors from Color Map file {}", _colorMapFile
));
}
}

View File

@@ -35,6 +35,7 @@
#include <openspace/properties/vector/vec3property.h>
#include <ghoul/opengl/ghoul_gl.h>
#include <ghoul/opengl/uniformcache.h>
#include <filesystem>
namespace ghoul::filesystem { class File; }
@@ -95,8 +96,8 @@ private:
spriteTexture, hasColorMap
) _uniformCache;
std::string _speckFile;
std::string _colorMapFile;
std::filesystem::path _speckFile;
std::filesystem::path _colorMapFile;
Unit _unit = Parsec;

View File

@@ -65,8 +65,7 @@ glm::vec3 computeStarColor(float bv) {
if (!colorMap.good()) {
LERROR(fmt::format(
"Failed to open colormap data file: '{}'",
absPath(bvColormapPath)
"Failed to open colormap data file: {}", absPath(bvColormapPath)
));
return glm::vec3(0.f);
}

View File

@@ -158,8 +158,8 @@ void createExoplanetSystem(const std::string& starName) {
const glm::vec3 starPosInParsec = system.starData.position;
if (!isValidPosition(starPosInParsec)) {
LERROR(fmt::format(
"Insufficient data available for exoplanet system: '{}'. "
"Could not determine star position", starName
"Insufficient data available for exoplanet system: '{}'. Could not determine "
"star position", starName
));
return;
}

View File

@@ -1013,22 +1013,9 @@ void RenderableFieldlinesSequence::render(const RenderData& data, RendererTasks&
bool additiveBlending = false;
if (_pColorABlendEnabled) {
const auto renderer = global::renderEngine->rendererImplementation();
const bool usingFBufferRenderer = renderer ==
RenderEngine::RendererImplementation::Framebuffer;
const bool usingABufferRenderer = renderer ==
RenderEngine::RendererImplementation::ABuffer;
if (usingABufferRenderer) {
_shaderProgram->setUniform("usingAdditiveBlending", _pColorABlendEnabled);
}
additiveBlending = usingFBufferRenderer;
if (additiveBlending) {
glDepthMask(false);
glBlendFunc(GL_SRC_ALPHA, GL_ONE);
}
additiveBlending = true;
glDepthMask(false);
glBlendFunc(GL_SRC_ALPHA, GL_ONE);
}
glBindVertexArray(_vertexArrayObject);

View File

@@ -316,8 +316,10 @@ void prepareStateAndKameleonForExtras(ccmc::Kameleon* kameleon,
(str == TAsPOverRho || str == "T" || str == "t"))
)
{
LDEBUG("BATSRUS doesn't contain variable T for temperature. Trying to "
"calculate it using the ideal gas law: T = pressure/density");
LDEBUG(
"BATSRUS doesn't contain variable T for temperature. Trying to calculate "
"it using the ideal gas law: T = pressure/density"
);
constexpr const char* p = "p";
constexpr const char* r = "rho";
success = kameleon->doesVariableExist(p) && kameleon->loadVariable(p) &&
@@ -325,9 +327,7 @@ void prepareStateAndKameleonForExtras(ccmc::Kameleon* kameleon,
str = TAsPOverRho;
}
if (!success) {
LWARNING(fmt::format(
"Failed to load extra variable: '{}'. Ignoring", str
));
LWARNING(fmt::format("Failed to load extra variable: '{}'. Ignoring", str));
extraScalarVars.erase(extraScalarVars.begin() + i);
--i;
}

View File

@@ -25,6 +25,7 @@
#ifndef __OPENSPACE_MODULE_FITSFILEREADER___FITSFILEREADER___H__
#define __OPENSPACE_MODULE_FITSFILEREADER___FITSFILEREADER___H__
#include <filesystem>
#include <string>
#include <memory>
#include <mutex>
@@ -63,7 +64,7 @@ public:
~FitsFileReader();
template<typename T>
std::shared_ptr<ImageData<T>> readImage(const std::string& path);
std::shared_ptr<ImageData<T>> readImage(const std::filesystem::path& path);
template<typename T>
std::shared_ptr<std::unordered_map<std::string, T>> readHeader(
@@ -78,7 +79,7 @@ public:
* If no HDU index is given the current Extension HDU will be read from.
*/
template<typename T>
std::shared_ptr<TableData<T>> readTable(std::string& path,
std::shared_ptr<TableData<T>> readTable(const std::filesystem::path& path,
const std::vector<std::string>& columnNames, int startRow = 1, int endRow = 10,
int hduIdx = 1, bool readAll = false);
@@ -88,7 +89,7 @@ public:
* If additional columns are given by <code>filterColumnNames</code>, they will be
read but it will slow down the reading tremendously.
*/
std::vector<float> readFitsFile(std::string filePath, int& nValuesPerStar,
std::vector<float> readFitsFile(std::filesystem::path filePath, int& nValuesPerStar,
int firstRow, int lastRow, std::vector<std::string> filterColumnNames,
int multiplier = 1);
@@ -96,7 +97,8 @@ public:
* Reads a single SPECK file and returns a vector with <code>nRenderValues</code>
* per star. Reads data in pre-defined order based on AMNH's star data files.
*/
std::vector<float> readSpeckFile(const std::string& filePath, int& nRenderValues);
std::vector<float> readSpeckFile(const std::filesystem::path& filePath,
int& nRenderValues);
private:
std::unique_ptr<CCfits::FITS> _infile;

View File

@@ -69,7 +69,8 @@ bool FitsFileReader::isPrimaryHDU() {
}
template <typename T>
std::shared_ptr<ImageData<T>> FitsFileReader::readImage(const std::string& path) {
std::shared_ptr<ImageData<T>> FitsFileReader::readImage(const std::filesystem::path& path)
{
try {
_infile = std::make_unique<FITS>(path, Read, true);
// Primary HDU Object
@@ -136,7 +137,7 @@ std::shared_ptr<T> FitsFileReader::readHeaderValue(const std::string key) {
}
template<typename T>
std::shared_ptr<TableData<T>> FitsFileReader::readTable(std::string& path,
std::shared_ptr<TableData<T>> FitsFileReader::readTable(const std::filesystem::path& path,
const std::vector<std::string>& columnNames,
int startRow,
int endRow,
@@ -148,7 +149,7 @@ std::shared_ptr<TableData<T>> FitsFileReader::readTable(std::string& path,
std::lock_guard g(_mutex);
try {
_infile = std::make_unique<FITS>(path, Read, readAll);
_infile = std::make_unique<FITS>(path.string(), Read, readAll);
// Make sure FITS file is not a Primary HDU Object (aka an image).
if (!isPrimaryHDU()) {
@@ -191,8 +192,9 @@ std::shared_ptr<TableData<T>> FitsFileReader::readTable(std::string& path,
return nullptr;
}
std::vector<float> FitsFileReader::readFitsFile(std::string filePath, int& nValuesPerStar,
int firstRow, int lastRow,
std::vector<float> FitsFileReader::readFitsFile(std::filesystem::path filePath,
int& nValuesPerStar, int firstRow,
int lastRow,
std::vector<std::string> filterColumnNames,
int multiplier)
{
@@ -245,7 +247,7 @@ std::vector<float> FitsFileReader::readFitsFile(std::string filePath, int& nValu
);
if (!table) {
throw ghoul::RuntimeError(fmt::format("Failed to open Fits file '{}'", filePath));
throw ghoul::RuntimeError(fmt::format("Failed to open Fits file {}", filePath));
}
int nStars = table->readRows - firstRow + 1;
@@ -520,7 +522,7 @@ std::vector<float> FitsFileReader::readFitsFile(std::string filePath, int& nValu
return fullData;
}
std::vector<float> FitsFileReader::readSpeckFile(const std::string& filePath,
std::vector<float> FitsFileReader::readSpeckFile(const std::filesystem::path& filePath,
int& nRenderValues)
{
std::vector<float> fullData;
@@ -528,7 +530,7 @@ std::vector<float> FitsFileReader::readSpeckFile(const std::string& filePath,
std::ifstream fileStream(filePath);
if (!fileStream.good()) {
LERROR(fmt::format("Failed to open Speck file '{}'", filePath));
LERROR(fmt::format("Failed to open Speck file {}", filePath));
return fullData;
}
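The FitsFileReader changes follow the migration pattern used throughout this diff: paths move from std::string to std::filesystem::path, with .string() applied only at the boundary of string-based APIs (such as the CCfits FITS constructor), and the quotes around '{}' placeholders dropped, presumably because the path formatter supplies its own quoting. A standalone sketch of the idiom (the helper names are hypothetical):

```cpp
#include <cassert>
#include <filesystem>
#include <string>

// Hypothetical helper showing the std::string -> std::filesystem::path
// pattern: take the path by const reference and convert with .string()
// only where a string-based API requires it
std::string toLegacyApiArgument(const std::filesystem::path& file) {
    // e.g. CCfits' FITS constructor takes a std::string, hence path.string()
    return file.string();
}

std::filesystem::path withPngExtension(std::filesystem::path file) {
    // path knows how to manipulate extensions, replacing manual string edits
    file.replace_extension(".png");
    return file;
}
```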

Some files were not shown because too many files have changed in this diff