Feature/textured points (#3068)

* WIP: Start using texture arrays instead of just a single texture

Now the texture array is successfully created, sent over, and sampled on the GPU

* Include information about the texture format alpha channel and do a conversion

* Make one draw call per texture array

* Add scale to size mapping and move to a separate component

* WIP: Make single textures work again, with texture array

Although this breaks the polygon cloud..

* Also make the polygon cloud work again

* Refactor rendering code

* Handle array layer separately from texture coordinates

* Make sure use size mapping uniform is always set

Fixes point cloud disappearing when multi-textured points are enabled

* Add has value check to size mapping

* Fix indentation

* Make sure points are rendered even when no texture is used

* Clean up texture handling a bit and add comment about storage creation

* Add comment and temporary asset changes

* Clean up handling of color mode (number of color channels)

* Make interpolated points work with new rendering code

* Refactor

* Bring back check for valid index for color and size data

* Make sure to check if the provided data file exists

* Fix full path not showing in error message

* Refactor rendering code a bit

* Change how the multitexture setup is configured in the asset and add documentation

Separating it out made the documentation a lot easier.

* Add a todo comment for future discussion

* Add settings for texture compression

* Preserve aspect ratio of rendered textures

* Restructure input parameters for texture details

* Simplify color mode - we decided not to support grayscale

* Add option to set "useAlpha" from asset

* Enable texture per default and fix aspect ratio problem when no texture is used

* Tiny refactor

* Fix polygon rendering that broke when adding texture compression

* Remove color in polygon shader

The color would be applied twice in rendering

* Restructure textures code and prevent loading the same texture twice

* Better handling of extra texture parameter in speck files

This way, it does not limit the use of dashes in texture names

* Add some docs and communicate texture mode to the user

* Fix so that single texture can be changed during runtime

* Allow changing compression and useAlpha during runtime

* Update texture storage allocation to something that works in older OpenGL versions

* Add a check for whether we use more texture layers than allowed

* Even more robust check of the texture line in speck files (allow extra whitespace)

* Update data mapping to include texture information and clean up code a bit

* Error handling and prevent loading unused textures in the texture map

* Update some docs

* Small cleanup

* Add one more error message for faulty texture map file format

* Remove test version of tully images dataset

* Small refactor

* Add example asset

* Update Ghoul - for larger uniform cache

* Purge texture from RAM when we're done with it

* Cleanup (comments, ugly png check, etc)

* Apply suggestions from code review

Co-authored-by: Alexander Bock <alexander.bock@liu.se>

* Apply suggestions from code review

* Address some more review comments and fix broken asset

* More code review fixes

* Read provided size mapping parameter from asset

* Fix warnings from trying to shift a 16-bit int by 32 bits :)

* Simplify DataMapping hash string

* Update comment that was not 100% correct. The file names may be specified as paths relative to a folder

* Small update based on previous code review comments

* Fix multi-textured points GUI path not being the same as for other points

* Update Folder description to reduce some confusion

* Apply suggestions from code review

Co-authored-by: Ylva Selling <ylva.selling@gmail.com>

* Prevent updates to polygon cloud texture during runtime

This led to rendering problems.

* Add describing comments to data files

* Clarify why speck version is disabled per default

* Update and clarify confusing size mapping parameters

* Apply suggestions from code review

Co-authored-by: Ylva Selling <ylva.selling@gmail.com>

* Apply suggestions from code review

---------

Co-authored-by: Alexander Bock <alexander.bock@liu.se>
Co-authored-by: Ylva Selling <ylva.selling@gmail.com>
Author: Emma Broman, committed by GitHub on 2024-03-19 13:17:25 +01:00
Parent 534f92c485, commit f36868d1c4
42 changed files with 1561 additions and 416 deletions
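
Before the per-file diff, here is a condensed sketch of the asset configuration this feature introduces, pulled from the example asset added later in this commit (the identifier, data file, and paths are those of the example and purely illustrative):

local MultiTexturedPoints = {
  Identifier = "TexturedPointCloudExample_CSV",
  Renderable = {
    Type = "RenderablePointCloud",
    File = asset.resource("data/textured_csv/textured_points.csv"),
    DataMapping = {
      -- Column in the CSV file that holds an integer texture index per point
      TextureColumn = "texture",
      -- File that maps each texture index to an image file name
      TextureMapFile = asset.resource("data/textured_csv/texturemap.tmap")
    },
    Texture = {
      -- Folder in which the image files listed in the .tmap file are located
      Folder = openspace.absPath("${DATA}")
    }
  },
  GUI = {
    Name = "Multi-Textured Points",
    Path = "/Example/Point Clouds/Multi-Textured"
  }
}

Speck-based datasets can instead carry the texture mapping inline (texturevar and texture lines), in which case no DataMapping entry is needed; see the speck example and the Test_Speck asset further down.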


@@ -1,3 +1,6 @@
# A dummy dataset with an xyz position and some random data columns.
# The last two columns have data values with either missing values or
# NaN values (which can be handled separately when color mapping)
x,y,z,a,b,normaldist_withMissing,number_withNan
13428000,26239000,45870000,-3.226548224,33.95773276,-0.357778948,29
14727000,45282000,10832000,45.05941924,-106.0395917,,29


@@ -1,3 +1,12 @@
# A test dataset for interpolation, where the xyz positions expand outward in each
# time step. There are 10 points per timestep, as illustrated by the time column.
# There are also two columns that may be used for color mapping:
#
# static_value has values that are the same for the corresponding point in each
# time step
# dynamic_value has values that change for every timestep. The change will be
# reflected in the interpolation
#
time,x,y,z,dynamic_value,static_value
0.0,675.0297905065192,1672.6820684730765,-124.14442820502654,1,1
0.0,9.0852354697237,1080.363474597831,266.4506394528842,3,3


@@ -1,3 +1,5 @@
# A test dataset for interpolation, where the xyz positions vary in a random walk pattern in each
# time step. There are 10 points per timestep, as illustrated by the time column
time,x,y,z
0.0,675.0297905065192,1672.6820684730765,-124.14442820502654
0.0,9.0852354697237,1080.363474597831,266.4506394528842


@@ -0,0 +1,9 @@
# A dummy dataset with an xyz position, some random values and an integer value to
# use for texturing the points
#
# The texture mapping from index to file is handled in another file
x,y,z,a,b,texture
13428000,26239000,45870000,-3.226548224,33.95773276,1
14727000,45282000,10832000,45.05941924,-106.0395917,0
24999000,28370000,19911000,-70.58906931,154.1851656,2
26539000,36165000,39582000,-13.3663358,71.79484733,3


@@ -0,0 +1,8 @@
# The texture map is a mapping between an index and the name of an image file.
# All the images should be located in the same folder, or the names need to be specified as paths relative
# to a specific folder
0 test3.jpg
1 test2.jpg
2 test.jpg
3 openspace-horiz-logo.png


@@ -0,0 +1,16 @@
# A dummy dataset with an xyz position, some random values and an integer value to
# use for texturing the points
datavar 0 a
datavar 1 b
datavar 2 texture
texturevar 2 # The index of the data column that has the texture data
texture 0 test3.jpg
texture 1 test.jpg
texture 2 test.jpg
texture 3 openspace-horiz-logo.png
13428000 26239000 45870000 -3.226548224 33.95773276 0
14727000 45282000 10832000 45.05941924 -106.0395917 2
24999000 28370000 19911000 -70.58906931 154.1851656 3
26539000 36165000 39582000 -13.3663358 71.79484733 1


@@ -0,0 +1,107 @@
-- CSV
local Test = {
Identifier = "TexturedPointCloudExample_CSV",
Renderable = {
Type = "RenderablePointCloud",
File = asset.resource("data/textured_csv/textured_points.csv"),
DataMapping = {
-- The name of the column in the CSV file that corresponds to the texture (should
-- be an integer)
TextureColumn = "texture",
-- A Texture mapping file that provides information about which value/index
-- corresponds to which texture file
TextureMapFile = asset.resource("data/textured_csv/texturemap.tmap")
},
Texture = {
-- Where to find the texture files (in this case, in the OpenSpace data folder)
Folder = openspace.absPath("${DATA}")
},
UseAdditiveBlending = false
},
GUI = {
Name = "Multi-Textured Points",
Path = "/Example/Point Clouds/Multi-Textured"
}
}
-- Interpolated
-- Multi-texturing works also for interpolated point clouds. Here we let the same
-- dataset as used above be interpreted as representing only two points, with a different
-- texture. Note that the textures will be set based on the first two data items and will
-- not be changed during interpolation
local Test_Interpolated = {
Identifier = "TexturedPointCloudExample_Interpolated",
Renderable = {
Type = "RenderableInterpolatedPoints",
File = asset.resource("data/textured_csv/textured_points.csv"),
NumberOfObjects = 2,
DataMapping = {
TextureColumn = "texture",
TextureMapFile = asset.resource("data/textured_csv/texturemap.tmap")
},
Texture = {
Folder = openspace.absPath("${DATA}")
},
UseAdditiveBlending = false
},
GUI = {
Name = "Multi-Textured Points (Interpolation)",
Path = "/Example/Point Clouds/Multi-Textured"
}
}
-- Speck file (allows storing all data in one single file, including the texture mapping)
-- Note that we disable this scene graph node per default here, as it shows the same
-- information as the CSV version
local Test_Speck = {
Identifier = "TexturedPointCloudExample_Speck",
Renderable = {
Type = "RenderablePointCloud",
Enabled = false,
-- When loading multi-texture information from a speck file, we do not need a
-- DataMapping entry - all information is in the file
File = asset.resource("data/textured_speck/textures_points.speck"),
Texture = {
-- However, we do still need to specify where the textures are located
Folder = openspace.absPath("${DATA}")
},
UseAdditiveBlending = false
},
GUI = {
Name = "Multi-Textured Points (Speck file)",
Path = "/Example/Point Clouds/Multi-Textured"
}
}
asset.onInitialize(function()
openspace.addSceneGraphNode(Test)
openspace.addSceneGraphNode(Test_Interpolated)
openspace.addSceneGraphNode(Test_Speck)
end)
asset.onDeinitialize(function()
openspace.removeSceneGraphNode(Test_Speck)
openspace.removeSceneGraphNode(Test_Interpolated)
openspace.removeSceneGraphNode(Test)
end)
asset.export(Test)
asset.export(Test_Interpolated)
asset.export(Test_Speck)
asset.meta = {
Name = "Multi-textured Points",
Version = "1.0",
Description = [[Example of point clouds where multiple textures are used for the points,
based on information in the dataset. The dataset may be either CSV or Speck format.
If CSV is used, additional information must be provided through the DataMapping:
1) which column in the dataset corresponds to the texture, and 2) a separate file
that maps that value to a texture file
]],
Author = "OpenSpace Team",
URL = "http://openspaceproject.com",
License = "MIT license"
}


@@ -100,13 +100,19 @@ local FixedColor_ScaleBasedOnData = {
FixedColor = { 0.5, 0.5, 0.0 }
},
SizeSettings = {
-- The options for the columns that the points can be scaled by. The first
-- alternative is chosen per default
SizeMapping = { "number_withNan", "a" },
SizeMapping = {
-- The options for the columns that the points can be scaled by. The first
-- alternative in the list is chosen per default
ParameterOptions = { "a", "b" },
-- Specify which option we want to use for size mapping at start up. Here we
-- use the last of the provided options rather than the first one, which is
-- otherwise used by default
Parameter = "b"
},
-- Use a slightly smaller scale than above for the base size of the points
-- (will decide the size of the smallest point). That way, the points don't
-- become too big when scaled by the data parameter
ScaleExponent = 5
ScaleExponent = 5.0
}
},
GUI = {
@@ -131,11 +137,13 @@ local Textured = {
Renderable = {
Type = "RenderablePointCloud",
File = asset.resource("data/dummydata.csv"),
-- The path to the texture file. Here we use openspace.absPath so that we can use
-- the ${DATA} token to get the path to a texture in the "OpenSpace/data" folder,
-- but for a file at a relative location it would also work to use asset.resource,
-- like for the data file above
Texture = openspace.absPath("${DATA}/test3.jpg"),
Texture = {
-- The path to the texture file. Here we use openspace.absPath so that we can use
-- the ${DATA} token to get the path to a texture in the "OpenSpace/data" folder,
-- but for a file at a relative location it would also work to use asset.resource,
-- like for the data file above
File = openspace.absPath("${DATA}/test3.jpg"),
},
-- Disable additive blending, so that points will be rendered with their actual color
-- and overlapping points will be sorted by depth. This works best when the points
-- have an opacity of 1


@@ -21,7 +21,9 @@ local Object = {
Opacity = 1.0,
File = speck .. "2dF.speck",
Unit = "Mpc",
Texture = textures .. "point3A.png",
Texture = {
File = textures .. "point3A.png",
},
Coloring = {
ColorMapping = {
File = speck .. "2dF.cmap",


@@ -21,7 +21,9 @@ local Object = {
Opacity = 1.0,
File = speck .. "2MASS.speck",
Unit = "Mpc",
Texture = textures .. "point3A.png",
Texture = {
File = textures .. "point3A.png",
},
Coloring = {
FixedColor = { 1.0, 0.4, 0.2 },
ColorMapping = {


@@ -21,7 +21,9 @@ local Object = {
Opacity = 1.0,
File = speck .. "6dF.speck",
Unit = "Mpc",
Texture = textures .. "point3A.png",
Texture = {
File = textures .. "point3A.png",
},
Coloring = {
FixedColor = { 1.0, 1.0, 0.0 },
ColorMapping = {


@@ -25,6 +25,7 @@ local Object = {
Renderable = {
Type = "RenderablePointCloud",
Enabled = false,
File = speck .. "abell.speck",
Labels = {
File = speck .. "abell.label",
Opacity = 1.0,
@@ -39,8 +40,9 @@ local Object = {
FixedColor = { 1.0, 0.4, 0.2 },
--ColorMap = speck .. "abell.cmap", -- TODO: Decide whether to add
},
File = speck .. "abell.speck",
Texture = textures .. "point3A.png",
Texture = {
File = textures .. "point3A.png",
},
Unit = "Mpc",
TransformationMatrix = TransformMatrix,
SizeSettings = {


@@ -18,6 +18,7 @@ local DeepSkyObjects = {
Renderable = {
Type = "RenderablePointCloud",
Enabled = false,
File = speck .. "dso.speck",
Labels = {
File = speck .. "dso.label",
Color = { 0.1, 0.4, 0.6 },
@@ -29,8 +30,9 @@ local DeepSkyObjects = {
Coloring = {
FixedColor = { 1.0, 1.0, 0.0 }
},
File = speck .. "dso.speck",
Texture = textures .. "point3.png",
Texture = {
File = textures .. "point3.png",
},
Unit = "pc",
--FadeInDistances = { 0.05, 1.0 }, -- Fade in value in the same unit as "Unit"
SizeSettings = {


@@ -18,6 +18,7 @@ local Object = {
Renderable = {
Type = "RenderablePointCloud",
Enabled = false,
File = speck .. "dwarfs.speck",
Labels = {
File = speck .. "dwarfs.label",
Color = { 0.5, 0.1, 0.2 },
@@ -26,8 +27,9 @@ local Object = {
Unit = "pc"
},
Opacity = 1.0,
File = speck .. "dwarfs.speck",
Texture = textures .. "point3.png",
Texture = {
File = textures .. "point3.png",
},
Unit = "pc",
Coloring = {
FixedColor = { 0.4, 0.0, 0.1 },


@@ -18,6 +18,7 @@ local Object = {
Renderable = {
Type = "RenderablePointCloud",
Enabled = false,
File = speck .. "expl.speck",
Labels = {
File = speck .. "expl.label",
Color = { 0.3, 0.3, 0.8 },
@@ -26,8 +27,9 @@ local Object = {
Unit = "pc"
},
Opacity = 1.0,
Texture = textures .. "target-blue.png",
File = speck .. "expl.speck",
Texture = {
File = textures .. "target-blue.png",
},
Unit = "pc",
SizeSettings = {
ScaleExponent = 16.9,


@@ -21,7 +21,9 @@ local Object = {
Opacity = 0.99,
File = speck .. "exoplanet_candidates.speck",
Unit = "pc",
Texture = textures .. "halo.png",
Texture = {
File = textures .. "halo.png",
},
Coloring = {
FixedColor = { 1.0, 1.0, 0.0 }
},


@@ -27,7 +27,9 @@ local Object = {
Enabled = false,
Opacity = 1.0,
File = HUDFSpeck .. "hudf.speck",
Texture = circle .. "circle.png",
Texture = {
File = circle .. "circle.png",
},
Coloring = {
ColorMapping = {
File = ColorMap .. "hudf.cmap",


@@ -36,10 +36,11 @@ local Object = {
Opacity = 0.7,
File = speck .. "ob.speck",
Unit = "pc",
Texture = textures .. "point4.png",
PolygonSides = 7,
SizeSettings = {
SizeMapping = { "diameter" },
SizeMapping = {
ParameterOptions = { "diameter" }
},
ScaleExponent = 16.9,
MaxSize = 17,
EnableMaxSizeControl = true


@@ -27,7 +27,9 @@ local Object = {
Enabled = true,
Opacity = 0.95,
File = speck .. "quasars.speck",
Texture = textures .. "point3A.png",
Texture = {
File = textures .. "point3A.png",
},
Unit = "Mpc",
Fading = {
FadeInDistances = { 1000.0, 10000.0 } -- Fade in value in the same unit as "Unit"


@@ -30,7 +30,9 @@ local Object = {
}
}
},
Texture = textures .. "point3A.png",
Texture = {
File = textures .. "point3A.png",
},
Unit = "Mpc",
Fading = {
FadeInDistances = { 220.0, 650.0 } -- Fade in value in the same unit as "Unit"


@@ -18,6 +18,7 @@ local Object = {
Renderable = {
Type = "RenderablePointCloud",
Enabled = false,
File = speck .. "superclust.speck",
Labels = {
Enabled = true,
File = speck .. "superclust.label",
@@ -28,8 +29,9 @@ local Object = {
},
DrawElements = false,
Opacity = 0.65,
File = speck .. "superclust.speck",
Texture = textures .. "point3A.png",
Texture = {
File = textures .. "point3A.png",
},
Unit = "Mpc",
SizeSettings = {
ScaleExponent = 23.1,


@@ -35,7 +35,9 @@ local TullyGalaxies = {
},
Opacity = 0.99,
File = speck .. "tully.speck",
Texture = textures .. "point3A.png",
Texture = {
File = textures .. "point3A.png"
},
Coloring = {
FixedColor = { 1.0, 0.4, 0.2 },
ColorMapping = {


@@ -34,6 +34,9 @@ namespace openspace::dataloader::csv {
Dataset loadCsvFile(std::filesystem::path path,
std::optional<DataMapping> specs = std::nullopt);
std::vector<Dataset::Texture> loadTextureMapFile(std::filesystem::path path,
const std::set<int>& texturesInData);
} // namespace openspace::dataloader
#endif // __OPENSPACE_CORE___CSVLOADER___H__


@@ -41,10 +41,15 @@ struct DataMapping {
bool hasExcludeColumns() const;
bool isExcludeColumn(std::string_view column) const;
bool checkIfAllProvidedColumnsExist(const std::vector<std::string>& columns) const;
std::optional<std::string> xColumnName;
std::optional<std::string> yColumnName;
std::optional<std::string> zColumnName;
std::optional<std::string> nameColumn;
std::optional<std::string> textureColumn;
std::optional<std::filesystem::path> textureMap;
std::optional<float> missingDataValue;
@@ -72,6 +77,8 @@ bool isColumnZ(const std::string& c, const std::optional<DataMapping>& mapping);
bool isNameColumn(const std::string& c, const std::optional<DataMapping>& mapping);
bool isTextureColumn(const std::string& c, const std::optional<DataMapping>& mapping);
} // namespace openspace::dataloader
#endif // __OPENSPACE_CORE___DATAMAPPING___H__


@@ -47,6 +47,7 @@ set(HEADER_FILES
rendering/pointcloud/renderableinterpolatedpoints.h
rendering/pointcloud/renderablepointcloud.h
rendering/pointcloud/renderablepolygoncloud.h
rendering/pointcloud/sizemappingcomponent.h
rendering/renderablecartesianaxes.h
rendering/renderabledisc.h
rendering/renderablelabel.h
@@ -109,6 +110,7 @@ set(SOURCE_FILES
rendering/pointcloud/renderableinterpolatedpoints.cpp
rendering/pointcloud/renderablepointcloud.cpp
rendering/pointcloud/renderablepolygoncloud.cpp
rendering/pointcloud/sizemappingcomponent.cpp
rendering/renderablecartesianaxes.cpp
rendering/renderabledisc.cpp
rendering/renderablelabel.cpp


@@ -46,6 +46,7 @@
#include <modules/base/rendering/pointcloud/renderableinterpolatedpoints.h>
#include <modules/base/rendering/pointcloud/renderablepointcloud.h>
#include <modules/base/rendering/pointcloud/renderablepolygoncloud.h>
#include <modules/base/rendering/pointcloud/sizemappingcomponent.h>
#include <modules/base/rendering/renderablecartesianaxes.h>
#include <modules/base/rendering/renderabledisc.h>
#include <modules/base/rendering/renderablelabel.h>
@@ -251,6 +252,8 @@ std::vector<documentation::Documentation> BaseModule::documentations() const {
RenderableTrailOrbit::Documentation(),
RenderableTrailTrajectory::Documentation(),
SizeMappingComponent::Documentation(),
ScreenSpaceDashboard::Documentation(),
ScreenSpaceFramebuffer::Documentation(),
ScreenSpaceImageLocal::Documentation(),


@@ -130,6 +130,10 @@ namespace {
// the first set of positions for the objects, the next N rows to the second set of
// positions, and so on. The number of objects in the dataset must be specified in the
// asset.
//
// MultiTexture:
// Note that if using multiple textures for the points based on values in the dataset,
// the texture that is used will be decided based on the first set of N points.
struct [[codegen::Dictionary(RenderableInterpolatedPoints)]] Parameters {
// The number of objects to read from the dataset. Every N:th datapoint will
// be interpreted as the same point, but at a different step in the interpolation
@@ -328,13 +332,12 @@ void RenderableInterpolatedPoints::deinitializeShaders() {
_program = nullptr;
}
void RenderableInterpolatedPoints::bindDataForPointRendering() {
RenderablePointCloud::bindDataForPointRendering();
void RenderableInterpolatedPoints::setExtraUniforms() {
float t0 = computeCurrentLowerValue();
float t = glm::clamp(_interpolation.value - t0, 0.f, 1.f);
_program->setUniform("interpolationValue", t);
_program->setUniform("useSpline", _interpolation.useSpline);
_program->setUniform("useSpline", useSplineInterpolation());
}
void RenderableInterpolatedPoints::preUpdate() {
@@ -346,103 +349,96 @@ void RenderableInterpolatedPoints::preUpdate() {
int RenderableInterpolatedPoints::nAttributesPerPoint() const {
int n = RenderablePointCloud::nAttributesPerPoint();
// Need twice as much information as the regular points
n *= 2;
if (_interpolation.useSpline) {
// Always at least three extra position values (xyz)
n += 3;
if (useSplineInterpolation()) {
// Use two more positions (xyz)
n += 2 * 3;
}
// And potentially some more color and size data
n += _hasColorMapFile ? 1 : 0;
n += _hasDatavarSize ? 1 : 0;
return n;
}
std::vector<float> RenderableInterpolatedPoints::createDataSlice() {
ZoneScoped;
bool RenderableInterpolatedPoints::useSplineInterpolation() const {
return _interpolation.useSpline && _interpolation.nSteps > 1;
}
if (_dataset.entries.empty()) {
return std::vector<float>();
void RenderableInterpolatedPoints::addPositionDataForPoint(unsigned int index,
std::vector<float>& result,
double& maxRadius) const
{
using namespace dataloader;
auto [firstIndex, secondIndex] = interpolationIndices(index);
const Dataset::Entry& e0 = _dataset.entries[firstIndex];
const Dataset::Entry& e1 = _dataset.entries[secondIndex];
glm::dvec3 position0 = transformedPosition(e0);
glm::dvec3 position1 = transformedPosition(e1);
const double r = glm::max(glm::length(position0), glm::length(position1));
maxRadius = glm::max(maxRadius, r);
for (int j = 0; j < 3; ++j) {
result.push_back(static_cast<float>(position0[j]));
}
std::vector<float> result;
result.reserve(nAttributesPerPoint() * _nDataPoints);
for (int j = 0; j < 3; ++j) {
result.push_back(static_cast<float>(position1[j]));
}
// Find the information we need for the interpolation and to identify the points,
// and make sure these result in valid indices in all cases
float t0 = computeCurrentLowerValue();
float t1 = t0 + 1.f;
t1 = glm::clamp(t1, 0.f, _interpolation.value.maxValue());
unsigned int t0Index = static_cast<unsigned int>(t0);
unsigned int t1Index = static_cast<unsigned int>(t1);
if (useSplineInterpolation()) {
// Compute the extra positions, before and after the other ones. But make sure
// we do not overflow the allowed bound for the current interpolation step
int beforeIndex = glm::max(static_cast<int>(firstIndex - _nDataPoints), 0);
int maxT = static_cast<int>(_interpolation.value.maxValue() - 1.f);
int maxAllowedindex = maxT * _nDataPoints + index;
int afterIndex = glm::min(
static_cast<int>(secondIndex + _nDataPoints),
maxAllowedindex
);
const Dataset::Entry& e00 = _dataset.entries[beforeIndex];
const Dataset::Entry& e11 = _dataset.entries[afterIndex];
glm::dvec3 positionBefore = transformedPosition(e00);
glm::dvec3 positionAfter = transformedPosition(e11);
for (int j = 0; j < 3; ++j) {
result.push_back(static_cast<float>(positionBefore[j]));
}
for (int j = 0; j < 3; ++j) {
result.push_back(static_cast<float>(positionAfter[j]));
}
}
}
void RenderableInterpolatedPoints::addColorAndSizeDataForPoint(unsigned int index,
std::vector<float>& result) const
{
using namespace dataloader;
auto [firstIndex, secondIndex] = interpolationIndices(index);
const Dataset::Entry& e0 = _dataset.entries[firstIndex];
const Dataset::Entry& e1 = _dataset.entries[secondIndex];
// What datavar is in use for the index color
int colorParamIndex = currentColorParameterIndex();
// What datavar is in use for the size scaling (if present)
int sizeParamIndex = currentSizeParameterIndex();
double maxRadius = 0.0;
for (unsigned int i = 0; i < _nDataPoints; i++) {
using namespace dataloader;
const Dataset::Entry& e0 = _dataset.entries[t0Index * _nDataPoints + i];
const Dataset::Entry& e1 = _dataset.entries[t1Index * _nDataPoints + i];
glm::dvec3 position0 = transformedPosition(e0);
glm::dvec3 position1 = transformedPosition(e1);
const double r = glm::max(glm::length(position0), glm::length(position1));
maxRadius = glm::max(maxRadius, r);
// Positions
for (int j = 0; j < 3; j++) {
result.push_back(static_cast<float>(position0[j]));
}
for (int j = 0; j < 3; j++) {
result.push_back(static_cast<float>(position1[j]));
}
if (_interpolation.useSpline && _interpolation.nSteps > 1) {
// Compute the extra positions, before and after the other ones
unsigned int beforeIndex = static_cast<unsigned int>(
glm::max(t0 - 1.f, 0.f)
);
unsigned int afterIndex = static_cast<unsigned int>(
glm::min(t1 + 1.f, _interpolation.value.maxValue() - 1.f)
);
const Dataset::Entry& e00 = _dataset.entries[beforeIndex * _nDataPoints + i];
const Dataset::Entry& e11 = _dataset.entries[afterIndex * _nDataPoints + i];
glm::dvec3 positionBefore = transformedPosition(e00);
glm::dvec3 positionAfter = transformedPosition(e11);
for (int j = 0; j < 3; j++) {
result.push_back(static_cast<float>(positionBefore[j]));
}
for (int j = 0; j < 3; j++) {
result.push_back(static_cast<float>(positionAfter[j]));
}
}
// Colors
if (_hasColorMapFile) {
result.push_back(e0.data[colorParamIndex]);
result.push_back(e1.data[colorParamIndex]);
}
// Size data
if (_hasDatavarSize) {
// @TODO: Consider more detailed control over the scaling. Currently the value
// is multiplied with the value as is. Should have similar mapping properties
// as the color mapping
result.push_back(e0.data[sizeParamIndex]);
result.push_back(e1.data[sizeParamIndex]);
}
// @TODO: Also need to update label positions, if we have created labels from the dataset
// And make sure these are created from only the first set of points..
if (_hasColorMapFile && colorParamIndex >= 0) {
result.push_back(e0.data[colorParamIndex]);
result.push_back(e1.data[colorParamIndex]);
}
int sizeParamIndex = currentSizeParameterIndex();
if (_hasDatavarSize && sizeParamIndex >= 0) {
// @TODO: Consider more detailed control over the scaling. Currently the value
// is multiplied with the value as is. Should have similar mapping properties
// as the color mapping
result.push_back(e0.data[sizeParamIndex]);
result.push_back(e1.data[sizeParamIndex]);
}
setBoundingSphere(maxRadius);
return result;
}
void RenderableInterpolatedPoints::initializeBufferData() {
@@ -463,40 +459,28 @@ void RenderableInterpolatedPoints::initializeBufferData() {
glBindBuffer(GL_ARRAY_BUFFER, _vbo);
glBufferData(GL_ARRAY_BUFFER, bufferSize, nullptr, GL_DYNAMIC_DRAW);
int attributeOffset = 0;
int offset = 0;
auto addFloatAttribute = [&](const std::string& name, GLint nValues) {
GLint attrib = _program->attributeLocation(name);
glEnableVertexAttribArray(attrib);
glVertexAttribPointer(
attrib,
nValues,
GL_FLOAT,
GL_FALSE,
attibutesPerPoint * sizeof(float),
(attributeOffset > 0) ?
reinterpret_cast<void*>(attributeOffset * sizeof(float)) :
nullptr
);
attributeOffset += nValues;
};
offset = bufferVertexAttribute("in_position0", 3, attibutesPerPoint, offset);
offset = bufferVertexAttribute("in_position1", 3, attibutesPerPoint, offset);
addFloatAttribute("in_position0", 3);
addFloatAttribute("in_position1", 3);
if (_interpolation.useSpline) {
addFloatAttribute("in_position_before", 3);
addFloatAttribute("in_position_after", 3);
if (useSplineInterpolation()) {
offset = bufferVertexAttribute("in_position_before", 3, attibutesPerPoint, offset);
offset = bufferVertexAttribute("in_position_after", 3, attibutesPerPoint, offset);
}
if (_hasColorMapFile) {
addFloatAttribute("in_colorParameter0", 1);
addFloatAttribute("in_colorParameter1", 1);
offset = bufferVertexAttribute("in_colorParameter0", 1, attibutesPerPoint, offset);
offset = bufferVertexAttribute("in_colorParameter1", 1, attibutesPerPoint, offset);
}
if (_hasDatavarSize) {
addFloatAttribute("in_scalingParameter0", 1);
addFloatAttribute("in_scalingParameter1", 1);
offset = bufferVertexAttribute("in_scalingParameter0", 1, attibutesPerPoint, offset);
offset = bufferVertexAttribute("in_scalingParameter1", 1, attibutesPerPoint, offset);
}
if (_hasSpriteTexture) {
offset = bufferVertexAttribute("in_textureLayer", 1, attibutesPerPoint, offset);
}
glBindVertexArray(0);
@@ -541,4 +525,25 @@ float RenderableInterpolatedPoints::computeCurrentLowerValue() const {
return t0;
}
float RenderableInterpolatedPoints::computeCurrentUpperValue() const {
float t0 = computeCurrentLowerValue();
float t1 = t0 + 1.f;
t1 = glm::clamp(t1, 0.f, _interpolation.value.maxValue());
return t1;
}
std::pair<size_t, size_t>
RenderableInterpolatedPoints::interpolationIndices(unsigned int index) const
{
float t0 = computeCurrentLowerValue();
float t1 = computeCurrentUpperValue();
unsigned int t0Index = static_cast<unsigned int>(t0);
unsigned int t1Index = static_cast<unsigned int>(t1);
size_t lower = size_t(t0Index * _nDataPoints + index);
size_t upper = size_t(t1Index * _nDataPoints + index);
return { lower, upper };
}
} // namespace openspace


@@ -52,20 +52,34 @@ public:
protected:
void initializeShadersAndGlExtras() override;
void deinitializeShaders() override;
void bindDataForPointRendering() override;
void setExtraUniforms() override;
void preUpdate() override;
int nAttributesPerPoint() const override;
bool useSplineInterpolation() const;
/**
* Create the data slice to use for rendering the points. Compared to the regular
* point cloud, the data slice for an interpolated set of points will have to be
* recreated when the interpolation value changes, and will only include a subset of
* the points in the entire dataset
* Create the rendering data for the positions for the point with the given index
* and append that to the result. Compared to the base class, this class may require
* 2-4 positions, depending on if spline interpolation is used or not.
*
* \return The dataslice to use for rendering the points
* The values are computed based on the current interpolation value.
*
* Also, compute the maxRadius to use for setting the bounding sphere.
*/
std::vector<float> createDataSlice() override;
void addPositionDataForPoint(unsigned int index, std::vector<float>& result,
double& maxRadius) const override;
/**
* Create the rendering data for the color and size data for the point with the given
* index and append that to the result. Compared to the base class, this class requires
* 2 values per data value, to use for interpolation.
*
* The values are computed based on the current interpolation value.
*/
void addColorAndSizeDataForPoint(unsigned int index,
std::vector<float>& result) const override;
void initializeBufferData();
void updateBufferData() override;
@@ -73,6 +87,8 @@ protected:
private:
bool isAtKnot() const;
float computeCurrentLowerValue() const;
float computeCurrentUpperValue() const;
std::pair<size_t, size_t> interpolationIndices(unsigned int index) const;
struct Interpolation : public properties::PropertyOwner {
Interpolation();

File diff suppressed because it is too large.


@@ -27,6 +27,7 @@
#include <openspace/rendering/renderable.h>
#include <modules/base/rendering/pointcloud/sizemappingcomponent.h>
#include <openspace/properties/optionproperty.h>
#include <openspace/properties/stringproperty.h>
#include <openspace/properties/triggerproperty.h>
@@ -52,6 +53,16 @@ namespace openspace {
namespace documentation { struct Documentation; }
struct TextureFormat {
glm::uvec2 resolution;
bool useAlpha = false;
friend bool operator==(const TextureFormat& l, const TextureFormat& r);
};
struct TextureFormatHash {
size_t operator()(const TextureFormat& k) const;
};
/**
* This class describes a point cloud renderable that can be used to draw billboarded
* points based on a data file with 3D positions. Alternatively the points can also
@@ -74,15 +85,30 @@ public:
static documentation::Documentation Documentation();
protected:
enum class TextureInputMode {
Single = 0,
Multi,
Other // For subclasses that need to handle their own texture
};
virtual void initializeShadersAndGlExtras();
virtual void deinitializeShaders();
virtual void bindDataForPointRendering();
virtual void setExtraUniforms();
virtual void preUpdate();
glm::dvec3 transformedPosition(const dataloader::Dataset::Entry& e) const;
virtual int nAttributesPerPoint() const;
/**
* Helper function to buffer the vertex attribute with the given name and number
* of values. Assumes that the value is a float value.
*
* Returns the updated offset after this attribute is added
*/
int bufferVertexAttribute(const std::string& name, GLint nValues,
int nAttributesPerPoint, int offset) const;
virtual void updateBufferData();
void updateSpriteTexture();
@@ -91,17 +117,42 @@ protected:
/// Find the index of the currently chosen size parameter in the dataset
int currentSizeParameterIndex() const;
virtual std::vector<float> createDataSlice();
virtual void addPositionDataForPoint(unsigned int index, std::vector<float>& result,
double& maxRadius) const;
virtual void addColorAndSizeDataForPoint(unsigned int index,
std::vector<float>& result) const;
virtual void bindTextureForRendering() const;
std::vector<float> createDataSlice();
/**
* A function that subclasses could override to initialize their own textures to
* use for rendering, when the `_textureMode` is set to Other
*/
virtual void initializeCustomTexture();
void initializeSingleTexture();
void initializeMultiTextures();
void clearTextureDataStructures();
void loadTexture(const std::filesystem::path& path, int index);
void initAndAllocateTextureArray(unsigned int textureId,
glm::uvec2 resolution, size_t nLayers, bool useAlpha);
void fillAndUploadTextureLayer(unsigned int arrayindex, unsigned int layer,
size_t textureIndex, glm::uvec2 resolution, bool useAlpha, const void* pixelData);
void generateArrayTextures();
float computeDistanceFadeValue(const RenderData& data) const;
void renderBillboards(const RenderData& data, const glm::dmat4& modelMatrix,
const glm::dvec3& orthoRight, const glm::dvec3& orthoUp, float fadeInVariable);
gl::GLenum internalGlFormat(bool useAlpha) const;
ghoul::opengl::Texture::Format glFormat(bool useAlpha) const;
bool _dataIsDirty = true;
bool _spriteTextureIsDirty = true;
bool _spriteTextureIsDirty = false;
bool _cmapIsDirty = true;
bool _hasSpriteTexture = false;
@@ -113,12 +164,7 @@ protected:
struct SizeSettings : properties::PropertyOwner {
explicit SizeSettings(const ghoul::Dictionary& dictionary);
struct SizeMapping : properties::PropertyOwner {
SizeMapping();
properties::BoolProperty enabled;
properties::OptionProperty parameterOption;
};
SizeMapping sizeMapping;
std::unique_ptr<SizeMappingComponent> sizeMapping;
properties::FloatProperty scaleExponent;
properties::FloatProperty scaleFactor;
@@ -146,9 +192,6 @@ protected:
};
Fading _fading;
properties::BoolProperty _useSpriteTexture;
properties::StringProperty _spriteTexturePath;
properties::BoolProperty _useAdditiveBlending;
properties::BoolProperty _drawElements;
@@ -156,7 +199,18 @@ protected:
properties::UIntProperty _nDataPoints;
ghoul::opengl::Texture* _spriteTexture = nullptr;
struct Texture : properties::PropertyOwner {
Texture();
properties::BoolProperty enabled;
properties::BoolProperty allowCompression;
properties::BoolProperty useAlphaChannel;
properties::StringProperty spriteTexturePath;
properties::StringProperty inputMode;
};
Texture _texture;
TextureInputMode _textureMode = TextureInputMode::Single;
std::filesystem::path _texturesDirectory;
ghoul::opengl::ProgramObject* _program = nullptr;
UniformCache(
@@ -165,7 +219,8 @@ protected:
right, fadeInValue, hasSpriteTexture, spriteTexture, useColormap, colorMapTexture,
cmapRangeMin, cmapRangeMax, nanColor, useNanColor, hideOutsideRange,
enableMaxSizeControl, aboveRangeColor, useAboveRangeColor, belowRangeColor,
useBelowRangeColor, hasDvarScaling, enableOutline, outlineColor, outlineWeight
useBelowRangeColor, hasDvarScaling, dvarScaleFactor, enableOutline, outlineColor,
outlineWeight, aspectRatioScale
) _uniformCache;
std::string _dataFile;
@@ -181,6 +236,33 @@ protected:
GLuint _vao = 0;
GLuint _vbo = 0;
// List of (unique) loaded textures. The other maps refer to the index in this vector
std::vector<std::unique_ptr<ghoul::opengl::Texture>> _textures;
std::unordered_map<std::string, size_t> _textureNameToIndex;
// Texture index in dataset to index in vector of textures
std::unordered_map<int, size_t> _indexInDataToTextureIndex;
// Resolution/format to index in textures vector (used to generate one texture
// array per unique format)
std::unordered_map<TextureFormat, std::vector<size_t>, TextureFormatHash>
_textureMapByFormat;
// One per resolution above
struct TextureArrayInfo {
GLuint renderId;
GLint startOffset = -1;
int nPoints = -1;
glm::vec2 aspectRatioScale = glm::vec2(1.f);
};
std::vector<TextureArrayInfo> _textureArrays;
struct TextureId {
unsigned int arrayId;
unsigned int layer;
};
std::unordered_map<size_t, TextureId> _textureIndexToArrayMap;
};
} // namespace openspace
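
The Texture property owner introduced above replaces the former flat _useSpriteTexture / _spriteTexturePath members. A hedged sketch of how this plausibly surfaces in an asset follows; the File and Folder keys are confirmed by the asset changes elsewhere in this commit, whereas AllowCompression and UseAlphaChannel are assumptions inferred from the property names (the exact dictionary keys are not visible in this diff):

Texture = {
  -- Single-texture case, as in the migrated digital universe assets
  File = openspace.absPath("${DATA}/test3.jpg"),
  -- For multi-textured points, a folder of images is given instead:
  -- Folder = openspace.absPath("${DATA}"),

  -- Assumed keys, mirroring the allowCompression / useAlphaChannel properties
  AllowCompression = true,
  UseAlphaChannel = false
}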


@@ -38,8 +38,9 @@ namespace {
// A RenderablePolygonCloud is a RenderablePointCloud where the shape of the points
// is a uniform polygon with a given number of sides instead of a texture. For
// instance, PolygonSides = 5 results in the points being rendered as pentagons.
// Note that while this renderable inherits the texture property from
// RenderablePointCloud, any added texture value will be ignored in favor of the
//
// Note that while this renderable inherits the texture component from
// RenderablePointCloud, any added texture information will be ignored in favor of the
// polygon shape.
//
// See documentation of RenderablePointCloud for details on the other parts of the
@@ -70,15 +71,11 @@ RenderablePolygonCloud::RenderablePolygonCloud(const ghoul::Dictionary& dictiona
_nPolygonSides = p.polygonSides.value_or(_nPolygonSides);
// The texture to use for the rendering will be generated in initializeGl. Make sure
// we use it in the rnedering
// we use it in the rendering
_hasSpriteTexture = true;
}
void RenderablePolygonCloud::initializeGL() {
ZoneScoped;
RenderablePointCloud::initializeGL();
createPolygonTexture();
_textureMode = TextureInputMode::Other;
removePropertySubOwner(_texture);
}
void RenderablePolygonCloud::deinitializeGL() {
@@ -92,16 +89,24 @@ void RenderablePolygonCloud::deinitializeGL() {
RenderablePointCloud::deinitializeGL();
}
void RenderablePolygonCloud::bindTextureForRendering() const {
glBindTexture(GL_TEXTURE_2D, _pTexture);
}
void RenderablePolygonCloud::createPolygonTexture() {
void RenderablePolygonCloud::initializeCustomTexture() {
ZoneScoped;
if (_textureIsInitialized) {
LWARNING("RenderablePolygonCloud texture cannot be updated during runtime");
return;
}
LDEBUG("Creating Polygon Texture");
constexpr gl::GLsizei TexSize = 512;
// We don't use the helper function for the format and internal format here,
// as we don't want the compression to be used for the polygon texture and we
// always want alpha. This is also why we do not need to update the texture
bool useAlpha = true;
gl::GLenum format = gl::GLenum(glFormat(useAlpha));
gl::GLenum internalFormat = GL_RGBA8;
glGenTextures(1, &_pTexture);
glBindTexture(GL_TEXTURE_2D, _pTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
@@ -113,16 +118,35 @@ void RenderablePolygonCloud::createPolygonTexture() {
glTexImage2D(
GL_TEXTURE_2D,
0,
GL_RGBA8,
internalFormat,
TexSize,
TexSize,
0,
GL_RGBA,
GL_BYTE,
format,
GL_UNSIGNED_BYTE,
nullptr
);
renderToTexture(_pTexture, TexSize, TexSize);
// Download the data and use it to initialize the data we need for rendering.
// Allocate memory: N channels, with one byte each
constexpr unsigned int nChannels = 4;
unsigned int arraySize = TexSize * TexSize * nChannels;
std::vector<GLubyte> pixelData;
pixelData.resize(arraySize);
glBindTexture(GL_TEXTURE_2D, _pTexture);
glGetTexImage(GL_TEXTURE_2D, 0, format, GL_UNSIGNED_BYTE, pixelData.data());
// Create array from data, size and format
unsigned int id = 0;
glGenTextures(1, &id);
glBindTexture(GL_TEXTURE_2D_ARRAY, id);
initAndAllocateTextureArray(id, glm::uvec2(TexSize), 1, useAlpha);
fillAndUploadTextureLayer(0, 0, 0, glm::uvec2(TexSize), useAlpha, pixelData.data());
glBindTexture(GL_TEXTURE_2D_ARRAY, 0);
_textureIsInitialized = true;
}
void RenderablePolygonCloud::renderToTexture(GLuint textureToRenderTo,
@@ -191,7 +215,6 @@ void RenderablePolygonCloud::renderPolygonGeometry(GLuint vao) {
glClearBufferfv(GL_COLOR, 0, glm::value_ptr(Black));
program->setUniform("sides", _nPolygonSides);
program->setUniform("polygonColor", _colorSettings.pointColor);
glBindVertexArray(vao);
glDrawArrays(GL_POINTS, 0, 1);


@@ -44,25 +44,24 @@ public:
explicit RenderablePolygonCloud(const ghoul::Dictionary& dictionary);
~RenderablePolygonCloud() override = default;
void initializeGL() override;
void deinitializeGL() override;
static documentation::Documentation Documentation();
private:
void createPolygonTexture();
void initializeCustomTexture() override;
void renderToTexture(GLuint textureToRenderTo, GLuint textureWidth,
GLuint textureHeight);
void renderPolygonGeometry(GLuint vao);
void bindTextureForRendering() const override;
int _nPolygonSides = 3;
GLuint _pTexture = 0;
GLuint _polygonVao = 0;
GLuint _polygonVbo = 0;
bool _textureIsInitialized = false;
};
} // namespace openspace


@@ -0,0 +1,131 @@
/*****************************************************************************************
* *
* OpenSpace *
* *
* Copyright (c) 2014-2024 *
* *
* Permission is hereby granted, free of charge, to any person obtaining a copy of this *
* software and associated documentation files (the "Software"), to deal in the Software *
* without restriction, including without limitation the rights to use, copy, modify, *
* merge, publish, distribute, sublicense, and/or sell copies of the Software, and to *
* permit persons to whom the Software is furnished to do so, subject to the following *
* conditions: *
* *
* The above copyright notice and this permission notice shall be included in all copies *
* or substantial portions of the Software. *
* *
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, *
* INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A *
* PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT *
* HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF *
* CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE *
* OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. *
****************************************************************************************/
#include <modules/base/rendering/pointcloud/sizemappingcomponent.h>
#include <openspace/documentation/documentation.h>
#include <ghoul/logging/logmanager.h>
namespace {
constexpr std::string_view _loggerCat = "SizeMapping";
constexpr openspace::properties::Property::PropertyInfo EnabledInfo = {
"Enabled",
"Size Mapping Enabled",
"If this value is set to 'true' and at least one column was loaded as an option "
"for size mapping, the chosen data column will be used to scale the size of the "
"points. The first option in the list is selected per default.",
openspace::properties::Property::Visibility::NoviceUser
};
constexpr openspace::properties::Property::PropertyInfo OptionInfo = {
"Parameter",
"Parameter Option",
"This value determines which parameter is used for scaling of the point. The "
"parameter value will be used as a multiplicative factor to scale the size of "
"the points. Note that they may however still be scaled by max size adjustment "
"effects.",
openspace::properties::Property::Visibility::AdvancedUser
};
constexpr openspace::properties::Property::PropertyInfo ScaleFactorInfo = {
"ScaleFactor",
"Scale Factor",
"This value is a multiplicative factor that is applied to the data values that "
"are used to scale the points, when size mapping is applied.",
openspace::properties::Property::Visibility::AdvancedUser
};
struct [[codegen::Dictionary(SizeMappingComponent)]] Parameters {
// [[codegen::verbatim(EnabledInfo.description)]]
std::optional<bool> enabled;
// A list specifying all parameters that may be used for size mapping, i.e.
// scaling the points based on the provided data columns
std::optional<std::vector<std::string>> parameterOptions;
// [[codegen::verbatim(OptionInfo.description)]]
std::optional<std::string> parameter;
// [[codegen::verbatim(ScaleFactorInfo.description)]]
std::optional<float> scaleFactor;
};
#include "sizemappingcomponent_codegen.cpp"
} // namespace
namespace openspace {
documentation::Documentation SizeMappingComponent::Documentation() {
return codegen::doc<Parameters>("base_sizemappingcomponent");
}
SizeMappingComponent::SizeMappingComponent()
: properties::PropertyOwner({ "SizeMapping", "Size Mapping", "" })
, enabled(EnabledInfo, true)
, parameterOption(
OptionInfo,
properties::OptionProperty::DisplayType::Dropdown
)
, scaleFactor(ScaleFactorInfo, 1.f, 0.f, 1000.f)
{
addProperty(enabled);
addProperty(parameterOption);
addProperty(scaleFactor);
}
SizeMappingComponent::SizeMappingComponent(const ghoul::Dictionary& dictionary)
: SizeMappingComponent()
{
const Parameters p = codegen::bake<Parameters>(dictionary);
enabled = p.enabled.value_or(enabled);
int indexOfProvidedOption = -1;
if (p.parameterOptions.has_value()) {
std::vector<std::string> opts = *p.parameterOptions;
for (size_t i = 0; i < opts.size(); ++i) {
// Note that options are added in order
parameterOption.addOption(static_cast<int>(i), opts[i]);
if (p.parameter.has_value() && *p.parameter == opts[i]) {
indexOfProvidedOption = i;
}
}
}
if (indexOfProvidedOption >= 0) {
parameterOption = indexOfProvidedOption;
}
else if (p.parameter.has_value()) {
LERROR(fmt::format(
"Error when reading Parameter. Could not find provided parameter '{}' in "
"list of parameter options. Using default.", *p.parameter
));
}
scaleFactor = p.scaleFactor.value_or(scaleFactor);
}
} // namespace openspace


@@ -0,0 +1,56 @@
/*****************************************************************************************
* *
* OpenSpace *
* *
* Copyright (c) 2014-2024 *
* *
* Permission is hereby granted, free of charge, to any person obtaining a copy of this *
* software and associated documentation files (the "Software"), to deal in the Software *
* without restriction, including without limitation the rights to use, copy, modify, *
* merge, publish, distribute, sublicense, and/or sell copies of the Software, and to *
* permit persons to whom the Software is furnished to do so, subject to the following *
* conditions: *
* *
* The above copyright notice and this permission notice shall be included in all copies *
* or substantial portions of the Software. *
* *
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, *
* INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A *
* PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT *
* HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF *
* CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE *
* OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. *
****************************************************************************************/
#ifndef __OPENSPACE_MODULE_BASE___SIZEMAPPINGCOMPONENT___H__
#define __OPENSPACE_MODULE_BASE___SIZEMAPPINGCOMPONENT___H__
#include <openspace/properties/propertyowner.h>
#include <openspace/properties/optionproperty.h>
#include <openspace/properties/scalar/boolproperty.h>
#include <openspace/properties/scalar/floatproperty.h>
namespace openspace {
namespace documentation { struct Documentation; }
/**
* This is a component that can be used to hold parameters and properties for scaling
* point cloud points (or other data-based entities) based on a parameter in a dataset.
*/
struct SizeMappingComponent : public properties::PropertyOwner {
SizeMappingComponent();
explicit SizeMappingComponent(const ghoul::Dictionary& dictionary);
~SizeMappingComponent() override = default;
static documentation::Documentation Documentation();
properties::BoolProperty enabled;
properties::OptionProperty parameterOption;
properties::FloatProperty scaleFactor;
};
} // namespace openspace
#endif // __OPENSPACE_MODULE_BASE___SIZEMAPPINGCOMPONENT___H__
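
For reference, the component declared above is driven from an asset roughly as sketched below. ParameterOptions and Parameter appear in the example asset changes earlier in this commit; Enabled and ScaleFactor are assumptions for the asset keys, inferred from the Parameters struct in sizemappingcomponent.cpp:

SizeSettings = {
  SizeMapping = {
    -- Whether size mapping is active (assumed key, from the 'enabled' parameter)
    Enabled = true,
    -- Data columns that may be used to scale the points; the first is the default
    ParameterOptions = { "a", "b" },
    -- Which of the options to use at startup
    Parameter = "b",
    -- Multiplicative factor applied to the mapped data values (assumed key)
    ScaleFactor = 1.0
  },
  ScaleExponent = 5.0
}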


@@ -28,6 +28,7 @@ flat in float gs_colorParameter;
flat in float vs_screenSpaceDepth;
flat in vec4 vs_positionViewSpace;
in vec2 texCoord;
flat in int layer;
uniform float opacity;
uniform vec3 color;
@@ -42,7 +43,7 @@ uniform vec4 belowRangeColor;
uniform bool useBelowRangeColor;
uniform bool hasSpriteTexture;
uniform sampler2D spriteTexture;
uniform sampler2DArray spriteTexture;
uniform bool useColorMap;
uniform sampler1D colorMapTexture;
@@ -85,23 +86,19 @@ Fragment getFragment() {
// Moving the origin to the center and calculating the length
float lengthFromCenter = length((texCoord - vec2(0.5)) * 2.0);
if (!hasSpriteTexture) {
if (lengthFromCenter > 1.0) {
discard;
}
if (!hasSpriteTexture && (lengthFromCenter > 1.0)) {
discard;
}
vec4 fullColor = vec4(1.0);
vec4 fullColor = vec4(color, 1.0);
if (useColorMap) {
fullColor = sampleColorMap(gs_colorParameter);
}
else {
fullColor.rgb = color;
}
if (hasSpriteTexture) {
fullColor *= texture(spriteTexture, texCoord);
} else if (enableOutline && (lengthFromCenter > (1.0 - outlineWeight))) {
fullColor *= texture(spriteTexture, vec3(texCoord, layer));
}
else if (enableOutline && (lengthFromCenter > (1.0 - outlineWeight))) {
fullColor.rgb = outlineColor;
}


@@ -27,12 +27,14 @@
#include "PowerScaling/powerScalingMath.hglsl"
layout(points) in;
flat in float textureLayer[];
flat in float colorParameter[];
flat in float scalingParameter[];
layout(triangle_strip, max_vertices = 4) out;
flat out float gs_colorParameter;
out vec2 texCoord;
flat out int layer;
flat out float vs_screenSpaceDepth;
flat out vec4 vs_positionViewSpace;
@@ -45,6 +47,7 @@ uniform dmat4 projectionMatrix;
uniform dmat4 modelMatrix;
uniform bool enableMaxSizeControl;
uniform bool hasDvarScaling;
uniform float dvarScaleFactor;
// RenderOption: CameraViewDirection
uniform vec3 up;
@@ -58,6 +61,8 @@ uniform vec3 cameraLookUp;
// The max size is an angle, in degrees, for the diameter
uniform float maxAngularSize;
uniform vec2 aspectRatioScale;
const vec2 corners[4] = vec2[4](
vec2(0.0, 0.0),
vec2(1.0, 0.0),
@@ -70,13 +75,14 @@ const int RenderOptionCameraPositionNormal = 1;
void main() {
vec4 pos = gl_in[0].gl_Position;
layer = int(textureLayer[0]);
gs_colorParameter = colorParameter[0];
dvec4 dpos = modelMatrix * dvec4(dvec3(pos.xyz), 1.0);
float scaleMultiply = pow(10.0, scaleExponent);
if (hasDvarScaling) {
scaleMultiply *= scalingParameter[0];
scaleMultiply *= scalingParameter[0] * dvarScaleFactor;
}
vec3 scaledRight = vec3(0.0);
@@ -116,9 +122,9 @@ void main() {
dmat4 cameraViewProjectionMatrix = projectionMatrix * cameraViewMatrix;
vec4 dposClip = vec4(cameraViewProjectionMatrix * dpos);
vec4 scaledRightClip = scaleFactor *
vec4 scaledRightClip = scaleFactor * aspectRatioScale.x *
vec4(cameraViewProjectionMatrix * dvec4(scaledRight, 0.0));
vec4 scaledUpClip = scaleFactor *
vec4 scaledUpClip = scaleFactor * aspectRatioScale.y *
vec4(cameraViewProjectionMatrix * dvec4(scaledUp, 0.0));
vec4 dposViewSpace= vec4(cameraViewMatrix * dpos);


@@ -38,9 +38,12 @@ in float in_colorParameter1;
in float in_scalingParameter0;
in float in_scalingParameter1;
in float in_textureLayer;
uniform bool useSpline;
uniform float interpolationValue;
flat out float textureLayer;
flat out float colorParameter;
flat out float scalingParameter;
@@ -87,5 +90,7 @@ void main() {
);
}
textureLayer = in_textureLayer;
gl_Position = vec4(position, 1.0);
}


@@ -27,13 +27,16 @@
#include "PowerScaling/powerScaling_vs.hglsl"
in vec3 in_position;
in float in_textureLayer;
in float in_colorParameter;
in float in_scalingParameter;
flat out float textureLayer;
flat out float colorParameter;
flat out float scalingParameter;
void main() {
textureLayer = in_textureLayer;
colorParameter = in_colorParameter;
scalingParameter = in_scalingParameter;
gl_Position = vec4(in_position, 1.0);


@@ -26,9 +26,6 @@
out vec4 finalColor;
uniform vec3 polygonColor;
void main() {
finalColor = vec4(polygonColor, 1.0);
finalColor = vec4(1.0);
}


@@ -33,11 +33,13 @@
#include <ghoul/logging/logmanager.h>
#include <ghoul/misc/assert.h>
#include <ghoul/misc/exception.h>
#include <ghoul/misc/stringhelper.h>
#include <algorithm>
#include <cmath>
#include <cctype>
#include <fstream>
#include <functional>
#include <sstream>
#include <string_view>
namespace {
@@ -94,6 +96,7 @@ Dataset loadCsvFile(std::filesystem::path filePath, std::optional<DataMapping> s
int yColumn = -1;
int zColumn = -1;
int nameColumn = -1;
int textureColumn = -1;
int nDataColumns = 0;
const bool hasExcludeColumns = specs.has_value() && specs->hasExcludeColumns();
@@ -119,11 +122,17 @@ Dataset loadCsvFile(std::filesystem::path filePath, std::optional<DataMapping> s
else if (isNameColumn(col, specs)) {
nameColumn = static_cast<int>(i);
}
else if (hasExcludeColumns && (*specs).isExcludeColumn(col)) {
else if (hasExcludeColumns && specs->isExcludeColumn(col)) {
skipColumns.push_back(i);
continue;
}
else {
// Note that the texture column is also a regular column. Just save the index
if (isTextureColumn(col, specs)) {
res.textureDataIndex = nDataColumns;
textureColumn = static_cast<int>(i);
}
res.variables.push_back({
.index = nDataColumns,
.name = col
@@ -132,6 +141,33 @@ Dataset loadCsvFile(std::filesystem::path filePath, std::optional<DataMapping> s
}
}
// Some errors / warnings
if (specs.has_value()) {
bool hasAllProvided = specs->checkIfAllProvidedColumnsExist(columns);
if (!hasAllProvided) {
LERROR(fmt::format(
"Error loading data file {}. Not all columns provided in data mapping "
"exists in dataset", filePath
));
}
}
bool hasProvidedTextureFile = specs.has_value() && specs->textureMap.has_value();
bool hasTextureIndex = (res.textureDataIndex >= 0);
if (hasProvidedTextureFile && !hasTextureIndex && !specs->textureColumn.has_value()) {
throw ghoul::RuntimeError(fmt::format(
"Error loading data file {}. No texture column was specified in the data "
"mapping", filePath
));
}
if (!hasProvidedTextureFile && hasTextureIndex) {
throw ghoul::RuntimeError(fmt::format(
"Error loading data file {}. Missing texture map file location in data "
"mapping", filePath
));
}
if (xColumn < 0 || yColumn < 0 || zColumn < 0) {
// One or more position columns weren't read
LERROR(fmt::format(
@@ -142,6 +178,8 @@ Dataset loadCsvFile(std::filesystem::path filePath, std::optional<DataMapping> s
LINFO(fmt::format("Loading {} rows with {} columns", rows.size(), columns.size()));
ProgressBar progress = ProgressBar(static_cast<int>(rows.size()));
std::set<int> uniqueTextureIndicesInData;
// Skip first row (column names)
for (size_t rowIdx = 1; rowIdx < rows.size(); ++rowIdx) {
const std::vector<std::string>& row = rows[rowIdx];
@@ -180,6 +218,10 @@ Dataset loadCsvFile(std::filesystem::path filePath, std::optional<DataMapping> s
else {
entry.data.push_back(value);
}
if (static_cast<int>(i) == textureColumn) {
uniqueTextureIndicesInData.emplace(static_cast<int>(value));
}
}
const glm::vec3 positive = glm::abs(entry.position);
@@ -193,6 +235,82 @@ Dataset loadCsvFile(std::filesystem::path filePath, std::optional<DataMapping> s
progress.print(static_cast<int>(rowIdx + 1));
}
// Load the textures. Skip textures that are not included in the dataset
if (hasProvidedTextureFile) {
const std::filesystem::path path = *specs->textureMap;
if (!std::filesystem::is_regular_file(path)) {
throw ghoul::RuntimeError(fmt::format(
"Failed to open texture map file {}", path
));
}
res.textures = loadTextureMapFile(path, uniqueTextureIndicesInData);
}
return res;
}
std::vector<Dataset::Texture> loadTextureMapFile(std::filesystem::path path,
const std::set<int>& texturesInData)
{
ghoul_assert(std::filesystem::exists(path), "File must exist");
std::ifstream file(path);
if (!file.good()) {
throw ghoul::RuntimeError(fmt::format(
"Failed to open texture map file {}", path
));
}
int currentLineNumber = 0;
std::vector<Dataset::Texture> res;
std::string line;
while (std::getline(file, line)) {
ghoul::trimWhitespace(line);
currentLineNumber++;
if (line.empty() || line.starts_with("#")) {
continue;
}
std::vector<std::string> tokens = ghoul::tokenizeString(line, ' ');
int nNonEmptyTokens = static_cast<int>(std::count_if(
tokens.begin(),
tokens.end(),
[](const std::string& t) { return !t.empty(); }
));
if (nNonEmptyTokens > 2) {
throw ghoul::RuntimeError(fmt::format(
"Error loading texture map file {}: Line {} has too many parameters. "
"Expected 2: an integer index followed by a filename, where the file "
"name may not include whitespaces",
path, currentLineNumber
));
}
std::stringstream str(line);
// Each line follows the template:
// <idx> <file name>
Dataset::Texture texture;
str >> texture.index >> texture.file;
for (const Dataset::Texture& t : res) {
if (t.index == texture.index) {
throw ghoul::RuntimeError(fmt::format(
"Error loading texture map file {}: Texture index '{}' defined twice",
path, texture.index
));
}
}
if (texturesInData.contains(texture.index)) {
res.push_back(texture);
}
}
return res;
}
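To make the accepted format and its use concrete, here is a hedged sketch. The example file contents and file names are hypothetical, and the helper function is not part of the module; it only relies on the Dataset members used above (textureDataIndex, textures) and assumes that a point's data values are stored as floats and that Texture::file is a string, as the stream extraction above suggests.

#include <optional>
#include <string>
#include <vector>

// Example contents of a texture map file (hypothetical file names):
//   # Lines starting with '#' and blank lines are skipped
//   0 spiral.png
//   1 elliptical.png
//   2 irregular.png

// Illustrative helper: given one point's data values, look up the image file that
// its texture index refers to. Returns std::nullopt if the dataset has no texture
// column, or if the index is not present in the loaded map
std::optional<std::string> textureFileForPoint(const Dataset& dataset,
                                               const std::vector<float>& pointData)
{
    if (dataset.textureDataIndex < 0) {
        return std::nullopt;
    }
    const int texIndex = static_cast<int>(pointData[dataset.textureDataIndex]);
    for (const Dataset::Texture& t : dataset.textures) {
        if (t.index == texIndex) {
            return t.file;
        }
    }
    return std::nullopt;
}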


@@ -31,6 +31,8 @@
#include <string_view>
namespace {
constexpr std::string_view _loggerCat = "RenderablePolygonCloud";
constexpr std::string_view DefaultX = "x";
constexpr std::string_view DefaultY = "y";
constexpr std::string_view DefaultZ = "z";
@@ -48,19 +50,19 @@ namespace {
if (mapping.has_value()) {
switch (columnCase) {
case PositionColumn::X:
column = mapping->xColumnName.value_or(column);
break;
case PositionColumn::Y:
column = mapping->yColumnName.value_or(column);
break;
case PositionColumn::Z:
column = mapping->zColumnName.value_or(column);
break;
}
}
// By default, allow both lower case and upper case versions of column names
if (!mapping.has_value() || !mapping->isCaseSensitive) {
column = ghoul::toLowerCase(column);
testColumn = ghoul::toLowerCase(testColumn);
}
@@ -68,6 +70,27 @@ namespace {
return testColumn == column;
}
bool isSameStringColumn(const std::string& left, const std::string& right,
bool isCaseSensitive)
{
std::string l = isCaseSensitive ? left : ghoul::toLowerCase(left);
std::string r = isCaseSensitive ? right : ghoul::toLowerCase(right);
return (l == r);
}
bool containsColumn(const std::string& c, const std::vector<std::string>& columns,
bool isCaseSensitive)
{
auto it = std::find_if(
columns.begin(),
columns.end(),
[&c, &isCaseSensitive](const std::string& col) {
return isSameStringColumn(c, col, isCaseSensitive);
}
);
return it != columns.end();
}
// This is a data mapping structure that can be used when creating point cloud
// datasets, e.g. from a CSV or Speck file.
//
@@ -92,6 +115,18 @@ namespace {
// files, where the name is given by the comment at the end of each line
std::optional<std::string> name;
// Specifies the name of the column that holds which texture to use for each
// point (given as an integer index). If included, a texture map file needs to
// be provided as well
std::optional<std::string> textureColumn;
// A file where each line contains an integer index and an image file name.
// Not valid for SPECK files, which include this information as part of their
// data format. This map is used to map the values in the TextureColumn to an
// image file used for rendering the points. Note that only files whose
// indices appear in the dataset will actually be loaded
std::optional<std::filesystem::path> textureMapFile;
// Specifies whether to do case-sensitive checks when reading column names.
// The default is not to, so that 'X' and 'x' are both valid column names for
// the x position column, for example
@@ -124,6 +159,8 @@ DataMapping DataMapping::createFromDictionary(const ghoul::Dictionary& dictionar
result.yColumnName = p.y;
result.zColumnName = p.z;
result.nameColumn = p.name;
result.textureColumn = p.textureColumn;
result.textureMap = p.textureMapFile;
result.missingDataValue = p.missingDataValue;
@@ -142,6 +179,28 @@ bool DataMapping::isExcludeColumn(std::string_view column) const {
return (found != excludeColumns.end());
}
bool DataMapping::checkIfAllProvidedColumnsExist(
const std::vector<std::string>& columns) const
{
auto checkColumnIsOk = [this, &columns](std::optional<std::string> col,
std::string_view key)
{
if (col.has_value() && !containsColumn(*col, columns, isCaseSensitive)) {
LWARNING(fmt::format("Could not find provided {} column: '{}'", key, *col));
return false;
}
return true;
};
bool hasAll = true;
hasAll &= checkColumnIsOk(xColumnName, "X");
hasAll &= checkColumnIsOk(yColumnName, "Y");
hasAll &= checkColumnIsOk(zColumnName, "Z");
hasAll &= checkColumnIsOk(nameColumn, "Name");
hasAll &= checkColumnIsOk(textureColumn, "Texture");
return hasAll;
}
std::string generateHashString(const DataMapping& dm) {
std::string a;
for (const std::string_view c : dm.excludeColumns) {
@@ -150,13 +209,14 @@ std::string generateHashString(const DataMapping& dm) {
unsigned int excludeColumnsHash = ghoul::hashCRC32(a);
return fmt::format(
"DM|x{}|y{}|z{}|name{}|m{}|{}|{}",
"DM|{}|{}|{}|{}|{}|{}|{}|{}",
dm.xColumnName.value_or(""),
dm.yColumnName.value_or(""),
dm.zColumnName.value_or(""),
dm.nameColumn.value_or(""),
dm.textureColumn.value_or(""),
dm.missingDataValue.has_value() ? ghoul::to_string(*dm.missingDataValue) : "",
dm.isCaseSensitive ? "1" : "0",
dm.isCaseSensitive ? 1 : 0,
excludeColumnsHash
);
}
@@ -181,14 +241,14 @@ bool isNameColumn(const std::string& c, const std::optional<DataMapping>& mappin
if (!mapping.has_value() || !mapping->nameColumn.has_value()) {
return false;
}
return isSameStringColumn(c, *mapping->nameColumn, mapping->isCaseSensitive);
}
bool isTextureColumn(const std::string& c, const std::optional<DataMapping>& mapping) {
if (!mapping.has_value() || !mapping->textureColumn.has_value()) {
return false;
}
return isSameStringColumn(c, *mapping->textureColumn, mapping->isCaseSensitive);
}
} // namespace openspace::dataloader
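For reference, the identifier built by generateHashString above for a hypothetical mapping (x/y/z columns "ra", "dec" and "dist", a texture column "tex", no name column, no missing-data value, case-insensitive matching) would take the form:

DM|ra|dec|dist||tex||0|<CRC32 of the concatenated exclude columns>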


@@ -176,12 +176,27 @@ Dataset loadSpeckFile(std::filesystem::path path, std::optional<DataMapping> spe
// 2: texture 1 M1.sgi
// The parameter in #1 is currently being ignored
std::vector<std::string> tokens = ghoul::tokenizeString(line, ' ');
int nNonEmptyTokens = static_cast<int>(std::count_if(
tokens.begin(),
tokens.end(),
[](const std::string& t) { return !t.empty(); }
));
if (nNonEmptyTokens > 4) {
throw ghoul::RuntimeError(fmt::format(
"Error loading speck file {}: Too many arguments for texture on line {}",
path, currentLineNumber
));
}
bool hasExtraParameter = nNonEmptyTokens > 3;
std::stringstream str(line);
std::string dummy;
str >> dummy;
if (hasExtraParameter) {
str >> dummy;
}