To all SSC Station occupants
Thank you for the donations over the past year (2024); they are much appreciated. I am still trying to figure out how to migrate the forums to another community software (probably phpBB), but in the meantime I have updated the forum software to the latest version. SSC has been around a while, so there are some very long-time members still using the site. Thanks for making SSC home, and sorry I haven't been as vocal as I should be in the forums; I will try to improve my posting frequency.
Thank you again to all of the members that do take the time to donate a little, it helps keep this station functioning on the outer reaches of space.
-D1-
Very pretty!
Does this mean we'll soon have GPU generated terrains with only the collision done on CPU?
Also, what's Ae_?
You mean who!
hehe 🙂
More likely we'll reduce a lot of CPU noise, keep the barebones shapes and put the high-detail stuff on the GPU, since it'll be next to impossible to generate the same terrains separately on GPU and then on CPU for those without the capability. I suppose everything could be moved to the GPU, but then we'd have to leave lots of people behind, those without powerful GPUs...
You mean who!
I do? 😆
I was thinking that it would just be another option after the "very very high" setting, or alongside it. I say this because you can upgrade the GPU on a lot of machines, even AGP ones, but the CPU might be your limitation for the terrain.
The collision and city placement just seem to depend on the CPU sampling a small set of points on the terrain. That would still be done on the CPU, but the actual rendered terrain doesn't need to be; it just needs to be passed a flat mesh, which is then updated on the GPU in the vertex shader.
Artlav pulled this off in Spaceway, if I'm not mistaken.
I dunno, I thought that was all CPU-based... maybe textures are on the GPU, though? I don't recall seeing any options wrt GPU/CPU noise. IIRC the terrain updates more slowly than Pioneer when I did some testing, although textures updated nice and quick, so that looks like CPU terrain and GPU textures to me, which we'll eventually have.
EDIT// Although I just saw this in the changelog: -Ice ball type of planet is optimized and ported to GPU... So maybe you're right.
So you generate the mesh on the GPU? Directly pass it the result from GetHeight()?
Other way around 🙂 OK, here's a better explanation.
You're generating noise on the GPU now.
That's where you get the height from.
You render a mesh just like you usually would, with one difference: the mesh hasn't had any heights generated or set for it, so it's just a grid spread across part of the surface of a sphere. Imagine that instead of calling "GetHeight" on the CPU you just set the height to 0.0f, then set the normal for the vertex to point straight "up", perpendicular to the surface.
When you render you pass this vertex value into your GPU noise implementation of "GetHeight".
Now you have a height value just do something like: "vertex = vertex + (height * normal)"
That will just move the vertex along its normal direction.
One thing we'll lose by doing the above is the normals, but we can rebuild them! We know this because we've seen other people do it 😀
Even if we don't know ourselves right now!
EDIT - I have a really simple example of this that uses textures to provide the height, but it's for an older version of SFML and was just for a small sphere made to look like a planet, rather than the quadtree-cube that we use. I can post the vertex shader code though. It used the texture's blue channel ("color.z") to provide the height.
uniform float time; // kept from the animated version; unused here
uniform sampler2D myTexture;
varying vec3 lightDir;
varying vec3 normal;
void main()
{
    gl_TexCoord[0] = gl_MultiTexCoord0;
    lightDir = normalize(vec3(gl_LightSource[0].position));
    normal = gl_NormalMatrix * gl_Normal;
    // height is stored in the texture's blue channel
    vec4 color = texture2D(myTexture, vec2(gl_TexCoord[0]));
    // displace the vertex along its normal by the sampled height
    vec3 pos = gl_Vertex.xyz + (gl_Normal * color.z);
    gl_Position = gl_ModelViewProjectionMatrix * vec4(pos, 1.0);
}
Nope, Spaceway runs all on the GPU (or CPU, depending on your setup...). I don't know how he managed it, but the only difference he gets is some ever-so-slight colour variations between the two modes. I remember that in an earlier version this was not yet so, but obviously he pulled off some crazy stunt somewhere.
Also, whether Spaceway is faster or slower than Pioneer is entirely up to your rig. I have a middle-to-low-end CPU but a middle-to-high-end GPU, so Spaceway's generation is quite a lot faster on my machine than Pioneer's. I imagine if you have a good processor but not so strong a GPU, Pioneer would run faster, because Spaceway is one hell of a snail when it has to fall back on the CPU, and the GPU has to be pretty capable not to force that fallback.
@ Fluffy
Do you really think you can generate an entire planet on a GPU? That's a shitload of noise; my hardware begins to crumble after just 4 intersecting noise patterns with 3 octaves each. CPU terrains have perhaps 20 intersecting patterns, from roughly 5 to 16 octaves. My GPU gave single-figure framerates with just one noise pattern at 8 octaves.
But... this was animated noise 😉 So it's not a fair test, but it's the only test I have performed comparing the two.
See, I don't think the GPU can do this on the fly; what I mean is it has to slow everything else down to do it, unlike the CPU, which can do it all in the background while you're flying around. But if I'm wrong about that and it can process this stuff in parallel, then it sounds like a good idea.
OK, I'll give it a try 🙂
It's not "can we do it?", it's more:
How come they can and we can't?
What are they doing differently?
Perhaps we can't do it the way I suggested, but if we can't then how do others?
EDIT: Looks like we mutually ninja'd ourselves! 😆
Anyway, I'll leave the post as is, as it's not totally out of context.
Err... why an entire planet? The sole purpose of noise is that you don't have to have the whole planet in memory, at least not the whole planet at full detail. Noise gets applied in areas that are close enough to the camera.
You might have a point with the octaves, though: Spaceway generates less fractal noise in its terrain than pioneer, but it more than makes up for it by its textures. And objects like rocks and trees... 😀
But yes, I use Spaceway as a texture generator for Orbiter Galaxy, and it's pretty fast. Like, 30 seconds on my GPU for what takes over 5 minutes on my CPU. Spaceway is open-source too, by the way, so you could take a look at his code to see how he did it (it's in Pascal, though). Or you might just ask him; he's a pretty pleasant guy.
OK, I'll give it a try 🙂
After some checking, as I was quite curious, I'm convinced that it generates the terrain on the CPU even with that option selected, but textures all on the GPU with the option selected. This is only from observation, though: I notice identical terrain and terrain-generation rates with the option on or off, but textures are of much higher quality and much quicker to update with the option turned on.
Prove me wrong though, I like it 🙂 ( I'll learn something new 😉 )
So anyway, if he is indeed generating it all on the GPU then this is very impressive, as there is no visual difference between the two in the actual terrain (mesh).
Yeah, but the calculations are pretty intensive: noise feeding into noise to generate more noise 🙂 It all gets very slow. So yes, you're not doing an entire planet in one go, but you're running the calculations for an entire planet all the time, which in their current form are too intense for a GPU. It can be optimised a lot, but it's still a *lot* of noise.
Hehe, maybe they are just better 😆 /jk
@s20dan
hah! better than you pffft 😆
😀
OK, so it's not per-vertex/per-frame then, so it must be writing height values to a texture and then sampling it almost exactly as I do in that shader.
Now that's more promising;
1) generate the height values into a texture,
2) set that as one of your textures when rendering a terrain patch,
3) each vertex will need a UV coordinate,
4) pull the height out of the texture and modify vertex position as before.
The normal can come from a normal map, either at the same resolution or something more coarse.
I am *pretty* sure it doesn't, since when I use it to generate full planetary textures, it does so based on the heightmap. But I'll ask him just to make sure. And maybe get a few ideas for you guys. 🙂
Cool! 🙂
I got hold of the source too, although from the date it seems like quite an old version: March 2011.
Not that I can do much with Pascal mind you hehe
Now you mention it, I think you're right, yes, because I just remembered there was an older version of this terrain engine available for Orbiter called Orulex (by Artlav, of course). That used to work by generating a texture which served as the heightmap; the terrain was then built around that. But the heightmaps were generated by a separate program outside of Orbiter, and you could edit them yourself, that kind of thing. It would make sense if he has simply updated that terrain engine for use here...
I think it's also how Infinity does it, going by these two articles: GPU Terrain generation, cell noise, rivers, craters & Craters and normal maps.
That's simply amazing. OK, we have a long way to go 😳
Rubbish! You've obviously not been looking at your own screenshots!
You already have code for craters and noise; all that's doing is setting it to "very very high" detail and letting the GPU sort it out, which is what you're looking at now.
It's awesome to see what you're doing with it right now!
ORULEX was a direct offspring of an early version of Spaceway. It is CPU-only. It could either read in custom heightmaps or produce completely random terrain based on Perlin noise functions, which could be altered in a config file. I suppose Pioneer does it in a similar way. But yes, as far as I know, the whole thing works on the basis of drawing a heightmap to a texture, and then scaling up and resolving specific pixels in it as you approach. I didn't even know there was another way to do it...?
Pioneer has gone a far longer way than Infinity already. It is halfway playable and still in very active development, which are two things Infinity cannot brag about. Sure, the terrain engine looked nice, but all we ever saw of it was videos. No one has ever seen it in action on his own machine. There's no telling what kind of resources that thing consumed to get that level of visual fidelity.
thanks 🙂
Ah you might know a bit about the problem I have at the moment with the animated fractals since you have delved into the Geosphere code 🙂
In the video you can't really see it, but not only does the noise change with time, it also changes with the direction the camera faces. That's using:
as the co-ordinates. So tnorm must change in relation to the direction the camera faces and not when the object rotates. OK, I thought, gl_Vertex must work...
However, it seems each patch gets assigned its own co-ordinate space which does not take into consideration its place on the sphere, so the vertices for each patch are all calculated relative to the centre of each patch, which means every patch produces the same noise.
Any ideas where it does this? I'm thinking I should be able to save the position of each patch and send that information to the shader too; in fact it must be there in some form, I just don't quite know how to get at it 😉
Edit// Ah, ninja'd again, I took too long to write this post 🙂
Yes, it does seem very similar. For Earth we use a heightmap, then pass the results into some noise functions to make it look nicer/more detailed; for the rest we just use Perlin noise functions, whose data is determined by factors generated at sysgen, or it might be once they are within a certain range.
Yeah, if you look in GeoSphere.cpp for the method "_UpdateVBOs()", it subtracts the clipCentroid from the position. That was the fix for the precision problems we were having with the terrain jumping, due to the GPU only supporting floats. Is that what you're looking for?
Strange that gl_TexCoord changes with the direction you're viewing it at though.
I wouldn't really know 🙂 I just thought that logically they should not change, although there's some code in the geosphere.frag shader that Tom wrote where he uses: vec3 eyepos = vec3(gl_TexCoord[0]);
"eyepos" is exactly how it seems to react, but I have no prior experience with GLSL and OpenGL, so I have no idea if that's normal 🙂
I'm just testing some things now, but it did seem to produce some glaring artifacts 😉 I'll post some screens once I've tried some different things out.