|
|
CCP MC Peanut
C C P C C P Alliance
243
|
Posted - 2014.05.19 08:40:00 -
[1] - Quote
I noticed this post and, since it is a little relevant to what I am currently engaged in at work, I just had to jump in. We are in the early stages of prototyping a new Starmap (woot). Like the HUD change, this Starmap is pretty much all in 3D. The devs in this early stage are primarily myself (technical artist) and a new engineering dev, CCP BaoZi, whom I'd like to welcome and introduce to the forums. CCP BaoZi has done the majority of the initial work and will be posting soon to give even more cool information than I can provide.
If you followed the HUD thread where I explained some of the technology behind the 3D HUD, we are taking a similar approach with the Starmap. If you don't mind, I can break down the current organization. Please note that this is just the prototype and nothing is set in stone yet:
Universe View: The universe view has one mesh for each region and its jump lines. We took the coordinates of the stars and jump lines and ran them through a Python script in Maya that built the mesh. Breaking them up into mesh groups gives us much better performance than if they were all unique actors. Each region will have a few states that will be exposed to Kismet: default, unfocused, selected, and highlighted. It is likely we will add a few more as needs arise. At the moment, when you select a region (by mouse click), the clicked region enters the selected state and the rest change to unfocused, where we can change their material settings to be less visible.
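The selection rule above is simple enough to sketch outside the engine. This is purely illustrative Python (the real logic lives in Kismet/engine code; the class and region names here are invented):

```python
# Illustrative sketch of the region states described above -- not engine code.
DEFAULT, UNFOCUSED, SELECTED, HIGHLIGHTED = "default", "unfocused", "selected", "highlighted"

class RegionNode:
    def __init__(self, name):
        self.name = name
        self.state = DEFAULT  # every region starts in the default state

def select_region(regions, clicked):
    """On mouse click: the clicked region becomes selected, everything
    else drops to unfocused (where its material is made less visible)."""
    for region in regions:
        region.state = SELECTED if region is clicked else UNFOCUSED
```

A usage example: clicking "Metropolis" would flip that one node to selected and push its neighbors to unfocused in a single pass.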
Region View: When you focus (mouse click) on a region, a 'higher resolution' star mesh (I mean a higher-resolution sphere) is loaded at the appropriate coordinate. The stars have the same states that the regions have. These states are nice because when they change we can intercept the event in Kismet and adjust the material (for a highlight; this is also done on regions). Also, when a star enters the 'default' state we pass through the star's id, which we can use to determine its color.
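As a rough illustration of the "star id determines color" idea: the real map presumably reads color from star data, so this hash-into-a-palette version is invented, just to show the deterministic lookup shape.

```python
# Hypothetical star-id -> color lookup. The palette values are made up.
PALETTE = [
    (1.00, 0.85, 0.60),  # orange-ish
    (0.80, 0.90, 1.00),  # blue-white
    (1.00, 1.00, 0.90),  # white-yellow
    (1.00, 0.60, 0.40),  # red-ish
]

def star_color(star_id):
    """Deterministic RGB (0-1) color for a star id: same id, same color."""
    return PALETTE[star_id % len(PALETTE)]
```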
Solar System View: Once you focus on a star you go into the solar system view. We will likely switch to a higher-resolution star material (but currently we are keeping things simple). Planets also load at this view, though for now they all use the same material (planet rendering technology already exists within the company, so our goal is to eventually utilize that).
Planet View: When you select a planet, you go into planet view. The planet will change states, but currently nothing special happens.
The cool thing is that the transition between the above views is completely seamless--no load screens. It is all in the same level. It is important to us that navigation is fast and easy (congrats go to CCP BaoZi, not me). The goal is to keep it seamless, and right now it is, but we haven't added all the high-resolution graphics content yet; seamlessness is something we are aiming to preserve, but we may reach a point where there is some compromise between visuals and seamless loading.
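The drill-down can be pictured as a simple four-level view hierarchy. Again a sketch, not engine code (the class and method names are invented):

```python
# Sketch of the nested views; 'focusing' descends one level and nothing
# load-screens because everything lives in one level.
VIEWS = ["universe", "region", "solar_system", "planet"]

class StarmapCamera:
    def __init__(self):
        self.depth = 0  # start zoomed all the way out

    @property
    def view(self):
        return VIEWS[self.depth]

    def focus(self):
        """Mouse click on a region/star/planet: go one level deeper."""
        self.depth = min(self.depth + 1, len(VIEWS) - 1)

    def zoom_out(self):
        """Back out one level, never past the universe view."""
        self.depth = max(self.depth - 1, 0)
```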
Mostly what I've been doing is prototyping visual effects (like when you scan for battles), setting up the initial content files and materials, and tuning visuals to a bare-minimum usable level. Another thing I'm pretty happy with: I set up a 'culling' section in the material that applies transparency to pixels between the camera and a given point in space (this point is set on the camera's pivot-point-changed event). Doing this really reduces clutter in the foreground when you are flying around the universe. This is a perfect example of how leveraging code events (camera pivot changed) gives us flexibility to do cool things on the content side.
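The culling trick amounts to a little vector math: geometry sitting between the camera and the pivot point fades out. A hedged re-creation in plain Python (the real version is a material section; the feather distance and function names are invented):

```python
import math

def _sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cull_alpha(camera, pivot, point, feather=5.0):
    """Alpha for a piece of geometry: 1.0 = opaque, 0.0 = fully culled.
    Anything between the camera and the pivot fades out, which is what
    clears foreground clutter when flying around the universe."""
    to_pivot = _sub(pivot, camera)
    pivot_dist = math.sqrt(_dot(to_pivot, to_pivot))
    # depth of the point along the camera->pivot direction
    along = _dot(_sub(point, camera), to_pivot) / pivot_dist
    if along <= 0 or along >= pivot_dist:
        return 1.0  # behind the camera or beyond the pivot: untouched
    # in between: fade toward transparent, feathered near the pivot depth
    return max(0.0, 1.0 - (pivot_dist - along) / feather)
```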
Going forward, obviously there is still a lot more to do--UI panels, (3d) icons, battle information, etc. Our hope is that we can continue to update our progress here on the forums. Feel free to ask questions (but there are likely a few things we can't answer) as well as provide any ideas or feedback.
|
|
|
CCP MC Peanut
C C P C C P Alliance
262
|
Posted - 2014.05.20 00:56:00 -
[2] - Quote
Lady MDK wrote:
steadyhand amarr wrote: No shooter fan wants to spend 30 minutes to an hour going from system to system. In EvE you stay to local area long hauls are rare in dust you will be flying around all over the place
Agreed. As I see it from what Mc was saying, there is the potential to be able to look at New Eden, see the cluster and its regions and systems with how many sites in each, and work your way down to planet level to see what district the sites are in. I would then have it that you clone jump to the Command Node of that district* and pick up a vehicle suitable to get you through whatever you are about to face (be it hordes of drones, or stealing some plans and getting away as fast as possible if it was a mission), or scanning for minerals needed to make weapons in an appropriately fitted Dropship (here is a chance to introduce new vehicles however, ccp). Maybe you have just jumped to a Lowsec/Nullsec planet simply because the map says there are people on the surface already and you want to snipe them for lolz. And I think if you can stay on the planet (much like you stay in a station when you dock up for the night or log off in a system in eve) so that you are still there the next time you log in, this will go some distance to break the lobby-shooter effect, certainly for me, especially if combat shows up in a similar way in the map. Of course that's all what I see the potential for based on what was said... he didn't actually say any of that.
*I say the District Command Node as I don't know if there is much more room for objects orbiting planets... unless it's contested space... I mean in eve there are already POCO's and satellites; I'm not sure an NPC warbarge as well might be too much clutter.
A+ Reading Comprehension. There is definitely the potential to do what you are guessing. Right now we are only focused on re-creating the universe and putting in the capacity to display a variety of information. Exactly what information we display at what level is not entirely known, so we are keeping it loose, or flexible. It is also unknown to me whether there will be a travel mechanic; that is not something we need to know to present the universe. If there is a travel mechanic, or something that prevents access to parts of the starmap, we could still support it--likely we would add another state to our nodes, something like "locked" or "inaccessible". That would allow us to render those nodes differently so it is apparent to the user that you can't go there. But I believe the important takeaway is that this map should be easy and quick for the user to navigate. |
|
|
CCP MC Peanut
C C P C C P Alliance
274
|
Posted - 2014.05.20 15:33:00 -
[3] - Quote
This week I am looking at transition effects in and out of the starmap, as well as a scan effect. The transition should fade between the current Station level (you saw a 'possible but not final' representation at Fanfest) and the starmap (the levels are loaded together, so there is no actual 'loading' going on--we just want some visual immersion). There are definitely a few ways to do this: standard cross-fade, pixel 'redrawing', holographic projection, ???. If you have ideas, or cool reference videos--that would be cool--please post (especially if you find informative Unreal Engine tutorials). The scan effect is already kinda cool (but there is room for more prototypes): currently I've got a glowing (different colored) line panning across the 3D scene (driven by an animated 3D coordinate) that 'displaces' (and changes the color of) the geometry it passes over. The displacement may be overkill, but it was an opportunity to try the new DX11 geometry tessellation features. |
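For the curious, the panning scan line boils down to a distance test per pixel/vertex. A toy version (the tolerance, amplitude, and function names are invented; the real effect is a material, not Python):

```python
def scan_line_weight(pixel_y, scan_y, tolerance=2.0):
    """0-1 weight for the scan-line effect: 1.0 right on the line,
    falling off to 0.0 at 'tolerance' units away. A material would use
    this to blend in the glow color and the displacement."""
    return max(0.0, 1.0 - abs(pixel_y - scan_y) / tolerance)

def displaced_height(base_height, weight, amplitude=1.5):
    """Vertex displacement driven by the scan weight (the possibly
    overkill part, done on tessellated geometry)."""
    return base_height + weight * amplitude
```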
|
|
CCP MC Peanut
C C P C C P Alliance
295
|
Posted - 2014.05.21 13:55:00 -
[4] - Quote
Monkey MAC wrote:Damn, you use python code? Awsome, can I come and work with you guys for the summer? As for the starmap, I like the iteration shown in the keynote, however a few view modes would be useful.
I've been a big fan of Python ever since they embedded an interpreter inside Maya. Because of this, I actually tend to favor developing tools inside Maya rather than Unreal. I find developing in Python inside Maya to be super fast, and the stability is very high. Do you happen to be in Shanghai? |
|
|
CCP MC Peanut
C C P C C P Alliance
295
|
Posted - 2014.05.21 14:16:00 -
[5] - Quote
While I seem to be giving daily updates, I can't promise I will be able to keep up with it, but so far the responses.. the feedback--it is just exciting. Today CCP BaoZi told me he read someone's post (I checked but couldn't find it--maybe it was another thread) about being able to still navigate the starmap while scanning, rather than having the controls locked out. The initial setup/plan was to have the camera be controlled by code and move around the Starmap, similar to the loading screen you saw at Fanfest. He thought this request seemed more than reasonable and doable, so we agreed today to change course. Instead he is just going to pass a Kismet event to me that contains the region being scanned, then I will (in Kismet) grab the [region] position and animate in a material effect at that coordinate. I can probably re-purpose the scan effect I had before, but have it radiate from region centers rather than pan down the y-axis. The scanning is purely cosmetic at this point; it will need to run while information downloads in the background, but we see no reason why you can't continue to interact with the map. All this is subject to change pending the feedback we get from other devs, but I wanted to thank whoever suggested it.
Also, the feedback on the transition is really cool, but I may have confused some of you. I was only referring to transitioning between the Station and the Starmap (and back)--not into battles. Going to battles requires a loading screen because a battle is a totally different map file (that must load)--the Station and Starmap are actually always loaded together (and on top of each other). We were just looking at clever ways to fade one out while we fade the other in. Today I actually did a test using a technique similar to the one we use to fade out the suit with the cloak--not bad. It sort of felt like all the pixels detached from the screen, swirled around, then re-settled. This technique uses a second material that is overlaid on top of all current materials. That overlay material can sort of 'intercept' the current materials' channels and make changes to them. In this case, it plays with the alpha and emissive channels.
|
|
|
CCP MC Peanut
C C P C C P Alliance
338
|
Posted - 2014.05.22 13:00:00 -
[6] - Quote
I'm a little depressed that I can't find any Kombucha to drink here in China, so I'll get my detox in by posting another daily Starmap update:
Today I hooked up the scanning effect using an event that CCP BaoZi has set up to fire in Kismet. It was super quick to get working and very easy to make adjustments on. The event fires about every frame (not sure exactly, but it is pretty frequent) and outputs 2 values: a vector and an integer. The vector is the location the scan is happening at. The integer (0-1) is how close it is to being done (then it will switch to another random point and so on until scanning completes). I adjusted the material to radiate a wave of distortion and color out from the scan point. The way it works is that the material pays attention to the 3D point--if the distance between the pixel and the point falls within a tolerance, it will blend in a different color and blend a displacement into the vertex. The cool thing is that in the Starmap almost everything inherits from one base material, so I only need to update one material with Kismet, but I can override certain parameters on the child materials so the effect can differ slightly when it runs over a star versus a planet (and so on).
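Put in plain terms: the per-frame event hands the material a scan location and a progress value, and every pixel compares itself against an expanding ring. A hypothetical stand-in (the wave speed, band width, and names are made up; progress is treated as a 0-1 float):

```python
import math

def scan_blend(pixel_pos, scan_pos, progress, wave_speed=50.0, band=4.0):
    """0-1 blend factor for the radiating scan wave. pixel_pos/scan_pos
    are 3D points; progress is the 0-1 'how close to done' value from
    the event. Pixels on the expanding wavefront get the full color and
    displacement blend, falling off over 'band' units either side."""
    dist = math.dist(pixel_pos, scan_pos)      # Euclidean distance (3.8+)
    wavefront_radius = progress * wave_speed   # ring grows with progress
    return max(0.0, 1.0 - abs(dist - wavefront_radius) / band)
```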
Also I tuned some of the material settings at different view levels to help with readability. Depending on the angle you view the Starmap from, the background can be quite busy, so we are playing with adjusting translucency, scale, and brightness based on distance. |
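That distance-based tuning can be sketched as a simple lerp of material parameters (the ranges and values below are invented placeholders; the real knobs are material settings):

```python
def lerp(a, b, t):
    return a + (b - a) * t

def readability_params(camera_dist, near=100.0, far=1000.0):
    """Background nodes get more transparent and dimmer with distance so
    the view stays readable from busy angles. Numbers are placeholders."""
    t = max(0.0, min(1.0, (camera_dist - near) / (far - near)))
    return {
        "opacity":    lerp(1.0, 0.25, t),   # fade far-away clutter
        "brightness": lerp(1.0, 0.50, t),   # dim the background
    }
```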
|
|
CCP MC Peanut
C C P C C P Alliance
356
|
Posted - 2014.05.23 00:35:00 -
[7] - Quote
Captain Crutches wrote:CCP MC Peanut wrote:The integer (0-1) is how close it is to being done (then it will switch to another random point and so on until scanning completes). I hate to sound nit-picky, but that sounds like a range in between 0 and 1, so wouldn't that make it not an integer? Unless it's just one that flips between the two like a boolean...
Yea you are right--that isn't an int; it is a float. I don't know why I wrote int. I'm just a part time programmer. |
|
|
CCP MC Peanut
C C P C C P Alliance
357
|
Posted - 2014.05.23 01:39:00 -
[8] - Quote
steadyhand amarr wrote:
CCP MC Peanut wrote: [...]
What night school course do i need to take to make this kind of thing my job :-P
Actually, Unreal Engine is very accessible and there are tons of information/tutorials/videos out there. I regularly benefit from this information. Polycount.com and the Unreal Forums seem the best. It is also cool [that it is so accessible] because you don't run into the Catch-22 of needing experience to get a job but needing a job to get experience. Mutually beneficial for all.
Also, it feels like quite a few people, based on their postings, have done the above, which is super cool. |
|
|
CCP MC Peanut
C C P C C P Alliance
378
|
Posted - 2014.05.23 14:45:00 -
[9] - Quote
Talos Alomar wrote:side note, what is your favorite baozi recipe? It seems like it must be important to you. I've always just used this one.
My favorite is the 'yet to be discovered' bacon, egg, and cheese baoZi. Please help locate. |
|
|
CCP MC Peanut
C C P C C P Alliance
421
|
Posted - 2014.05.27 01:18:00 -
[10] - Quote
Kain Spero wrote:Would it be possible to keep the current 2d star map as a view mode?
I've played Eve for 9 years and actually find the 2d star map incredibly useful (when it was released there was actually a lot of clamor in the eve community to get it added in eve as well).
This is not something we are focused on now, but that does not mean we can't add this functionality later. If we were to do it, we would also be able to benefit from chatting with the EVE guys about it and knowing in advance the tricky things we would have to deal with. |
|
|
|
CCP MC Peanut
C C P C C P Alliance
427
|
Posted - 2014.05.28 13:08:00 -
[11] - Quote
steadyhand amarr wrote:Wwhhhoooo :-D man i cant wait for legion lol simple quality of life stuff is so important and game needs to be fun at core or their is no point in its meta :-)
I look forword to seeing the finished product
We are having a lot of fun working on this feature too. Today I set up the orbit asset for BaoZi to place in solar systems. The asset itself is just a low-poly torus mesh. The tricky part (or the part that took more time than it should have) was getting a vertex shader set up that would make sure the thickness of the line is the same no matter the scale of the orbit. I ran into a snag where I didn't realize the vertex shader was taking into account the drawScale of the actor until CCP BaoZi took a 2-second glance at my screen and mentioned it. Disaster averted! Also, we are going to try to make the collision mesh of the orbit a bit larger so you don't have to struggle to line your cursor up with a 1-pixel-wide line. |
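The drawScale snag is worth a tiny worked example: if the shader pushes each vertex out by a fixed local offset, the visible ring width gets multiplied by the actor's drawScale; dividing the offset by drawScale cancels that out. A sketch (names and numbers invented; the real thing is a vertex shader, not Python):

```python
def ring_vertex_offset(edge_sign, draw_scale, thickness=0.5):
    """Local-space offset for a torus vertex (edge_sign is -1 for the
    inner edge, +1 for the outer). Dividing by draw_scale cancels the
    actor's scaling -- the snag mentioned above -- so the ring stays
    'thickness' world units wide no matter how big the orbit is."""
    return edge_sign * (thickness * 0.5) / draw_scale

def world_ring_width(draw_scale, thickness=0.5):
    """World-space width of the ring after the actor scales the mesh."""
    inner = ring_vertex_offset(-1, draw_scale, thickness) * draw_scale
    outer = ring_vertex_offset(+1, draw_scale, thickness) * draw_scale
    return outer - inner
```

Whatever the orbit's scale, the compensated width comes out constant.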
|
|
CCP MC Peanut
C C P C C P Alliance
427
|
Posted - 2014.05.28 14:05:00 -
[12] - Quote
steadyhand amarr wrote:Their is small side of me nerding out that i understood that :-P, i wish i took the gfx modul at uni but i cant draw for @&;_& :- so i thought it was a waste of time -_- but it sounds like its all maths and just playing with what artests give you. Is that a fair assesment?
Also its awsome to see devs who love their work :-P
I don't consider myself especially amazing at either visuals or math, actually. But I'm OK at both, so it's enough, I guess. When I do the shaders I use the node-based material editor (you can work in a code mode too, but in this case I like the visual editor). Working this way has actually improved my mathematical understanding of how a lot of shader stuff works--it is just really easy to set up a test case and preview it. If I'm not exactly sure how the camera world position works, I can take the node and plug it into a diffuse, then move the camera around and watch it change in real time.
This is Epic's documentation--pretty much exactly what we have available to us: http://udn.epicgames.com/Three/MaterialsCompendium.html |
|
|
CCP MC Peanut
C C P C C P Alliance
433
|
Posted - 2014.05.29 11:19:00 -
[13] - Quote
Aeon Amadi wrote:
CCP MC Peanut wrote: [...]
Not gonna lie, I love the damn noodles in Blender
I've never used Blender, actually. I started with 3D Studio Max, then learned Maya. I tend to prefer Maya mostly because it has Python available. Max is OK too, and there are some things it does much better than Maya. Recently I've been interested in Houdini because it seems to do a really good job of blending in nice procedural elements, which greatly improve efficiency. CCP Android was using it in some of the new terrain technology--maybe he can pop in here and talk about it. I'm thinking maybe I'll write a proper dev blog on Maya pipeline stuff. |
|
|
CCP MC Peanut
C C P C C P Alliance
433
|
Posted - 2014.05.30 01:19:00 -
[14] - Quote
Aeon Amadi wrote:
CCP MC Peanut wrote: [...]
You should check it out. I like Blender (mostly because it's free!!!) but it does have python in it as well. There's even presets to change the layout to emulate other programs like 3D Studio Max and Maya so it's a pretty seamless transition. It also has a built in game-engine preview window, obviously not as powerful as something like the Unreal Engine but it does allow for some round-about ideas as to the limitations of the stuff. It's early stuff and I've absolutely no experience but I did experiment with it [Blender] a bit.
Here's the results:
http://i.imgur.com/WCvzraG.png
http://youtu.be/8mx9Q2VUg84
http://youtu.be/KyQ1b1hOu_k
This bit isn't mine, but it's a short that was made by a guy named Kaleb Lechowski in 7 months using Blender, and now the guy has a feature-length movie in production. I think it shows a lot of what Blender's potential is and just how powerful it can be for a dedicated individual, especially surprising considering that it's freeware. http://youtu.be/xQuGpW0NuW8
Your videos are super cool--I love the breaking glass. I've seen that trailer before, but I didn't know they were doing a full movie (feels sort of Battlestar Galactica/Skynet-ey--not a bad thing). I'll definitely keep an open mind towards Blender. That it also has Python is a big deal. Previously I worked at a company that had their own engine, and they had no editor software (like Unreal Editor), so they relied on Maya for doing everything. With Python integration, it made it easier and faster to build tools. We even added a library system, much like the unreal content browser (but more primitive). |
|
|
|
|