bigd

Runtime Level Editor Thread


Hi Karl -

I figured I'd just create a new thread around the topic of the runtime API.

I started diving in today, and the first problem I figured I'd tackle is creating a 3D translate / rotate / scale widget for manipulating ProBuilder objects at runtime, similar to how it's done in the editor.

Seems like the first step would be to detect the center of the selected face, place the 3D widget there, and handle drag events for the actual movement of vertices.

Any functions you can think of that would be helpful? PositionHandle2D looks promising...


Getting the center of a face is as easy as taking the average of all the vertex positions belonging to it. Off the top of my head, I think there are some functions in pb_Math that take a list of Vector3 and int (positions and triangles) and return the average. You could also calculate the center of the bounding box of the vertex positions.
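A minimal sketch of the bounding-box variant, using the `distinctIndices` cached on `pb_Face` as in the runtime examples (this is an illustration, not ProBuilder source; positions are in local space):

```csharp
// Sketch: center of the bounding box of one face's vertices.
// Assumes pb_Object / pb_Face from the ProBuilder runtime API.
private Vector3 FaceBoundsCenter(pb_Object pb, pb_Face face)
{
	Vector3[] vertices = pb.vertices;
	Vector3 min = vertices[face.distinctIndices[0]];
	Vector3 max = min;

	// Grow the box to enclose every distinct vertex of the face.
	foreach (int index in face.distinctIndices)
	{
		min = Vector3.Min(min, vertices[index]);
		max = Vector3.Max(max, vertices[index]);
	}

	// Bounding-box center, in local space.
	return (min + max) * 0.5f;
}
```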


Hi,

I found this in the "HighlightNearestFace" example : 

/**
	 *	Returns the average of each vertex position in a face.  
	 *	In local space.
	 */

	private Vector3 FaceCenter(pb_Object pb, pb_Face face)
	{
		Vector3[] vertices = pb.vertices;

		Vector3 average = Vector3.zero;

		// face holds triangle data.  distinctIndices is a 
		// cached collection of the distinct indices that 
		// make up the triangles. Ex:
		// tris = {0, 1, 2, 2, 3, 0}
		// distinct indices = {0, 1, 2, 3}
		foreach(int index in face.distinctIndices)
		{
			average.x += vertices[index].x;
			average.y += vertices[index].y;
			average.z += vertices[index].z;
		}

		float len = (float) face.distinctIndices.Length;

		average.x /= len;
		average.y /= len;
		average.z /= len;

		return average;
	}

I also found another topic using this function; I hope it can help:

pb_Math.BoundsCenter(vertexOrigins);

Also, do you know if there is an easy way of getting the positions of the two vertices in pb.SelectedEdges[0]? (I am trying to get the center of the edge, not of the face.)

And lastly, how do you get the position of the selected vertex? Is there an equivalent of pb.SelectedFace and pb.SelectedEdges, but for vertices?


To get the vertices associated with an edge just look them up in the vertices array. 

Ex:

Vector3 a = pb.vertices[pb.SelectedEdges[0].x];
Vector3 b = pb.vertices[pb.SelectedEdges[0].y];
Vector3 center = (a + b) * .5f;


Hi Karl - 

Trying not to take up too much of your time answering small questions, so I appreciate you being my training wheels until I get the hang of the code base a little more. If I get things working right, I'm considering making my runtime code public (with you guys' permission, of course) to hopefully give other peeps more functionality than what currently exists in the example.

Played around with flipping the object normals today and noticed something strange, see below.

 

For the most part, I'm using:        

pbTriangleOps.ReverseWindingOrder(currentSelection.pb, currentSelection.pb.faces);
currentSelection.pb.ToMesh();
currentSelection.pb.Refresh();

Any idea why I'm getting a triangle selection (instead of face selection) after I flipped the normals?

Also, I'm noticing on some videos online, there is a transparent texture applied to the walls that have the normals facing away from the camera. (see below). How is this done?

 

Quote

If I get things working right, I'm considering making my runtime code public (with your guys' permission of course)

That'd be great!  As long as you don't publish any ProBuilder source code that would be totally fine.  Just list ProBuilder Basic (or Advanced if necessary) as a dependency.

On why faces are selecting as triangles after flipping, it's hard to tell without seeing the code you're using to generate the highlights.

On the transparent walls, that's just the faces' backs getting culled - unless I'm missing something?

 

Edit - I also found some code I wrote for a talk about writing 3d modeling tools.  It's incomplete, but you may find some of the code useful: https://github.com/karl-/simple_modeler/tree/master/Assets/Modeler/Code

7 hours ago, karl said:

Edit - I also found some code I wrote for a talk about writing 3d modeling tools.  It's incomplete, but you may find some of the code useful: https://github.com/karl-/simple_modeler/tree/master/Assets/Modeler/Code

Good stuff! I'll crawl through this, thanks!

7 hours ago, karl said:

On why faces are selecting as triangles after flipping, it's hard to tell without seeing the code you're using to generate the highlights.

I'm pretty much just using the example Runtime API code, so: FaceCheck(Vector3 pos) and RefreshSelectedFacePreview() to generate the selected face highlight. I don't actually use these functions when I flip the normals on the faces, they're just called whenever I click the mouse on a face, if that makes sense. Any idea why they would only create the highlight for one triangle instead of the entire face after flipping the normals?

7 hours ago, karl said:

On the transparent walls, that's just the faces backs getting culled - unless I'm missing something?

Playing with this more, I think I asked the wrong question. It looks like when I use the editor version, there's a bounding box that is present around all the faces. See pic below, this doesn't show up when using probuilder at runtime, any idea how this is enabled?

http://imgur.com/H5Fh4UW

Thanks, 


Ah okay, that's a custom wireframe I wrote for the editor.  I actually wrote a blog post about how I did that over here: http://parabox.co/blog/blog_entry.html?2015-01-28=wireframe.html

As for the bad face highlight, that can be fixed by not assuming the distinct indices == vertex order.

		void RefreshSelectedFacePreview()
		{
			// Copy the currently selected vertices in world space.
			// World space so that we don't have to apply transforms
			// to match the current selection.
			Vector3[] verts = currentSelection.pb.VerticesInWorldSpace(currentSelection.face.indices);

			// face.indices == triangles, so wind the face to match
			int[] indices = new int[verts.Length];
			for(int i = 0; i < indices.Length; i++)
				indices[i] = i;

			// Now go through and nudge the verts we just grabbed out about .01m from the original face.
			Vector3 normal = pb_Math.Normal(verts);

			for(int i = 0; i < verts.Length; i++)
				verts[i] += normal.normalized * .01f;

			if(preview)
				Destroy(preview.gameObject);

			preview = pb_Object.CreateInstanceWithVerticesFaces(verts, new pb_Face[] { new pb_Face(indices) });
			preview.SetFaceMaterial(preview.faces, previewMaterial);
			preview.ToMesh();
			preview.Refresh();
		}


Hey, bigd!

I gotta say that your runtime ProBuilder looks pretty impressive so far, and I'm interested to see your take on it! :) I'm specifically interested in how you did the movement with the axis, like how the block extrudes along the axis. Are you using TranslateVertices or something else? I'm also in the process of making something like this for a game, and mine just flies off into space whenever I move it using TranslateVertices. :P

Would you mind sharing some info on how you went about yours?


Hey there!

Yup, you got it, TranslateVertices() all the way. The difference between what I did and what's already in the runtime example is that I use a variable instead of the hard-coded floats (1f and -1f) that are in there currently. Like so:

currentSelection.pb.TranslateVertices(currentSelection.face.distinctIndices, localNormal.normalized * distance);

To calculate that distance variable, I take the magnitude of the mouse delta along the axis the player is dragging. Like so:

distance = ExtVector3.MagnitudeInDirection(mousePosition - previousMousePosition, projectedAxis) * moveSpeedMultiplier;
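(For reference: ExtVector3 is a little helper class of my own, not a ProBuilder API. A minimal sketch of what `MagnitudeInDirection` does, under that assumption, is just a signed projection:)

```csharp
// Sketch of an assumed helper (not ProBuilder API): how far
// 'vector' travels along 'direction', signed.
public static class ExtVector3
{
	public static float MagnitudeInDirection(Vector3 vector, Vector3 direction)
	{
		// Dot with the normalized direction gives the signed length
		// of the component of 'vector' along 'direction'.
		return Vector3.Dot(vector, direction.normalized);
	}
}
```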

Once I get things somewhat working overall, I'll be happy to post what I have for you to mentally digest. I have a few other functions I'd like to get working (and tons of cleanup). Hope this helps!


Hmm... actually updating the pb object with my axis turned out really tricky for me.

I've basically got three callbacks to do this in: OnBeginDrag (called when I click one of the axis arrows), OnDrag (called every frame until I release the mouse button), and OnStopDrag (called when I release the button). My logic is to update in OnDrag, where I get a new position (where the axis currently is), but everything I try there gives very weird results (the vertices all fly off into space). I don't really know the math for making the face vertices match up with the position of the axis. If you, or anyone else, happens to know a way I could do this, it would be appreciated. But if not, I'll wait for you, bigd, to post your stuff and I'll take a good look at that. :)


@Hertzole Yeah - it's sort of hard for me to advise without seeing exactly what your script is doing. It should be as easy as just translating the vertices of the selected face in the X, Y, or Z direction by some amount. I say "easy," but really, it's not. The math is sometimes very difficult to visualize through the code to understand exactly what's going on. Your logic seems sound; my advice would be to add some debug statements to see what's causing it to act so violently.

@karl Question for you, sir. After going cross-eyed trying to make sense of how to grab the edges, I can draw the lines of the edges, but it destroys the original mesh. Looking through the blog, it looks like you actually have two meshes showing at the same time: one that has the faces, and another (hidden) one that shows the modified wireframe. Am I understanding that correctly?

That sort of blows my mind a bit, but it makes sense. Looking forward, is it the case that all object manipulation (vertices, edges) is selected and manipulated using the wireframe model and then sent to the mesh model (with the faces)? I'm beginning to think I should have jumped at the chance to buy ProBuilder when it was on sale so I could see exactly how you're doing this, instead of reinventing the wheel. :unsure:

Quote

Looking through the blog, it looks like you actually have two meshes showing at the same time, one that has the faces, and another (hidden) one that shows the modified wireframe. Am I understanding that correctly? 

Yes, that's correct.  The actual mesh is never modified, and two additional meshes are rendered "on top" of it.  The edge and face highlight meshes are rendered using a shader that pulls the vertices slightly closer to the camera in clip space so as to avoid z-fighting.

Quote

Looking forward, is it the case where all object manipulation (vertexes, edges) are selected and manipulated using the wireframe model and then sent to the mesh model (with the faces)?

Not quite that far - the edge and face highlight meshes are just used to show the user where things are.  The vertices and triangles that are used for operations are still pulled from the mesh.


@karl Still plugging away, making great progress actually, but a long way to go. Your other posts on this forum have been pretty helpful. I was curious how you tackled face rotation and scaling. I'm guessing your functions for this are tied pretty closely to the editor rotation and scale handles, but I was wondering if there are any functions that rotate the vertices nicely around a point?


There aren't any that I remember off-hand, but the gist of it is:

- Get handle rotation delta ( Inverse(previousHandleRotation) * currentHandleRotation )

- Shift selected vertices to local space (subtract each position by the average of all vertices)

- Multiply vertices by delta

- Shift vertices back
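The steps above can be sketched like this (an illustration, not ProBuilder source; `SetVertices` stands in for whatever write-back your version exposes, and `indices` is assumed to be a distinct list of selected vertex indices):

```csharp
// Sketch: apply a handle-rotation delta to the selected vertices,
// pivoting around their average position (local space).
// SetVertices is an assumption; substitute your version's setter.
void RotateSelection(pb_Object pb, int[] indices,
                     Quaternion previousHandleRotation, Quaternion currentHandleRotation)
{
	Vector3[] vertices = pb.vertices;

	// 1) Handle rotation delta.
	Quaternion delta = Quaternion.Inverse(previousHandleRotation) * currentHandleRotation;

	// 2) Average of the selected vertex positions (the pivot).
	Vector3 center = Vector3.zero;
	foreach (int i in indices)
		center += vertices[i];
	center /= indices.Length;

	// 3-4) Shift to the pivot, rotate, shift back.
	foreach (int i in indices)
		vertices[i] = delta * (vertices[i] - center) + center;

	pb.SetVertices(vertices);
	pb.ToMesh();
	pb.Refresh();
}
```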


Hi bigd, as Hertzole said, your runtime mesh editor looks very promising.

My project is a physical 3D modeling tool, and I would like to simulate 3D-modeling software in Unity. Your project looks really useful to me, because I am struggling at each step to adapt editor-mode solutions (in order to have ProBuilder, the handles, etc.).

Do you think you will be able to create basic modeling functions (such as extrusion)?

Have you decided whether you will publish your code? If so... can I ask when?


@Nyrow Face extrusion and translation, wireframe, adding a material to a face, and flipping object normals are currently implemented. I'm currently trying to figure out face scaling, edge selection/translation, whole-object translation/rotation/scaling/copy/paste, and object serialization. After that, I'm not sure I'll go much further, even though that's just scratching the surface of what ProBuilder can do.

In terms of when, I think I'm still a few weeks away from having something ready to publish. My code right now is full of comments and debug statements, and wouldn't be too helpful to anyone in its current state.

Seems like there are quite a few people like yourself interested in this; kind of makes me think I might need to set up a donation fund so I can purchase an actual license. :lol:


@karl Question for you sir, when you have a minute.

Trying to figure out how to snap faces to the nearest X, Y, or Z whole number depending on which axis you are pulling, and having a little trouble.

See video below. Ideally, after freeforming, I'd like the face to snap to the nearest whole number. Right now, it just increases the size by one, regardless of the starting value.

 

Here's what I have so far in terms of code. 

        private void MoveFaces(float distance, Vector3 axis)
        {

            Vector3[] verts = currentSelection.pb.VerticesInWorldSpace(currentSelection.face.indices);

            for (int i = 0; i < verts.Length; i++)
            {
                if (selectedAxis == Axis.X)
                    verts[i].x = Mathf.Round(verts[i].x * distance);
                if (selectedAxis == Axis.Y)
                    verts[i].y = Mathf.Round(verts[i].y * distance);
                if (selectedAxis == Axis.Z)
                    verts[i].z = Mathf.Round(verts[i].z * distance);

                Debug.Log(verts[i].x + " " + verts[i].y + " " + verts[i].z + selectedAxis.ToString());
            }
            currentSelection.pb.TranslateVertices(currentSelection.face.distinctIndices, axis.normalized * distance);
        }

I think the problem is that I'm not setting the vertices of the mesh after I move them in the for loop. Any suggestions? Thanks.


Hi, first time posting.

 

I was wondering if anyone knows how to scale multiple faces around their centre point using the runtime API. I can get the desired effect in the Unity editor (see image), but I need to achieve this at runtime. Any help would be greatly appreciated.

[GIF attachment: Runtime Api scaling on Center.gif]


@IAmMoore08 Hey there. While I haven't done multi-face scaling, I have been able to do single-face scaling.

Here's what I believe the basics would be; you'll have to figure out the specifics, because it'll depend on how your real-time transform gizmo works and how you integrate with it.

Placing your transform in the center of all selected faces

1) Get the average of each selected face. This nifty function should be able to help with that; for each face, you should end up with one Vector3 that's its average.

private Vector3 FaceCenter(pb_Object pb, pb_Face face)

2) Get the average of all your face averages, place your scaling transform there.

Scaling selected faces

3) Now that your transform is placed, you need to detect how much the user wants to scale by, and along which axis. (I use a Vector3 and store the axis in the direction and the scale amount in the magnitude.)

4) Iterate through all the distinct vertices of your selected faces and scale based on the scale amount. Set the vertices of the mesh afterwards.

5) ???

6) Profit.

That's pretty much the basics. It's not easy, although @karl might have a cleaner way of doing this. Hope this helps!
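A rough sketch of steps 1-4 in code, under a few assumptions: the scale is a single uniform factor, `FaceCenter` is the average-of-a-face helper quoted earlier in the thread, and `SetVertices` stands in for whatever write-back your version exposes:

```csharp
// Sketch: scale the distinct vertices of all selected faces around
// the average of the face centers (local space).
// Needs: using System.Collections.Generic;
void ScaleFaces(pb_Object pb, pb_Face[] faces, float scale)
{
	Vector3[] vertices = pb.vertices;

	// 1-2) Average of all the face centers -> pivot.
	Vector3 center = Vector3.zero;
	foreach (pb_Face face in faces)
		center += FaceCenter(pb, face);
	center /= faces.Length;

	// Collect each distinct vertex index once, so vertices
	// shared between selected faces aren't scaled twice.
	var distinct = new HashSet<int>();
	foreach (pb_Face face in faces)
		foreach (int i in face.distinctIndices)
			distinct.Add(i);

	// 4) Scale each vertex about the pivot, then write back.
	foreach (int i in distinct)
		vertices[i] = center + (vertices[i] - center) * scale;

	pb.SetVertices(vertices);  // assumed setter
	pb.ToMesh();
	pb.Refresh();
}
```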


@bigd there's actually a method in pb_Object_Utility that will handle snapping to world coordinates for you:


		/**
		 *	Move vertices in world space.
		 *	pb - The pb_Object target.
		 *	selectedTriangles - A distinct list of vertex indices.
		 *	offset - The direction and magnitude to translate selectedTriangles, in world space.
		 *	snapValue - If > 0 snap each vertex to the nearest on-grid point in world space.
		 *	snapAxisOnly - If true vertices will only be snapped along the active axis.
		 *	lookup - A shared index lookup table.  Can pass NULL to have this automatically calculated.
		 */	
		public static void TranslateVertices_World(this pb_Object pb, int[] selectedTriangles, Vector3 offset, float snapValue, bool snapAxisOnly, Dictionary<int, int> lookup)
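So, for a 1-unit world grid, a call could look something like this (a sketch based on the doc comment above; the argument values are assumptions, and `currentSelection` / `axis` / `distance` are the names from the MoveFaces snippet earlier in the thread):

```csharp
// Sketch: translate the selected face's vertices along 'axis' and
// snap each vertex to the nearest whole-unit world grid point.
currentSelection.pb.TranslateVertices_World(
	currentSelection.face.distinctIndices,
	axis.normalized * distance,
	1f,      // snapValue: 1 -> nearest whole number
	true,    // snapAxisOnly: only snap along the active axis
	null);   // lookup: null -> calculated automatically

currentSelection.pb.ToMesh();
currentSelection.pb.Refresh();
```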

