Introduction
Our team consists of two members:
My name is Sertaç, and I am a Gameplay Programmer. I have been developing projects with Unreal Engine for about 3.5 years. Although my job title is Gameplay Programmer, I also enjoy UI programming. I have worked, and still work, on several projects that have been published on Steam.
I also organize the Unreal Engine meet-ups in Turkey; I organized numerous events in the past year and continue to organize them. Alongside the events, I am thinking about organizing seminars on game development with universities and high schools. There are many people in Turkey who want to develop games, and I enjoy passing on the experience I have gained, because knowledge grows as it is shared!
I am also among the community contributors to Unreal Engine version 4.18. Of course, being even a tiny piece of such an amazing engine is a source of happiness and pride.
My name is Kemal, and I am a 3D Artist. I have worked in the film and game industries in different areas for almost 10 years, mostly in modeling and lighting, with a stronger focus on lighting for the past 4 years. I have been using Unreal Engine for 3.5 years. In that time I have had the opportunity to work on different game projects and on many short and feature-length films. I try to combine what I learned from cinema with what I have learned while developing games. Whenever possible, I share what I learn as tutorials on my ArtStation and YouTube pages and try to help other developers.
I also try to answer questions by taking part in seminars, webinars, and events organized at universities.
About the Project
<iframe title="1 Minute on the Terminals" src="//www.youtube.com/embed/SRoe-P_5zoM?enablejsapi=1&origin=https%3A%2F%2Fwww.gamedeveloper.com" height="360px" width="100%" data-testid="iframe" loading="lazy" scrolling="auto" class="optanon-category-C0004 ot-vscat-C0004 " data-gtm-yt-inspected-91172384_163="true" id="117347888" data-gtm-yt-inspected-91172384_165="true" data-gtm-yt-inspected-113="true"></iframe>
With this article, we are trying to explain how the terminal and puzzle systems, the core mechanic of TARTARUS, work. We think it could be useful both for helping other developers and for showing what Unreal Engine 4 and the Blueprint system can do. We will try to discuss as many of the problems and solutions we encountered as possible. But before we get to the terminal and puzzle systems, we want to briefly introduce the project.
TARTARUS is fundamentally a typing-based, first-person game. Your task is to save both yourself and your ship, whose systems are malfunctioning as it falls toward the planet Neptune. To do this, you use the "Terminal" that we will describe below, along with some mechanical tools. We released the game, developed with Unreal Engine 4, on November 22.
We designed and implemented the Terminal System as described below based on our own needs, but it is fundamentally applicable to any project. The rest depends on how creative you are.
Programming the Interface
Before anything else, I want to explain how the process works logically. In its simplest form, it is built on taking input from the user and processing it. We created a Widget Blueprint specifically for this. The most important part here is that we have to make some adjustments to be able to receive user input; otherwise, the OnKeyDown function would not capture the keystrokes.
After making the necessary adjustments and giving focus to the interface, we can now capture the input coming from the user. The key point here is that we obtain the name of each key the user presses via the Key Get Display Name function; the following steps build on these names.
We can now receive input from the user and record which keys were pressed. Dividing the problem into parts like this is genuinely important, and we have successfully completed the first part of our problem.
In later steps, we will process the inputs we captured and build a decision mechanism on top of them.
First, we must accumulate the input we captured so that we can process it. We append each pressed key to a String variable created for this purpose.
With this, we have accumulated the input coming from the user. But there is a small problem at this point: because Key Get Display Name returns the names of the keys, it gives us values like Space, Comma, Period, and Hyphen. We therefore have to run such keys through a conversion step and turn them into the characters we actually want.
We created a function for this that returns the symbol corresponding to the pressed key instead of the key's name, and then integrated it into the function we had already written.
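The logic above can be sketched in plain C++. This is an illustrative stand-in for the Blueprint function, not the shipped code; the function names, the map contents, and the String-append step are our assumptions about how such a conversion would look.

```cpp
#include <string>
#include <unordered_map>

// Hypothetical sketch: map UE4 key display names (as returned by
// Key Get Display Name) to the character we actually want to append.
std::string KeyNameToSymbol(const std::string& KeyName) {
    static const std::unordered_map<std::string, std::string> Symbols = {
        {"Space",  " "},
        {"Comma",  ","},
        {"Period", "."},
        {"Hyphen", "-"},
    };
    auto It = Symbols.find(KeyName);
    if (It != Symbols.end()) {
        return It->second;  // special key: return its symbol
    }
    return KeyName;         // letters/digits: the display name is the character
}

// Each accepted key is appended to the command line being typed,
// mirroring the String variable used in the Blueprint.
void AppendKey(std::string& CommandLine, const std::string& KeyName) {
    CommandLine += KeyNameToSymbol(KeyName);
}
```

Keeping the conversion in one function means new special keys only need a new map entry, which matches the spirit of isolating this step in its own Blueprint function.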
The next step is to process the commands we receive according to the standards we defined and to determine the flow accordingly. Since these standards will differ depending on the system you want to build, I will briefly describe the ones we used for TARTARUS.
In the terminals we developed for TARTARUS, we had commands defined in the system, along with parameters for those commands. We identified these and carried out the necessary operations based on the command and parameters that came back.
We carry out these operations the moment the user presses the Enter key, that is, the moment they confirm the command they have written. We then split the confirmed command according to the standards we specified.
Some commands can be used without parameters. We detected this mainly by checking whether the entered command contained a space, and then built a decision mechanism around the two cases.
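As a sketch, the Enter-key handling described above could look like the following in plain C++: the confirmed line is split at the first space into a command and, optionally, a parameter string. The struct and names are illustrative, not the actual Blueprint.

```cpp
#include <cstddef>
#include <string>

// Illustrative result of parsing one confirmed terminal line.
struct ParsedCommand {
    std::string Command;
    std::string Parameter;      // empty when the command takes no parameter
    bool bHasParameter = false;
};

ParsedCommand ParseCommandLine(const std::string& Line) {
    ParsedCommand Result;
    const std::size_t SpacePos = Line.find(' ');
    if (SpacePos == std::string::npos) {
        // No space: a parameterless command.
        Result.Command = Line;
    } else {
        // Split at the first space: command on the left, parameter on the right.
        Result.Command = Line.substr(0, SpacePos);
        Result.Parameter = Line.substr(SpacePos + 1);
        Result.bHasParameter = true;
    }
    return Result;
}
```

The two branches of this function correspond directly to the two situations the decision mechanism handles: commands with and without parameters.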
Like I said at the start of the article, the section I explained above can vary based on the system you want to create. You should therefore specify your own standards.
We will design the interface in the next step and make the system functional. But I first want to explain the working logic of the system.
The system consists of two Widget Blueprints: the system interface and the command lines. The first is the MainWidget, which holds the background and the visual design. The second is the ItemWidget, in which we write the commands shown above. The command lines, the ItemWidgets, sit in a ScrollBox on the MainWidget, and a new one is added the moment Enter is pressed.
What we need to do after finishing the design phase is to process the command when Enter is pressed and then add a new ItemWidget to the ScrollBox.
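The MainWidget/ItemWidget relationship can be modeled as a simple line history: the ScrollBox is a growing list of committed lines, and pressing Enter appends a new one and clears the input. The class below is a stand-in for the two widgets, not UE4 API.

```cpp
#include <string>
#include <vector>

// Illustrative model: History plays the role of the ScrollBox full of
// ItemWidgets, and CurrentLine is the command line being typed.
class TerminalScreen {
public:
    void TypeText(const std::string& Text) { CurrentLine += Text; }

    void OnEnterPressed() {
        History.push_back(CurrentLine);  // add a new "ItemWidget" to the "ScrollBox"
        CurrentLine.clear();             // start a fresh command line
    }

    std::string CurrentLine;
    std::vector<std::string> History;
};
```

Separating the committed history from the line being typed keeps the Enter handler trivial, which mirrors how the two Widget Blueprints divide the work.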
This is how the system works logically. After this phase, we will project our two-dimensional interface onto a model in the scene.
Material
We wanted the system we designed to be freely adjustable. That way we would not have to rethink it for each "Terminal," the creative process would be more comfortable, and we would run into far fewer potential issues. First of all, we needed a special material, and before that we needed to settle on a style suitable for our universe. Lo-fi sci-fi became the key for us. We wanted the CRT screens of the Terminals to look bright, lively, and nostalgic. We will try to explain how we created this in a few steps.
Above you saw the generic material that we used for all Terminals. We wanted to keep it as simple and understandable as possible. Let's look at it in a little more detail.
First, we need to know the visible limits of our Terminals. The Capture Camera we placed in the scene should constantly watch a surface and display the user's input coming from it, wherever it sits in the game's system hierarchy, with minimum delay and maximum performance. To enable this, we created a Canvas Render Target and assigned it to the camera's Texture Target slot. We then began building our "Terminal" materials using this texture.
With this quite simple but effective approach, we started to answer many questions: what limits should be visible on screen, what our character limits are, how much resolution we need, and how to improve performance. We then finished our materials by adding details.
The scratches on the glass screens of the terminals were achieved with a Roughness Map, as in the example. (Image source: Google)
Another important element is the texture needed for the "Scanline" effect, which creates the illusion of the screen being scanned. Thanks to this texture, besides getting a more aesthetic image that breaks the monotony, we think we captured the retro mood.
You can comfortably prepare your design drafts in whichever 2D program you use. The brighter area toward the top is what actually creates the scan effect. We will look at how it works a little later.
As you can see, we use three basic textures. Now let's look at how and why we control these materials. We particularly needed to control the on-screen scale of the Roughness and Scanline textures: they should be neither too big nor too small. We also had to overcome a fake "ghosting" effect that appears especially where many design lines concentrate, such as text disappearing late or becoming jagged. And we did not want to touch the texture connected to the Roughness value while doing this, so we went the way of splitting it into two sections.
The section you see at the upper left is where we control how big the scratches on the "Terminal" screens will be. Using a Texture Coordinate node multiplied by a Scalar Parameter, we tile the texture and get the effect we want. At the upper right, we tile the "Scanline" texture with the same nodes to increase or decrease the number of lines, and with the Panner node we control on which axis it moves, how far, and how fast. Another important point is that the Terminal screens are made of glass. But instead of the Translucent or Masked material modes that give the classic see-through glass effect, we wanted to use Opaque and provide the glossiness via the Roughness value. Its setup is as you see below.
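The math those nodes perform can be sketched outside the engine. In the snippet below, multiplying the texture coordinate by a scalar tiles the texture, and the Panner's offset grows with time; the function name, the wrap into [0, 1), and the parameter names are our illustration of what the standard UE4 nodes compute, not engine code.

```cpp
#include <cmath>

// A 2D texture coordinate, as produced by the Texture Coordinate node.
struct UV { float U, V; };

// Sketch of TextureCoordinate * ScalarParameter (tiling) followed by a
// Panner-style time-based offset, with UVs wrapped as a repeating texture.
UV TileAndPan(UV Coord, float Tiling, float SpeedU, float SpeedV, float Time) {
    float U = Coord.U * Tiling + SpeedU * Time;  // Panner: offset grows with time
    float V = Coord.V * Tiling + SpeedV * Time;
    // Sampling a repeating texture wraps UVs into [0, 1).
    U -= std::floor(U);
    V -= std::floor(V);
    return {U, V};
}
```

Raising the tiling factor is what packs more scanlines onto the screen, and the speed parameters play the role of the Panner's per-axis speeds.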