I have the following problem. I have a few variables from a PLC (which controls a conveyor belt) that I want to visualize and control with the Spatial Toolbox.
I have already implemented the hardware interface with the IoTGateway, and I can see the values in the console.
I don't know how to proceed.
I want three buttons and one text label to control and visualize variables from the PLC when my conveyor belt is in view of the camera.
How do I do this? Do I link pre-existing buttons to a central node that communicates with the hardware interface?
Can I use pre-existing tools from the pocket, or can I add my own buttons and labels with my own design? If yes, how?
You can create your own tools (buttons and labels) with your own custom design.
This tutorial shows how to create a new tool from scratch. You can write your own HTML/CSS template to give the tool a custom design.
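As a rough sketch of what such a tool can look like (placeholder names throughout; the tutorial above has the authoritative structure), the tool's index.html includes object.js and talks to its node through a SpatialInterface:

<!-- hypothetical "beltButton" tool; style the button however you like -->
<script src="objectDefaultFiles/object.js"></script>
<button id="rightButton">Right</button>
<script>
    var spatialInterface = new SpatialInterface();

    // declare a node named "value" on this tool so it can be linked in AR
    spatialInterface.initNode('value', 'node', 0, 0);

    // send a 1 to the node whenever the button is tapped
    document.getElementById('rightButton').addEventListener('pointerup', function () {
        spatialInterface.write('value', 1);
    });
</script>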
To establish communication between the tools and the hardware interface, you can use the hardware interface API. This tutorial gives an overview:
In your hardware interface, you can add a read listener to read value changes from a node in a tool. For example:
server.addReadListener("feederStation", "scale", "measurement", function (flowDataObject) {
    // called whenever the "measurement" node on the "scale" tool of the "feederStation" object changes
    var value = flowDataObject.value;
    var unit = flowDataObject.unit;
});
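The counterpart for visualizing a PLC value (e.g. on your text label) is writing into a node from the hardware interface. A minimal sketch, assuming addNode/write are available as shown in the hardware interface tutorial and using hypothetical object/tool/node names:

// create a node for the label and push new PLC values into it
server.addNode("feederStation", "display", "measurement", "node");

// hypothetical callback, triggered by your IoTGateway polling/subscription code
function onPlcValueChanged(newValue) {
    server.write("feederStation", "display", "measurement", newValue);
}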
Don’t hesitate to ask any other questions you may have.
Anna
Hi Anna.
Thanks for your response. This is close to what I thought.
So in my case (two buttons), those two buttons are two separate objects (tools) in the pocket, and both have their own node name.
In the hardware interface .js file, do I add two new addReadListeners, but change the "feederStation" string to the name of the node that I declared in the .html file of the tool?
Why can't the addReadListener that is already implemented in the hardware interface file do this? Is it because an addReadListener can only listen to one node at a time?
Hi Matthias,
Yes, in your case each button or text label will be its own “tool” in the system. You can either use the pre-existing ones in the pocket, or if you’re feeling ambitious you can custom-design your own tools using the first tutorial Anna linked to.
In terms of connecting the buttons to your machine… if you have the KepServer IOT Gateway set up correctly the Vuforia Spatial Edge server’s kepware hardwareInterface should be able to automatically discover the tags from your KepServer using the GET /browse REST API. It will attempt to add a “node” for each of these to the system so that you’ll visually see a circle UI element in AR that represents each tag. And it will automatically add a “read listener” to each tag in the hardware interface so that the value of this node will equal the value of that tag on the machine. To connect to your buttons, you should be able to just look at the machine with your Spatial Toolbox app, add some buttons from the pocket, and draw a “link” from the buttons’ nodes to the machine’s nodes.
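Conceptually, that auto-discovery boils down to something like this (an illustrative sketch only, not the actual kepware interface code; tagsFromBrowse, sendValueToKepServer and the object/tool names are hypothetical placeholders):

var objectName = "conveyorBelt";   // placeholder
var toolName = "kepware";          // placeholder

tagsFromBrowse.forEach(function (tag) {
    // one AR node per discovered tag
    server.addNode(objectName, toolName, tag.name, "node");

    // node changes (e.g. arriving over a link from a button) are forwarded to the machine
    server.addReadListener(objectName, toolName, tag.name, function (data) {
        sendValueToKepServer(tag, data.value);
    });
});
// ...and tag updates read back from the gateway are pushed into the nodes via server.write()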
Unless you have some special hardware that our code doesn’t support yet, you shouldn’t need to adjust the code in the kepware index.js file.
Are you able to see the nodes appear for each of the tags on your PLC when you look at the machine with your Spatial Toolbox app?
In a previous version I was able to send and receive data to and from my PLC. It works fine.
I managed to create two buttons which send 0 or 1 respectively and are connected to the same tag via the linking option of the Spatial Toolbox. I can see the values changing on the PLC, but I can't click these buttons multiple times.
When I took a look at the implementation of the pre-installed buttons, the following question came up:
What is the difference between the realityInterface.write() and spatialInterface.write() methods?
I tried various combinations of both with little to no success.
I use my iPad in my keyboard case with screen rotation enabled and took the attached screenshot. Why is it upside down? The app does not rotate accordingly.
It is a minor inconvenience; I can deal with it.
Currently, values will only be sent to nodes across links if the value has changed from before. So if you press an On button twice in a row it probably won’t do anything. However, if you toggle a button On -> Off -> On it should send the values. Is this scenario working? If not I can try to help further troubleshoot it.
realityInterface is just a deprecated name for spatialInterface. They are equivalent and you can use either of them. We’re using spatialInterface now but some code and tutorials still need to be updated to the new naming scheme.
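In a tool, that means both of the following write calls behave the same (a minimal sketch, assuming object.js still exposes both constructor names; 'value' is a placeholder node name):

// old naming, still supported
var realityInterface = new RealityInterface();
realityInterface.write('value', 1);

// new naming, preferred going forward
var spatialInterface = new SpatialInterface();
spatialInterface.write('value', 1);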
Proper screen rotation is a bit tricky to do in an AR app, as the 2D UI elements must rotate separately from the camera background and AR content, so we currently have a half-solution that results in upside-down screenshots, etc. (if you're holding the device in that particular orientation). This is an issue we're aware of and would like to fully fix in the future.
Addition:
I have two buttons, "On" and "Off". If I press On -> Off -> On, the second On does not work.
I can only press a button a second time after I close the Spatial Toolbox app (it's not completely closed, it is still running in the background) and reopen it.
To be more specific, my application is:
Those buttons don't switch something on or off. They are two buttons that should move a conveyor belt left (0 is sent) or right (1 is sent). If the conveyor belt runs to the right and an item on it crosses a light barrier, it stops. It should continue when I press the right button again (which doesn't seem to be possible).
Starting and stopping the conveyor belt and spinning it left/right is already possible with the help of three buttons (the standard on/off button and two custom ones based on the randomColor tool, which send 0 or 1 respectively). Now I want to add the light barriers.