Nathan Naerts
Belgium
Joined Dec 4, 2011
Real-time visualisation of the orientation of my smartphone in Blender
Each smartphone has multiple internal sensors that detect how the phone is used in its environment. I used these sensors to control the orientation of an object in UPBGE, a forked version of Blender.
The object's orientation is updated in real time through a TCP client-server connection between a Python script and an app on my phone.
files and code: github.com/NNaert/Phone_IMU_Blender_object_control
SensorStreamer: play.google.com/store/apps/details?id=cz.honzamrazek.sensorstreamer&hl=en&gl=US
UPBGE: upbge.org/#/
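The TCP link described above can be sketched as a small Python server that parses newline-delimited JSON packets from the phone. This is a minimal sketch, not the repository's actual code: the packet layout ({"rotationVector": ...}) and the port are assumptions for illustration, since the real format depends on how SensorStreamer is configured.

```python
import json
import socket

def parse_orientation(line: bytes):
    """Parse one newline-delimited JSON packet into an (x, y, z) tuple.

    The "rotationVector" key is a hypothetical example; the actual key
    depends on the SensorStreamer packet configuration.
    """
    data = json.loads(line)
    return tuple(data["rotationVector"])

def serve(host="0.0.0.0", port=8080):
    # A blocking accept() is shown only as a minimal sketch; a real UPBGE
    # setup would use a non-blocking socket so the game loop keeps running.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((host, port))
        srv.listen(1)
        conn, _addr = srv.accept()
        buf = b""
        while True:
            chunk = conn.recv(1024)
            if not chunk:
                break
            buf += chunk
            # Packets are assumed newline-delimited; parse each full line.
            while b"\n" in buf:
                line, buf = buf.split(b"\n", 1)
                x, y, z = parse_orientation(line)
                # Apply to the UPBGE object here, e.g.:
                # obj.worldOrientation = mathutils.Euler((x, y, z)).to_matrix()
```

In UPBGE the script would run inside the game engine and write the parsed angles to the object's world orientation each tick.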
Views: 492
Video
Improving my home with 3 simple automations!
230 views, 1 year ago
All of the following automations are triggered by the power consumption measurements of certain relevant electrical circuits. 1: Automatically start playing music on the wifi speaker in the kitchen when somebody starts cooking. The sudden power surge/drop produced by the hotplate or oven on the electrical circuit will control the speaker. 2: Count the number of coffees I drink during the day. T...
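A power-triggered automation like the cooking example can be sketched in Home Assistant YAML. The entity ids and the 1000 W threshold are hypothetical placeholders, not the actual configuration from the video:

```yaml
# Sketch: start the kitchen speaker when the hotplate circuit draws power.
# Entity ids and the threshold are illustrative assumptions.
automation:
  - alias: "Start kitchen music when cooking begins"
    trigger:
      - platform: numeric_state
        entity_id: sensor.kitchen_power
        above: 1000        # W; tune to the hotplate's idle vs. active draw
        for: "00:00:30"    # debounce short spikes
    action:
      - service: media_player.media_play
        target:
          entity_id: media_player.kitchen_speaker
```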
Reducing energy costs with Home Assistant!
854 views, 1 year ago
In this video I explain how I use Home Assistant to turn off our heating system when nobody is present in our home. This smart heating solution will hopefully reduce our heating energy costs by 20%. But only time will tell, and I will make an update in a couple of months. The presence detection looks for connected devices (phone, laptop, Sonos) on the local network to determine when somebody ...
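The presence-based heating logic described above can be sketched as a Home Assistant automation. The group and climate entity names are illustrative assumptions; a real setup would group the network-based device trackers for each phone or laptop:

```yaml
# Sketch: switch heating off once every tracked device has left the network.
# group.household_devices and climate.heating are hypothetical entity ids.
automation:
  - alias: "Turn heating off when nobody is home"
    trigger:
      - platform: state
        entity_id: group.household_devices
        to: "not_home"
        for: "00:15:00"   # avoid reacting to brief wifi dropouts
    action:
      - service: climate.turn_off
        target:
          entity_id: climate.heating
```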
Energy monitoring system with Home Assistant
17K views, 1 year ago
Monitoring the energy consumption of our home with Shelly devices (1PM and 3EM). The goal of this monitoring system is reducing/optimizing the energy consumption of our home. The sensor data is gathered and pre-processed by Home Assistant (HASS). Dashboards and visualizations with measured entities were created on the HASS platform. The Shelly devices can be used as a connected relay switch to ...
Optical pick and place robot arm with ArUco markers: part 2
12K views, 2 years ago
I extended the functionality of my Python-controlled robot arm. With the use of ArUco markers, I was able to eliminate manually entering the coordinates of the object's position. The pick and place robot arm now works based on optical feedback from my phone camera. Check out the first video on how I controlled the robot arm with Python. czcams.com/video/6IKAJicbArA/video.html&ab_channel=NatanNaert ...
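The core idea behind using an ArUco marker for coordinates is that a marker of known physical size gives the pixels-to-millimetres scale of the table plane. A minimal sketch of that conversion, assuming a roughly top-down camera; the function and values are illustrative, not the repository's actual code:

```python
def pixel_to_world(px, py, marker_px, marker_mm, origin_px=(0.0, 0.0)):
    """Convert an image pixel coordinate to millimetres in the table plane.

    marker_px is the detected marker's edge length in pixels (as returned
    by e.g. OpenCV's ArUco detector); marker_mm is its real edge length.
    Assumes the camera looks straight down, so one scale factor suffices.
    """
    scale = marker_mm / marker_px          # mm per pixel
    return ((px - origin_px[0]) * scale,
            (py - origin_px[1]) * scale)
```

With the marker also serving as the workspace origin, the detected object center in pixels maps directly to robot coordinates.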
Python controlled robot arm: part 1
59K views, 2 years ago
With the help of inverse kinematics, I was able to make a simple pick and place machine. In this version of the software, it's still necessary to define the coordinates of the piece of foam manually. I am planning to eliminate this and use video images to define the exact location. Code: github.com/NNaert/Python-controlled-Braccio-robot-arm Tinkerkit Braccio robot arm: store-usa.arduino.cc/prod...
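For a planar two-link arm, the inverse kinematics mentioned above reduce to the law of cosines. This is a generic textbook sketch, not the repository's code; link lengths and the elbow-down convention are illustrative, and the Braccio has more joints than this:

```python
import math

def ik_2link(x, y, l1, l2):
    """Inverse kinematics for a planar 2-link arm (elbow-down solution).

    Returns (shoulder, elbow) joint angles in radians for a target (x, y),
    given link lengths l1 and l2.
    """
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle from the target distance.
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    # Shoulder angle: direction to target minus the wedge the elbow adds.
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

A quick sanity check is to run forward kinematics on the returned angles and confirm the original target comes back.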
Cable robot with Odrive motor controller and Mediapipe: part 3
684 views, 3 years ago
The previously made cable robot can now be controlled through hand gestures. The hand position is tracked with Mediapipe through a python script. The relative hand position is then mapped to the position of the end-effector. The cable robot is made with an Odrive motor controller.
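Mapping the relative hand position to the end-effector is essentially a linear rescale from Mediapipe's normalized [0, 1] landmark coordinates to the robot's workspace. A sketch with hypothetical workspace limits (the video does not publish its actual numbers):

```python
def remap(value, src_min, src_max, dst_min, dst_max):
    """Linearly map value from [src_min, src_max] to [dst_min, dst_max]."""
    t = (value - src_min) / (src_max - src_min)
    return dst_min + t * (dst_max - dst_min)

def hand_to_effector(hand_x, hand_y):
    """Map a normalized Mediapipe hand coordinate to effector millimetres.

    Mediapipe returns landmarks normalized to [0, 1] in image coordinates.
    The workspace limits below are illustrative placeholders.
    """
    x = remap(hand_x, 0.0, 1.0, -500.0, 500.0)   # mm, left-right
    y = remap(hand_y, 0.0, 1.0, 0.0, 800.0)      # mm; image y grows downward
    return x, y
```

In practice the mapped target would then be fed to the cable-length solver that positions the end-effector.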
Cable robot with Odrive motor controller: part 2
512 views, 3 years ago
The second side of the robot is now finished. 24V Odrive motor controller with 1.9 kW brushless DC motors. For position, velocity, and torque control, a Python script was used.
Cable robot with Odrive motor controller: part 1
896 views, 3 years ago
First side of a 2-DOF cable-driven robot. 24V Odrive motor controller with a 1.9 kW brushless DC motor. For position, velocity, and torque control, a Python script was used.
Fixie compilation: top 30 fixies
5K views, 11 years ago
This is a compilation of 30 fixies; comment which fixie you find the nicest! Pictures: from Google. Music: Document One - Stay
Do I need a high-performance laptop for it?
Please, where can we find prev_teta.txt? It's not linked in the repository.
Can I get the code?
Can you let me know the list of hardware used in this project? Thanks.
Hi man, we are from Turkey, working on our senior design project. We are very confused and stuck on many things. If you would help us with our project, we would be so thankful. We are waiting for your response. PS: our topic is exactly what you did in the video.
What is the problem?
@@NathanNaerts Can I add you on Discord, sir?
@@waveboardteamturkey Sorry, I'm not on Discord.
If you have any communication app or social media, I can add you there. We need help and would gladly accept your assistance.
@@NathanNaerts If you have any communication app or social media, we would like to get in touch with you. We need a lot of help.
How do you deal with the height of the object? Or did you just set a Z coordinate to make sure to get the piece?
Indeed, the Z-coordinate is a parameter in the code.
@@NathanNaerts Thank you, I am working on a project where I need to get the XYZ coordinates in the real world from a single camera. I didn't know about ArUco markers... something interesting.
I am working on my graduation project: "automation of fusible insertions by a robot." If I follow this work, can I control the robot automatically to recognize each piece and place it in its appropriate place?
It should work
Please, where can I find the documentation for this code?
@@chakerbellili2441 There is no full documentation for the code; there is some explanation in the code itself. See the video description for the code.
How did you do the inverse kinematics for the arm?
Hi dear, nice video! Would you like to review our smart energy meter? We will provide a free sample.
This is a great project. I love it. I would like to integrate it into a robotics course. May I know the dimensions of the wood board and also the measurements of the grid you set up? Are the 100, 200, and 300 marked on the board 100 mm, 200 mm, 300 mm?
Why do the servo motors make so much loud noise?
The "prev_teta.txt" file is missing; I can't complete the project without it. Please, can you provide the file?
You can just create an empty text file with the same name
Great video Nathan, this is my goal for my home. I've been busy with HA and doing research for about 2 months now. My HA also runs in a Docker container. The only hardware you showed is Shelly hardware and the solar outlet, but how exactly do you get the per-device consumption into HA? I now have 3 TuYa smart sockets on the outlets of my office PC, my wife's PC, and the washing machine, and I see those in my standard energy dashboard. Do you also have smart sockets, or only the Shelly devices? If so, what do you recommend? I don't yet know how to adjust and edit the dashboard, I still have to look into that, but what you have now is my goal. Last question: did you use anything special to build your dashboard, or is it all standard HA functionality?
Hello Nathan, I would also like to build an energy dashboard like this, but I don't know how to start. Do you have any tips?
You can start simple by, for example, reading out your digital meter. There are cheap products available for that.
Hi there. I was looking for a similar project. After monitoring whole-house electrical usage, can we convert it into actual expense and display it on the dashboard in real time, based on the price per kWh? For example, my country's calculation:
first 25 kWh: unit price 0.00355
26-100 kWh: unit price 0.00422
151-300 kWh: unit price 0.00815
and so on. How can we monitor power usage, calculate this, and show it on the dashboard in real time? Sorry for my poor English.
Hi, take a look at my other video. I track electricity costs in that video.
Cool! Just one mistake. IMU is “inertial measuring unit” not internal, although most IMU are hosted internally.
How should I run the code?
Can I use a regular 6-DOF robotic arm?
🔥🔥
Hi, I'm working on a project with AprilTags for a robot. I'm on an FRC team and I'm interested in getting some of your help, because the use of ArUco markers and AprilTags is a very new topic for us. Thank you so much!
Sure, take a look at my code in the comment section. If you have any questions, please ask :)
Great project. Have you thought about using OpenCV?
I use it in part 2
So good, but I get a "no module named bge" error.
Great work! I can imagine some interesting possibilities with this setup. Did you continue to develop this?
No. Do you have any suggestions? :)
I do the same for the washing machine. In addition, I figured out the different washing phases based on the power consumption. And last but not least, I reused some very nice pictures for the washing phases that I found on the community website of Home Assistant: search for the topic "Show off your picture-elements uses" and search for "washing machine". Enjoy.
Your "solar only" outlet is a brilliant idea. I also use the Shelly 3EM in the main switchboard and some Silvercrest smart plugs with energy monitoring (11 € from Lidl). These plugs are connected via Zigbee. To charge the car, I made 3 options:
- direct start (uses sun + grid): I can set a minimum kW at which it continues to charge (e.g. 1.5 kW), even when there is not enough injection. When more sun power is available, the charger increases the charging power (sometimes even 8 kW)
- sun only: will only charge when there is an excess / injection. It will stop charging when there is not enough injection.
- timer based (typically to charge at night and to obtain the desired SOC by 6 AM)
Nice! And which car charger do you use? I assume it's a connected charger since you can control the charging amps.
@@NathanNaerts An Alfen Eve Single Pro-line. I connected it via a LAN cable and manage it from Home Assistant via the Modbus protocol.
The Python script failed; check the message in the system console (UPBGE).
Nice Dashboards Nathan ! Are you using apex-charts (like for the 'Electriciteitsbelasting van huis op het net' graph) ? Would you be able to share the code you used for that graph ? Thanks !
and are you using some kind of dynamic retrieval of your electricity prices or do you set a static value ?
@@TTompi No, I was not able to find a good API for dynamic electricity prices. So I created an 'input number' where I can enter my current energy price, and I use this variable in my cost calculations. This is the code for the price variable; just add it to 'configuration.yaml'. There are also cards for this input number, so it's easy to change the value from the dashboard:
input_number:
  box_dagtarief:
    initial: 0.57
    min: 0.2
    max: 1
    step: 0.01
    mode: box
    icon: mdi:currency-eur
    unit_of_measurement: 'EUR/kWh'
    name: Energie prijs (dag tarief)
  box_nachttarief:
    initial: 0.57
    min: 0.2
    max: 1
    step: 0.01
    mode: box
    icon: mdi:currency-eur
    unit_of_measurement: 'EUR/kWh'
    name: Energie prijs (nacht tarief)
Yes indeed, I am using Apex-charts. This is the code:
type: custom:apexcharts-card
update_delay: 3s
update_interval: 1min
graph_span: 24h
hours_12: false
header:
  show: true
  title: Elektriciteitsbelasting van huis op net
  show_states: false
  colorize_states: true
yaxis:
  - id: first
    decimals: 0
    apex_config:
      tickAmount: 4
apex_config:
  annotations:
    position: front
    yaxis:
      - 'y': 0
        strokeDashArray: 15
        borderColor: '#2647de'
        borderWidth: 2
all_series_config:
  stroke_width: 2
  float_precision: 0
  curve: smooth
  type: area
  unit: W
  group_by:
    duration: 30min
    func: last
experimental:
  color_threshold: true
series:
  - entity: sensor.totaal_verbruik_3fase_power
    name: Belasting huis
    color_threshold:
      - value: 4000
        color: rgb(225,6,0)
        opacity: 0.8
      - value: 2000
        color: rgb(225,122,0)
        opacity: 0.8
      - value: 0
        color: green
        opacity: 0.6
      - value: -2000
        color: blue
        opacity: 0.8
I found an ArUco cube, 4x4 on each side. Is it worth anything?
Nice. I wonder how you do the math to handle the power peaks or drops in real-time.
Thanks, it is just a series of triggers and conditions that increase a counter. A second way to detect drops and peaks is to create a derivative sensor: a large derivative indicates a sudden peak or drop. It will always have a small delay compared to real time, but nothing major; it depends on the sample rate of your sensors.
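The derivative-sensor approach mentioned in the reply can be configured in Home Assistant roughly like this; the source entity id is a placeholder, not the actual sensor from the videos:

```yaml
# Sketch: a derivative sensor turns watt readings into a rate of change,
# so a large positive/negative value flags a sudden switch-on/switch-off.
sensor:
  - platform: derivative
    source: sensor.kitchen_power   # hypothetical power sensor
    name: kitchen_power_derivative
    round: 1
    unit_time: s
    time_window: "00:00:30"        # smooth out measurement noise
```

An automation can then trigger on this derivative crossing a threshold instead of on the raw power value.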
Have you used ROS? Do you recommend pure python over ROS?
No, I haven't. It's on my to-do list to try it out.
I want to learn programming. Which language is better, C++ or Python?
I like python more. You can find many courses online and don't need any hardware to program something cool.
Brother, I want to understand: why is one-twentieth of the code written in C++?
Just to communicate with the Arduino. The Arduino receives input from the Python script to send the motor commands.
@@NathanNaerts are there nice courses explaining how to program in this field?
I recently used DataCamp to learn a bit more about using pandas. I can recommend the platform, it is really good!
Wow, this is really brilliant!!
What type of microcontroller are you using, and what programming IDE?
I write the code in Python (VScode). The microcontroller is an Arduino.
Worked , thanks a lot!
Hi Natan, can you please tell me where to find the "prev_teta.txt" file? I'm having some problems running the code due to this file, that apparently my pc cannot find.
Please share the code 🥺🥺 Where can I find the code for this?
It's in the description.
Please, where can I find this description?
There are some important mistakes that I saw in your first video and that are still present several months later. Beyond using an Arduino to send G-code to move servos or testing OpenCV, people here probably expect some work on the arm itself.

First, the first section of the arm has to move in the opposite direction of the object to keep the center of mass near the base. The maximum distance is not what you get with all arms in line unless you add some counterweight; this is so wrong for that poor toy. You are making the arm even worse than the toy's own construction.

Second, the movement needs PID control of the speed to avoid that shaking at the end of the move. The backlash can be corrected if the last move of the 3 sections is positive in the direction of the object (not the joint to the arm), instead of moving all of them at the same time without calculating the finish time. Also, when lifting an object, you cannot just rotate the base and lift it. You have to split the movement over at least 2 joints to lift, using the joints closer to the object more than the base, or better, spread the movement over the 3 joints with some weight calculation to reduce the shake even more. No excuses, it's just foam. Also, you are not lifting the object, just rotating it, and the object should lift straight and land straight, which is really easy to program.

Instead of playing with OpenCV on cards we are bored of seeing, use it to check whether the arm's finishing move before dropping the object is correct. This is done a lot in industry too: cameras are used to center with precision. After that, adjust, check again, and drop the object. This is the kind of thing you probably need to figure out instead of running code fast and recording a video for YouTube.

The 4:07 "optical compensation" is also wrong; it is just calculating the height of the object, not an "optical issue". Maybe you tried to talk about perspective, but nothing in the drawing aims at that. Optical issues relate to the distortion of the lens, which you didn't calibrate and which probably doesn't matter at this precision. The poor code is even apparent in running the arm in the same loop the camera uses to refresh, which makes it stop working; use threads instead, which are also just 2 lines in Python.

Get the ABC of robot arms before showing your "moves". Pretty much everything in the video is useless in a real scenario. Most people will say "oh, you are a genius" because they don't even know how to address it, but you have to focus more on the work itself than on uploading videos of nothing. At least make the $300 cost of the arm return some useful concepts; otherwise you would learn the same with a regular $2 clone Arduino, a $3 pair of servos, more spare foam, and a bot in a video game with OpenCV. Sorry for the comment, I hope it helps more than it hurts.
thanks for the feedback
Arm: what's the purpose of my life? Me: you move stuff.
A question: is there a way to know the value of the force that the gripper should exert in order to correctly grasp the object?
Yeah, it's possible, but not with this setup. The gripper is actuated by a cheap servo which can't be force controlled. A robot arm with such actuators, sensors, and accuracy gets expensive real quick. You should look up: closed-loop force control robot arm.
You could use pressure switches/sensors in a feedback loop on the fingers; the robots I have used had them to ensure a correct grip. Some of the simpler ones had microswitches and servos set to torque mode.
🌈 Very well done. Great job. I say too❗ Continue... 🙂🙈
👏👏👏👏😊
👌👏👏👏👏
👍
Great job! I am currently working on a project of my own robot, I started with the simplest thing, I programmed forward kinematics. The next step is inverse kinematics.
Keep at it pal, IK is a satisfying nut to crack. You'll never be happier to see something move in a straight line!
@@tombackhouse9121 power of maths
Is it possible to link the code you used? I am stuck on the code for my own robot @BoGu
Code is linked but missing BraccioRobot.h and Position.h?
I don't think you need this in order to make it work? Or where do you see this?
@@NathanNaerts In the GitHub repo you linked, in the folder with the .ino file that you upload to the Arduino. It includes these files, but they are not in the repo?
@@Pyramid1501 Ah sorry, yes indeed. You can add these libraries through the 'Library Manager' of the Arduino IDE. docs.arduino.cc/software/ide-v1/tutorials/installing-libraries
@@NathanNaerts Yeah, thanks for the tutorial, but where can I download these libraries?
Great video! Unfortunately I haven't managed to convert kWh into euros 😟 Am I overlooking something simple, or is this actually complicated?
You first have to create a 'utility meter' that tracks the hourly/daily/weekly/... consumption in a new entity. I split this further into day/night tariff. You can then multiply these new 'virtual' sensors by EUR/kWh to get the energy cost for a given period. It's best to add all of this in the 'configuration.yaml' file.
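The utility-meter-plus-multiplication setup described in this reply can be sketched in 'configuration.yaml'. The energy sensor name is an illustrative placeholder; the price variable reuses an input_number like the one shown in an earlier comment:

```yaml
# Sketch: accumulate daily kWh, then multiply by the current tariff.
utility_meter:
  daily_energy:
    source: sensor.total_energy_kwh   # hypothetical total-energy sensor
    cycle: daily

template:
  - sensor:
      - name: "Daily energy cost"
        unit_of_measurement: "EUR"
        state: >
          {{ (states('sensor.daily_energy') | float(0))
             * (states('input_number.box_dagtarief') | float(0)) }}
```

The resulting cost sensor can be placed on a dashboard card and updates as both the meter and the price change.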
Good video! I have a similar robot arm, but I don't know how to start with inverse kinematics.
NICE work!!!
How did you compensate for the play in the components? Are you able to get encoder data from the servos? Any other compensations done? Thanks :)
No, I did an 'optical calibration', since the servos' closed-loop positioning was off. I tried to match my position command to the real servo angle that resulted. E.g. a motor position of 93 degrees resulted in an effective arm position of 90 degrees. So each time I needed an angle of 90 degrees, I wrote 93 to the motor.
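That by-eye calibration can be captured as a small lookup table with linear interpolation between measured points. Only the 90 -> 93 pair comes from the comment above; the other entries are made-up examples:

```python
# Measured correction table: desired arm angle -> value written to the servo.
# Only the 90 -> 93 pair is from the comment; the rest are hypothetical.
CALIBRATION = {0: 2, 45: 47, 90: 93, 135: 137, 180: 181}

def corrected_angle(target_deg):
    """Return the servo write value for a desired arm angle, linearly
    interpolating between calibrated points and clamping at the ends."""
    keys = sorted(CALIBRATION)
    if target_deg <= keys[0]:
        return CALIBRATION[keys[0]]
    if target_deg >= keys[-1]:
        return CALIBRATION[keys[-1]]
    for lo, hi in zip(keys, keys[1:]):
        if lo <= target_deg <= hi:
            t = (target_deg - lo) / (hi - lo)
            return CALIBRATION[lo] + t * (CALIBRATION[hi] - CALIBRATION[lo])
```

Adding more measured pairs tightens the correction without touching the rest of the control code.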
@@NathanNaerts Did you use an optical encoder?
@@NitinSharma-so9hg Not exactly. In this case, "optical calibration" means doing it approximately by eye.