There you have it, everyone. Well-thought-out debate philosophy recognised the world over is now "thought up BS." WOW. JUST WOW.
Let's see how well Swa thought...
Most DC devices are low current. The only example is the PC but even there the situation isn't so bleak. Don't confuse a cable's AC current rating for its DC current rating. DC can handle more current than AC, in practice roughly twice as much. So a standard 15 A cable can do at least 360 W at 12 V.
That is 30 A over standard 2.5 mm² copper home wire, where the minimum wire run in a typical home will be more than 10 metres.
I would love you to explain that little gem. But let's not fear, as Swa is near, and he also mentioned this gem:
*sigh*
As I have pointed out the voltage drop is due to the limitations of the power supply and not the length of cable. Voltage drop through the cable is negligible and affects AC current as well.
Anyone can do the calcs. What size cable is required to draw 30 amps at 12 V DC over 10 m of copper wire, and what will the voltage drop be? Hahahaha, Swa failed again.
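For anyone who wants to check it without an online calculator, here is a minimal sketch of that calc in Python (assuming plain copper resistivity of about 1.68e-8 Ω·m and a current loop of twice the run; illustrative numbers, not a cable spec):

```python
# Voltage drop for Swa's own numbers: 30 A at 12 V DC over 10 m
# of 2.5 mm^2 copper. Assumptions: copper resistivity ~1.68e-8 ohm*m,
# and the current flows out and back, so the loop is 2x the run.

RHO_CU = 1.68e-8        # copper resistivity, ohm*m
AREA = 2.5e-6           # 2.5 mm^2 conductor, in m^2
RUN = 10.0              # one-way run in metres
CURRENT = 30.0          # amps (360 W / 12 V)
SUPPLY = 12.0           # volts

resistance = RHO_CU * (2 * RUN) / AREA      # ~0.134 ohm loop resistance
drop = CURRENT * resistance                 # V = I * R

print(f"Voltage drop: {drop:.2f} V ({100 * drop / SUPPLY:.1f}% of 12 V)")
# -> about 4 V gone, a third of the supply. "Negligible" it is not.
```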
Now here is another one I enjoyed.
Most DC devices are low current. The only example is the PC but even there the situation isn't so bleak.

The highest voltage in a computer is indeed 12 V, but there are a number of 12 V and 5 V lines.
Modern CPUs can use 100 W on their own, and higher-end graphics cards are often in the 300 W range.
Add in other bits, hard drives / SSDs etc., and you can easily hit 500 W.
500 W @ 12 V is roughly 41 A. PC power supplies have multiple rails, and supply multiple 12 V lines over parallel cables.
Are you going to run twice the cable to supply one single computer? I have explained this a couple of times relatively clearly.
If you don't understand basic concepts, I suggest you do some further reading on why I keep pointing it out repeatedly.

Yes you keep pointing it out but with no logic of physics behind it. I already explained this. Few PCs use 500 W, and those that do don't do so constantly as well.

Um no.
Impedance is what drives the voltage drop, not the source.
I specifically asked a question which would have led you to the right answers -
Power (W) = Voltage (V) × Current (A)
500 W / 12 V ≈ 40 A.
Go visit here -> http://www.calculator.net/voltage-drop-calculator.html
House wiring is typically 12 gauge wire (20 A @ 240 V).
Let's say 20 m distance from the DB to the plug where the computer is.
Our calculator says:
Voltage drop: 8.34 V
Voltage drop percentage: 69.50%
Voltage at the end: 3.66 V
That's insufficient to run our 12 V device. Waaaay insufficient.
Please feel free to run some calculations and learn something about what we're all trying to explain to you.
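For the record, those calculator numbers check out against a plain Ohm's-law estimate. A minimal sketch (assuming 12 AWG copper at roughly 5.211 Ω/km, the standard table value, and a loop of twice the run):

```python
# Reproduce the quoted calculator result: 40 A at 12 V DC over
# 20 m of 12 AWG copper. Assumption: 12 AWG copper at ~5.211 ohm/km.

R_PER_METRE = 5.211e-3  # ohm per metre, 12 AWG copper (table value)
RUN = 20.0              # one-way run, DB to plug, in metres
CURRENT = 40.0          # amps (500 W / 12 V, as rounded in the post)
SUPPLY = 12.0           # volts

resistance = R_PER_METRE * (2 * RUN)        # out and back
drop = CURRENT * resistance

print(f"Voltage drop:       {drop:.2f} V")                  # ~8.34 V
print(f"Drop percentage:    {100 * drop / SUPPLY:.1f}%")    # ~69.5%
print(f"Voltage at the end: {SUPPLY - drop:.2f} V")         # ~3.66 V
```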
Now anyone with half a brain please help this guy out! I tried...
They range from 200 W to 1000 W.

Oh, those ones. I did not think you would have wanted me to include those in the calcs, but let's do it then.
Using your calculator, take the example of the standard 2.5 mm², 15-20 A home wiring, with an approximate 18.2 m distance between the source and the destination of my 500 W PC.
Thus 500 / 12 = 41.7 A, and the calc says we need:
2 AWG, 33.631 mm² (7.558 mm diameter) wire required. Wow.
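That 2 AWG answer falls straight out of Ohm's law too. A minimal sketch comparing standard house wire against what the calculator demands (again assuming copper resistivity of about 1.68e-8 Ω·m and a loop of twice the run; illustrative only):

```python
# Compare 2.5 mm^2 house wire with the calculator's 2 AWG answer for
# 41.7 A (500 W / 12 V) over an 18.2 m run of copper.
# Assumptions: copper resistivity ~1.68e-8 ohm*m, loop = 2x the run.

RHO_CU = 1.68e-8        # copper resistivity, ohm*m
RUN = 18.2              # one-way run in metres
CURRENT = 500.0 / 12.0  # ~41.7 A
SUPPLY = 12.0           # volts

for label, area_mm2 in [("2.5 mm^2 house wire", 2.5), ("2 AWG", 33.631)]:
    resistance = RHO_CU * (2 * RUN) / (area_mm2 * 1e-6)
    drop = CURRENT * resistance
    print(f"{label}: {drop:.2f} V drop ({100 * drop / SUPPLY:.1f}%)")
# -> the 2.5 mm^2 wire loses ~10 of the 12 V; even 2 AWG still drops ~6%.
```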
Now where in your proposed design does the 2.5 mm² standard home wire, rated for 15 A, feature?
How are we going to get the required 7.6 mm diameter wire through existing conduit?
At R2,000.00 - R3,000.00 a pop, what will the total cost be for one single PC? And where are the rest of the required home appliances?
I do not think you thought your proposed design through, did you? You just twaddled along, selling BS information and grabbing whatever excuse popped into your head.
But this gem took the top prize:
Lightning is a very unpredictable thing. It can strike on the opposite end of the house, travel all the way to the PC wall plug, through the PSU and only damage the network controller.

What!!!!!!!
Hehe. I do not know where you studied electronics. Very selective electrons... hmmm.

