file_get_contents performance

Thor

Honorary Master
Joined
Jun 5, 2014
Messages
44,236
I am trying to debug/investigate/determine where the performance drain is in my code.

I have a foreach loop which uses:

PHP:
                  $json = file_get_contents($url);
                  $json = json_decode($json);

Would that be such a big drain on performance, or might it be the actual API URL?

I tested my code block with this:

PHP:
<?php

$startTime = microtime(true);

// my code

echo "Time:  " . number_format(( microtime(true) - $startTime), 4) . " Seconds\n";

?>

And, well, the result is quite slow:

Between Time: 3.7666 Seconds and Time: 5.1338 Seconds
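
I guess one way to narrow it down would be to put the same microtime() measurement around each call inside the loop, something like this rough sketch (variable names are just placeholders):

PHP:
<?php

// Rough sketch: time the fetch and the decode separately for each URL,
// to see whether the remote API call is the slow part.
foreach ($urls as $url) {   // $urls = whatever list the loop already iterates over
    $t = microtime(true);
    $json = file_get_contents($url);
    $fetchTime = microtime(true) - $t;

    $t = microtime(true);
    $data = json_decode($json);
    $decodeTime = microtime(true) - $t;

    echo $url . "  fetch: " . number_format($fetchTime, 4)
              . "s  decode: " . number_format($decodeTime, 4) . "s\n";
}

?>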
 

[)roi(]

Executive Member
Joined
Apr 15, 2005
Messages
6,282
Obviously, if $url is remote, then its response time is out of your control.

The other thing to consider is how often the content you're processing changes; if it doesn't need to be real-time, how about caching the processed result locally (i.e. the result of your loop)?
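
Something along these lines would do it; just a sketch, where the cache path, the 60-second lifetime and the fetch_stocks_from_api() helper are placeholders for your own code:

PHP:
<?php

$cacheFile = '/tmp/stock_prices.json';   // placeholder path
$cacheTtl  = 60;                         // only hit the API once a minute

if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $cacheTtl) {
    // Cache is still fresh: reuse the processed result, no API calls
    $prices = json_decode(file_get_contents($cacheFile), true);
} else {
    // Cache is stale or missing: run the slow loop once, then store the result
    $prices = fetch_stocks_from_api();   // placeholder for your existing foreach loop
    file_put_contents($cacheFile, json_encode($prices));
}

?>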
 

koeksGHT

Dealer
Joined
Aug 5, 2011
Messages
11,857
It's most likely the API URL's response time. Just stick the required logic in a cron script if it takes long to load.
 

Thor

Honorary Master
Joined
Jun 5, 2014
Messages
44,236
So, to give the structure:

I have a $stocks array and a foreach ($stocks as $stock) loop.

For each stock the API URL is:
www.api.com/$stock:jse
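
In other words, roughly this (a sketch; the tickers and the exact endpoint format are placeholders):

PHP:
$stocks = ['NPN', 'SOL', 'AGL'];   // placeholder tickers

foreach ($stocks as $stock) {
    // one HTTP request per stock
    $json = file_get_contents('http://www.api.com/' . $stock . ':jse');
    $data = json_decode($json);
}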
 

biometrics

Honorary Master
Joined
Aug 7, 2003
Messages
71,858
Thor said:

Thanks for the responses so far.

What I am doing is this:

http://mybroadband.co.za/vb/showthread.php?t=834257

Basically I have a foreach loop using the Bloomberg API to get the stock prices, so currently it loads/runs every time the page gets visited.

Yeah, that's terribly inefficient. Cron job + store in DB is a better approach. If you don't want a cron job, then add a timestamp to the DB record and only fetch it every minute or so.
 

Thor

Honorary Master
Joined
Jun 5, 2014
Messages
44,236
That's a new concept for me (cron jobs).

I'll need to study up on how to actually write a cron job, since I only know what it is. Yeah, what you're saying makes a lot more sense.

So poll the API for new prices every 5 minutes, and then fetch the latest details from the DB every time a user lands on the page?

I will send an image of my code in a moment. I'm on the phone.
 

koeksGHT

Dealer
Joined
Aug 5, 2011
Messages
11,857
A cron job is just a normal script file, simply run from "crontab -e" with that bit in the pastebin.
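
Something like this, just to give the idea; the paths, the 10-minute schedule and the table layout are assumptions, and it uses PDO rather than the old mysql_* functions:

PHP:
<?php
// /var/www/fetch_prices.php : run by cron, not by visitors.
// Scheduled from "crontab -e" with a line like (every 10 minutes, path assumed):
//   */10 * * * * /usr/bin/php /var/www/fetch_prices.php

$db = new PDO('mysql:host=localhost;dbname=stocks', 'user', 'pass');   // placeholder credentials
$stocks = ['NPN', 'SOL', 'AGL'];                                       // placeholder tickers

$insert = $db->prepare("INSERT INTO history (stock, price, dateadded) VALUES (?, ?, NOW())");

foreach ($stocks as $stock) {
    $data = json_decode(file_get_contents('http://www.api.com/' . $stock . ':jse'));
    $insert->execute([$stock, $data->price]);   // assumes the API returns a "price" field
}

?>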
 

biometrics

Honorary Master
Joined
Aug 7, 2003
Messages
71,858
A cron job is one approach, but not a must.

It's probably easier for you to go with my second suggestion: timestamp the DB record, check how much time has passed, and then refresh.
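
Roughly like this; just a sketch that assumes a PDO connection in $db, a history table with a dateadded column, and a 5-minute window (all placeholders):

PHP:
<?php

// Sketch of the "no cron" option: only hit the API when the stored data is stale.
$age = $db->query(
    "SELECT UNIX_TIMESTAMP(NOW()) - UNIX_TIMESTAMP(MAX(dateadded)) FROM history"
)->fetchColumn();

if ($age === null || $age === false || $age > 300) {
    // No data yet, or the newest row is older than 5 minutes:
    // run the API foreach loop and insert fresh rows into the db.
    refresh_prices_from_api();   // placeholder for your existing fetch/insert code
}

// Either way, render the page from the db, not from the API.

?>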
 

koeksGHT

Dealer
Joined
Aug 5, 2011
Messages
11,857
It's nicer to use a cron job and timestamp everything, so you can make graphs later.
 

Thor

Honorary Master
Joined
Jun 5, 2014
Messages
44,236
I appreciate the guidance. I am going to make it open source when I'm done; for now I am just trying to learn in the process.

This is my code.
 

Attachments

  • 1470671800598.jpg

koeksGHT

Dealer
Joined
Aug 5, 2011
Messages
11,857
Just insert into the db:

mysql_query("INSERT INTO history (`stock`, `price`, `dateadded`) VALUES ('$company', '$price', NOW())");

Then on that page, select from the db and display.
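
The display side is then just a select, something like this (a sketch; assumes a PDO connection in $db and the same history table):

PHP:
<?php

// Display page: read the latest stored prices; no API calls here.
$rows = $db->query("SELECT stock, price, dateadded FROM history ORDER BY dateadded DESC");

foreach ($rows as $row) {
    echo $row['stock'] . ': ' . $row['price'] . ' (' . $row['dateadded'] . ")<br>\n";
}

?>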
 

Thor

Honorary Master
Joined
Jun 5, 2014
Messages
44,236
This looks very easy, i.e. straightforward; I just want to confirm when this actually happens.

At the moment, on my index page I have that code of mine as a PHP include, so every time a visitor lands on the page it will execute.

So when does the insert into the DB happen, and how does the percentage change get displayed?

Although I guess the percentage change can be stored in the DB as well.

Thinking about it now, a cron job seems perfect for this: the cron job updates the DB every 10 minutes, and visitors get shown the latest DB entries.

Am I looking at it correctly?
 

Attachments

  • 1470672737452.jpg