file_get_contents performance

koeksGHT

Dealer
Joined
Aug 5, 2011
Messages
11,857
This looks very easy, i.e. straightforward; I just want to confirm when this actually happens.


I.e. at the moment on my index page I have that code of mine as a PHP include, so every time a visitor lands on the page it will do this:

So when does the "put it into a DB" part happen, and how does the percentage change get displayed?

Although I guess the percentage change can be stored in the DB as well.

Thinking about it now, a cron job seems perfect for this: the cron job updates the DB every 10 minutes, and visitors get shown the latest DB entries.

Am I looking at it correctly?

About that: websites should only pull info from a local DB so page generation times can be kept to microseconds; anything that takes execution time (server resources / long response times) should be back-end/automated.

Parcelcheck (one of my first projects) basically ran a cron against people's inputted tracking numbers/emails, stored the response, checked if it had updated, and sent an email.
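For the stock page in this thread, the automated back-end part could look something like the sketch below. The API URL and the JSON shape are made-up assumptions, not the actual feed; the point is just that the slow file_get_contents call happens in a cron, not on page load:

```php
<?php
// Hypothetical cron fetcher: the API URL and JSON shape are assumptions,
// not the actual feed used in this thread.
function parseQuotes(string $json): array {
    $rows = [];
    foreach (json_decode($json, true) as $item) {
        $rows[] = [
            'company' => $item['name'],
            'price'   => (float)$item['price'],
        ];
    }
    return $rows;
}

// In the real cron this would be the slow network call:
// $json = file_get_contents('https://api.example.com/quotes');
$json = '[{"name":"ACME","price":10.5},{"name":"Initech","price":42.0}]';
$quotes = parseQuotes($json); // these rows then get inserted into the DB
```

Visitors never hit the API; they only read whatever the cron last wrote to the DB.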
 

Thor

Honorary Master
Joined
Jun 5, 2014
Messages
44,236
Mmmm @KoeksGHT, very interesting from both you and @Bio. I see now what additional roles a DB plays. Never thought of it from a page-load performance perspective until now.

So basically any user input must go to the DB first, and then if you need to mail you make a new DB connection, get the contents and mail it (if I look at a contact page).

But we can tackle the contact page later.

This is priority one.
 

Thor

Honorary Master
Joined
Jun 5, 2014
Messages
44,236
[)roi(];18114223 said:
Yeah that happens a lot... any guess why @rpm hasn't addressed it.

I think that is Cloudflare; certain keywords kick its injection protection into hyperdrive, I suppose.
 

Thor

Honorary Master
Joined
Jun 5, 2014
Messages
44,236
What breaks my brain now is the loop part.

Currently it's a loop.

If I add it to the DB then I will only have one company.

Not sure if I make sense here, bear with me.

Can I simply run the DB insert inside the foreach loop as well?
 

koeksGHT

Dealer
Joined
Aug 5, 2011
Messages
11,857
What breaks my brain now is the loop part.

Currently it's a loop.

If I add it to the DB then I will only have one company.

Not sure if I make sense here, bear with me.

Can I simply run the DB insert inside the foreach loop as well?

Just run the INSERT inside the for loop so it inserts a record for every company: company, price, date, %.
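A minimal sketch of that, assuming a stock_history table with company/price/pct/modified columns (names made up for illustration). The INSERT statements are only built as strings here so the shape is visible; with a live connection each would go through mysqli_query or, better, a prepared statement:

```php
<?php
// One INSERT per company, built inside the loop. Table and column names
// are assumptions for illustration.
$stocks = [
    ['company' => 'ACME',    'price' => 10.50, 'pct' => 1.2],
    ['company' => 'Initech', 'price' => 42.00, 'pct' => -0.4],
];

$statements = [];
foreach ($stocks as $s) {
    $statements[] = sprintf(
        "INSERT INTO stock_history (company, price, pct, modified) " .
        "VALUES ('%s', %.2f, %.2f, NOW())",
        $s['company'], $s['price'], $s['pct']
    );
    // with a live connection: mysqli_query($db, end($statements));
}
```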
 

[)roi(]

Executive Member
Joined
Apr 15, 2005
Messages
6,282
About it, websites should only pull info from a local db so page generation times can be kept to micro seconds, anything that takes execution time(server resources/long response time) should be back end/automated.
This...

Never thought of it from a performance page load perspective until now.
Please do... also question the need for up-to-the-minute/second data; in most cases a "data/stats as of <timestamp>" tag bridges that.
 

[)roi(]

Executive Member
Joined
Apr 15, 2005
Messages
6,282
What breaks my brain now is the loop part.

Currently it's a loop

If I add it to the DB then I will only have one company.

Not sure if I make sense here, bare with me.

Can I simply run the DB insert as a for each loop as well?
Yes... in a loop; but a more efficient solution is constructing a single INSERT for all, i.e. the loop builds the INSERT statement, which is executed after the loop completes.
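Sketched out, the difference is just where the execution happens: the loop only collects VALUES tuples, and the single INSERT runs once afterwards (table/column names are assumptions):

```php
<?php
// Build ONE multi-row INSERT in the loop and execute it once after.
// Table and column names are assumptions for illustration.
$stocks = [
    ['company' => 'ACME',    'price' => 10.50, 'pct' => 1.2],
    ['company' => 'Initech', 'price' => 42.00, 'pct' => -0.4],
];

$values = [];
foreach ($stocks as $s) {
    $values[] = sprintf("('%s', %.2f, %.2f, NOW())",
        $s['company'], $s['price'], $s['pct']);
}
$sql = "INSERT INTO stock_history (company, price, pct, modified) VALUES "
     . implode(", ", $values);
// one round trip to the DB instead of one per company:
// mysqli_query($db, $sql);
```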
 

koeksGHT

Dealer
Joined
Aug 5, 2011
Messages
11,857
[)roi(];18114323 said:
Yes... in a loop; but a more efficient solution is constructing a single INSERT for all, i.e. the loop builds the INSERT statement, which is executed after the loop completes.

^ This, if a few microseconds don't matter when processing the data.
 

Thor

Honorary Master
Joined
Jun 5, 2014
Messages
44,236
Okay, let's tone it down a small bit. Again, I'm really excited and most likely not thinking it all the way through.

So the insert into the DB happens inside that for loop. Okay, that I get. That I can accomplish.

Now I assume that means I will have

A table called stock_history

And 4 columns:
Company name
Share price
Percentage change
Date modified

If I use the for loop then there will be x amount of data each time. What I'm trying to say: assuming my for loop is for 8 stocks,

Then my DB will be
Company 1
Company 2
Company 3
Company n

And all the other values next to them.

Now how do I fetch those different companies? Because if I only fetch the last modified row, it will only fetch company 8?

Or should I do $share_count = count(array of shares)?

Then in the fetch, fetch the last $share_count companies, to make it easy if I want to add more companies to the array later on?

I'm not sure if I am making any sense here.

To sum it up simply: I'm asking how the function that fetches the companies from the DB would work.

(sorry for the noob questions guys, I really do appreciate the help)
 

koeksGHT

Dealer
Joined
Aug 5, 2011
Messages
11,857
Loop through the companies table; inside the loop do a SELECT passing the company ID into the SQL, e.g. "SELECT * FROM history WHERE company=".$row['id']." ORDER BY time DESC LIMIT 1", and build the green blocks dynamically from the companies loop. Add a new company and it automatically appears in the interface.

$res = mysql_query("SELECT * FROM companies");

while ($row = mysql_fetch_array($res)) {
    // newest history row for this company (the ORDER BY is needed;
    // LIMIT 1 on its own does not guarantee the latest row)
    $companyres = mysql_query(
        "SELECT * FROM history WHERE company = " . (int)$row['id'] .
        " ORDER BY time DESC LIMIT 1"
    );
    $company = mysql_fetch_array($companyres);
    echo "<div>";
    echo $row['name'];
    echo $company['percentage'];
    echo "</div>";
}
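If the per-company query inside the loop ever becomes a bottleneck, the same result can come from a single query that joins each company to its newest history row. A sketch, with table/column names assumed:

```php
<?php
// One-query alternative: newest history row per company via a grouped
// subquery. Table/column names are assumptions.
$sql = "SELECT c.name, h.price, h.percentage
        FROM companies c
        JOIN history h ON h.company = c.id
        JOIN (SELECT company, MAX(time) AS latest
              FROM history
              GROUP BY company) m
          ON m.company = h.company AND m.latest = h.time";
// $res = mysql_query($sql);  // then loop the rows to build the blocks
```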
 
Last edited:

Thor

Honorary Master
Joined
Jun 5, 2014
Messages
44,236
Whoosh. That makes sense in my brain, sort of.

Dammit this is an exciting world!!!

I'm a design/marketing person; this is all new to me, omg I love it. Why have I not taken this seriously sooner?

I love all of you.
 

koeksGHT

Dealer
Joined
Aug 5, 2011
Messages
11,857
Whoosh. That makes sense in my brain, sort of.

Dammit this is an exciting world!!!

I'm a design/marketing person; this is all new to me, omg I love it. Why have I not taken this seriously sooner?

I love all of you.

Gets easier when it clicks upstairs.
 

Thor

Honorary Master
Joined
Jun 5, 2014
Messages
44,236
Oh lord, I need some help.

I created the database, and in the table I have a column called time which is set to the DATETIME type.

my php is

PHP:
$modified_time = date("H:i:s");

and when I echo $modified_time I get 11:58:12

However in the database it just stays 0000-00-00

Judging by that I assume my date() format is not correct.

So the question is: what format should I use here, considering the time modified will determine what gets displayed, as we want to select the last few modified stocks.

This is what I have (ignore the echo at the bottom, that was just me testing quickly):
[screenshot]

DB
[screenshot]

result (OMG IT WORKS!!!!)
[screenshot]

I changed it to
PHP:
$modified_time = date("Y-m-d H:i:s");
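That is the right fix: a MySQL DATETIME column expects the full "Y-m-d H:i:s" string, so "H:i:s" alone cannot be parsed and the column falls back to the zero date. A quick check (the second line uses a fixed timestamp so its output is deterministic):

```php
<?php
// DATETIME needs the date AND the time; "H:i:s" alone fails to parse.
$modified_time = date("Y-m-d H:i:s");   // e.g. "2016-08-09 12:03:31"
$epoch = gmdate("Y-m-d H:i:s", 0);      // fixed timestamp -> known output
```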
 

Thor

Honorary Master
Joined
Jun 5, 2014
Messages
44,236
Omg I even figured out the cron job business.

/pours some vodka for self and everyone.

Right, this is the easy part. Now to understand how to get these results back, keeping in mind I might add more stocks. I don't think I understand what to use as an identifier; time seems obvious, but that script takes close to 5 seconds to run (thanks, API URL), so even within one cron job the date differs by a few seconds between the first loop iteration and the last, and that gap will increase as you add more stocks.

Should I be concerned about the size of this DB, and how does size affect speed? I suspect 8 rows added every 5 minutes, with this job running 288 times a day, would mean about 69120 rows for a month.... Quite a big DB after a few months...?
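The arithmetic checks out; as a quick sanity check:

```php
<?php
// Growth estimate: one run every 5 minutes, 8 stocks per run, 30-day month.
$runsPerDay   = (24 * 60) / 5;      // 288 runs a day
$rowsPerDay   = 8 * $runsPerDay;    // 2304 rows a day
$rowsPerMonth = $rowsPerDay * 30;   // 69120 rows a month
```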

[screenshot]
 
Last edited:

rward

Senior Member
Joined
Oct 26, 2007
Messages
865
Omg I even figured out the cron job business.

/pours some vodka for self and everyone.

Right, this is the easy part. Now to understand how to get these results back, keeping in mind I might add more stocks. I don't think I understand what to use as an identifier; time seems obvious, but that script takes close to 5 seconds to run (thanks, API URL), so even within one cron job the date differs by a few seconds between the first loop iteration and the last, and that gap will increase as you add more stocks.

Should I be concerned about the size of this DB, and how does size affect speed? I suspect 8 rows added every 5 minutes, with this job running 288 times a day, would mean about 69120 rows for a month.... Quite a big DB after a few months...?


That's why you're using a DB. Worry about it when you get to hundreds of millions of rows and are doing complex joins.
 

gkm

Expert Member
Joined
May 10, 2005
Messages
1,519
You can add another daily cron that:
- Deletes everything more than a month old or
- Just keep one value per day for those older than a week
- etc. whatever suits your system.

I agree with rward that a couple of hundred thousand rows is usually not a problem, but it is always good practice to prune data from the beginning, so that your system does not get a little bit slower every day until it grinds to a halt some day, when there is so much data that it is hard to delete it without hours of downtime.
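The first option is basically a one-liner. A sketch of such a daily prune, with the table/column names and the cron path as assumptions:

```php
<?php
// Daily prune: delete history older than a month. Names are assumptions.
$sql = "DELETE FROM stock_history WHERE modified < NOW() - INTERVAL 1 MONTH";
// mysqli_query($db, $sql);
// crontab entry (path is hypothetical):
// 0 3 * * * php /path/to/prune.php
```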
 

Thor

Honorary Master
Joined
Jun 5, 2014
Messages
44,236
You can add another daily cron that:
- Deletes everything more than a month old, or
- Just keeps one value per day for those older than a week
- etc., whatever suits your system.

I agree with rward that a couple of hundred thousand rows is usually not a problem, but it is always good practice to prune data from the beginning, so that your system does not get a little bit slower every day until it grinds to a halt some day, when there is so much data that it is hard to delete it without hours of downtime.

That was my thinking. I don't like the "it's not a problem, wait till you get to a million" kind of answers; it instills bad practice IMO.

What you suggested I understand, and I think I will set up a cron job to run every Sunday that deletes everything from the week before, as I do not actually need the data. I am just doing this so that I can show the stock prices to visitors without the page taking 5 seconds to load.
 