Flat file DB guidance

Thor

Honorary Master
Joined
Jun 5, 2014
Messages
44,236
With the gorgeous new design of MyBroadband, I decided it's time to do some renovations of my own.

I have a small stock ticker that I use. The issue is that when I have a lot of symbols, the process becomes quite slow, and if the API is down I need a fallback, so I will have to store the results in a database. The website in question does not have a database, so I am thinking of going with a flat file. However, what happens with simultaneous connections? The website averages about 4 requests per second.

So my question: what would be the best approach to move this to a flat-file DB system?

My thinking was to set up a cronjob to store the infoObject in the flat file, and then each user that visits the site simply reads from the flat file. That way it should still work if 10 users open the site at the exact same time, correct?

PHP:
 <?php
  $marketsArray = array(
      "JSE:DBXWD"=>array("name"=>"Db x-trackers Col in Wld"),
      "JSE:TAS"=>array("name"=>"Taste Holdings"),
      "JSE:STXIND"=>array("name"=>"Satrix Industrial 25"),
      "JSE:CCO"=>array("name"=>"Capital & Counties Properties PLC"),
      "JSE:WHL"=>array("name"=>"Woolworths Holdings Limited"),
      "JSE:SHP"=>array("name"=>"Shoprite Holdings Ltd"),
      "JSE:RLF"=>array("name"=>"Rolfes Holdings")
  );

  //this isn't the most accurate as it's only taking into account 2 data points.

  foreach($marketsArray as $key => $val){
      //get url for this symbol
      $infoUrl = 'http://www.google.com/finance/info?client=ig&q='.$key;
      //get contents, remove //
      $infoObj = str_replace('//','',file_get_contents($infoUrl));
      if($infoObj){
          //re-encode as UTF-8 and decode into an associative array
          $infoObj = json_decode(utf8_encode($infoObj),true);

          /*"id": "338568" - internal google security id
          ,"t" : "CCO" - stock symbol
          ,"e" : "JSE" - exchange name
          ,"l" : "19.72" - last trade price
          ,"l_fix" : "19.72" - last trade ?
          ,"l_cur" : "19.72" - last trade with currency
          ,"s": "0" - last trade size
          ,"ltt":"3:59PM EDT" - last trade time
          ,"lt" : "December 13, 3:59PM EDT"  - last trade date time long
          ,"lt_dts" : "2016-12-13T15:59:59Z" - last trade date time
          ,"c" : "-0.31" - change
          ,"c_fix" : "-0.31" - ? 
          ,"cp" : "-1.55" - ? percentage
          ,"cp_fix" : "-1.55" - ? 
          ,"ccol" : "chr" - ? 
          ,"pcls_fix" : "20.03" - previous close price
          */

          $stock = $infoObj[0]['t'];
          $exchange = $infoObj[0]['e'];
          $code = $exchange.':'.$stock; //exchange:symbol, e.g. JSE:CCO, matching the keys above
          $cPercentage = $infoObj[0]['cp'];//this is the percentage, which is basically just comparing the change between the end of the last day, and the most recent transaction
          $cAmount = $infoObj[0]['c'];
          $prevClosePrice = floatval(str_replace(',','',$infoObj[0]['pcls_fix']));
          $lastTradePrice = floatval(str_replace(',','',$infoObj[0]['l']));
          $lastTradePriceCur = $infoObj[0]['l_cur'];
          $lastTradeDate = $infoObj[0]['lt'];

          //mean
          $mean = ($prevClosePrice+$lastTradePrice)/2;

          //subtract mean from each, and square
          $newPrev = ($prevClosePrice-$mean)*($prevClosePrice-$mean);
          $newLast = ($lastTradePrice-$mean)*($lastTradePrice-$mean);

          //sum, divide by n-1 = 1 (so a no-op) and take the square root
          $final = round(sqrt($newPrev+$newLast),2);

          $ticker = "<li><a href=\"https://www.google.com/finance?q=$code\">";
          $ticker .= '<b>'.$val['name'].'</b> '.$code.' '.$lastTradePriceCur.' '.$cAmount.' '.$cPercentage;
          $ticker .= '</a></li>';
        
          echo $ticker;


      }
  }
  ?>

Also rep is welcome if you found this piece of code useful.
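For what it's worth, the cron-side write the post describes (but doesn't show) can be made safe for simultaneous readers without any locking: write the full payload to a temp file, then rename() it over the real file. rename() within one filesystem is atomic on POSIX systems, so a visitor reading at that moment sees either the old file or the new one, never a half-written one. A minimal sketch, with a hypothetical cache path and placeholder data:

```php
<?php
// Cron-side write sketch (not in the original post). Path is hypothetical.
$cacheFile = sys_get_temp_dir().'/stocks.json';
$tmpFile   = $cacheFile.'.tmp';

// Whatever the cron collected from the API would go here (placeholder data).
$data = array('updated' => time(), 'quotes' => array('JSE:WHL' => 71.50));

// Write the temp file fully, then atomically swap it into place.
file_put_contents($tmpFile, json_encode($data));
rename($tmpFile, $cacheFile);
```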
 

scudsucker

Executive Member
Joined
Oct 16, 2006
Messages
9,024
You'll get rep when you implement a cache; your 'flat file' system may work but it is rubbish.

Consider the case where you have multiple servers running the same code/cron (a web farm): when two or more try to write to that file at the same time, ****'s going down. I don't actually see your code writing to the "flat file", but I guess you left that part out? If not, then what you have created is a VERY SLOW API, not a flat-file DB.

Oh... and feel free to Google "separation of concerns" and the "MVC pattern".
 

Hamster

Resident Rodent
Joined
Aug 22, 2006
Messages
42,920
With the douchebaggy comments now done, he has a point.

Cache your data. This is a case where the mechanics you implemented work, and are fun to write, but are completely inefficient.

What I'd do on receiving new data is persist it to disk (whatever format) but also update the in-memory data at the same time. The data on disk is read only once at startup to get the site going.

Well, that's one way.
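Hamster's disk-plus-memory idea, sketched in plain PHP with no extensions. Since PHP is request-scoped, "memory" here is a static that lives for one request; something like APCu or memcached would be needed for memory shared across requests. The function name and cache path are made up for illustration:

```php
<?php
// Hypothetical helper: read the cache file from disk once per request,
// then serve every further call from the in-memory copy.
function getQuotes($cacheFile) {
    static $quotes = null;                  // request-lifetime memory
    if ($quotes === null) {
        $raw = @file_get_contents($cacheFile);
        $quotes = ($raw !== false) ? json_decode($raw, true) : array();
    }
    return $quotes;
}
```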
 

Thor
You'll get rep when you implement a cache; your 'flat file' system may work but it is rubbish.

Consider the case where you have multiple servers running the same code/cron (a web farm): when two or more try to write to that file at the same time, ****'s going down. I don't actually see your code writing to the "flat file", but I guess you left that part out? If not, then what you have created is a VERY SLOW API, not a flat-file DB.

Oh... and feel free to Google "separation of concerns" and the "MVC pattern".

Touché, but this will never ever be the case: there will be no web farm. There will be me in charge of my cronjob, and that is it.

As pointed out, I do not have access to a DB; the next best thing was a flat file. A cache will not be an option, since the Google API can be down, and then what? The DB will act as a fallback as well. Only one cron job will write to it, so I do not see any concern there, and all visitors will strictly read from that file. If my understanding is correct, each visitor gets served a copy of the flat file. So should the cronjob run and a user access the page at the exact same time, the user will be sent a copy of the master, the cronjob will be working on a different copy of the master, and once done it will update the master, and the next split-second visitor will see the updated master copy?
 

Thor
With the douchebaggy comments now done, he has a point.

Cache your data. This is a case where the mechanics you implemented work, and are fun to write, but are completely inefficient.

What I'd do on receiving new data is persist it to disk (whatever format) but also update the in-memory data at the same time. The data on disk is read only once at startup to get the site going.

Well, that's one way.

Okay, that makes sense. I always thought of a cache as a temporary thing that the user can clear, which would then break my site if I relied on it.
 

Hamster
Okay, that makes sense. I always thought of a cache as a temporary thing that the user can clear, which would then break my site if I relied on it.
Memory is fast and you should use it.

Also think of the "effort" involved in reading from a file every time, the locking mechanisms involved, etc., vs. just reading data already stored in memory. Your cloud provider will charge you less for the latter.
 

Thor
Memory is fast and you should use it.

Also think of the "effort" involved in reading from a file every time, the locking mechanisms involved, etc., vs. just reading data already stored in memory. Your cloud provider will charge you less for the latter.
I'm reading up on PHP RAM as I type.

Will report back.
 

SauRoNZA

Honorary Master
Joined
Jul 6, 2010
Messages
47,847
Why on earth would you print out something that is perfectly searchable and, beyond that, copy-and-pasteable?

It's pretty much going backwards.
 

Thor
Why on earth would you print out something that is perfectly searchable and, beyond that, copy-and-pasteable?

It's pretty much going backwards.
Don't worry about why. I have my reasons, and the bigger picture will be clear later (although it's of no importance whatsoever). :)


Right, it seems this should do the trick:

$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

How long does this in-memory database exist? What will cause it to be destroyed? I need to do quite a few selects, and also update it every 10 minutes when the cron runs.
 

Hamster
Why on earth would you print out something that is perfectly searchable and beyond that copy & pasteable?

It's pretty much going backwards.
/non-developer tendencies detected

Like most things devs do in their spare time: because they can. One day he'll need to do something similar with data that is not easily searchable.
 

Hamster
How long does this in-memory database exist? What will cause it to be destroyed?
The instance losing scope
You disposing of it programmatically

Basically the life cycle of the app.
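That lifetime is easy to demonstrate with the snippet from earlier in the thread: a sqlite::memory: database lives exactly as long as the PDO connection object, which in a normal PHP setup means one request, so it cannot act as a shared cross-request cache. A small illustration (assumes the pdo_sqlite extension; table and values are made up):

```php
<?php
// The :memory: database exists only while $pdo does.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE quotes (code TEXT PRIMARY KEY, price REAL)');
$pdo->exec("INSERT INTO quotes VALUES ('JSE:WHL', 71.50)");
$price = $pdo->query("SELECT price FROM quotes WHERE code = 'JSE:WHL'")->fetchColumn();

$pdo = null; // dropping the last reference destroys the entire database
```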
 

SauRoNZA
/non-developer tendencies detected

Like most things devs do in their spare time: because they can. One day he'll need to do something similar with data that is not easily searchable.

Lol @ non-developer tendencies detected.

I've just found any kind of source material in paper form to be highly restrictive for years now.

I do take manual notes though. ;)
 

Thor
The instance losing scope
You disposing of it programmatically

Basically the life cycle of the app.

Slight mind-fck, this: running it in memory means I do not have a physical file I can open up and view. It works, but it messes with my brain at this point. Perhaps I should just use a normal SQLite file; that should work, since I am hosted on SSDs.
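A file-backed SQLite database would behave the way the thread wants: the file survives between requests, SQLite does its own locking, and the cron's writes won't corrupt a visitor's reads. A minimal sketch with a made-up path, table, and quote (assumes pdo_sqlite):

```php
<?php
// Hypothetical quotes table in an SQLite file on disk.
$pdo = new PDO('sqlite:'.sys_get_temp_dir().'/stocks.sqlite');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE IF NOT EXISTS quotes (
    code TEXT PRIMARY KEY, price REAL, updated INTEGER)');

// Cron side: upsert the latest quote every run.
$stmt = $pdo->prepare('INSERT OR REPLACE INTO quotes VALUES (?, ?, ?)');
$stmt->execute(array('JSE:SHP', 171.20, time()));

// Visitor side: plain reads, safe alongside the cron's writes.
$row = $pdo->query("SELECT code, price FROM quotes WHERE code = 'JSE:SHP'")
           ->fetch(PDO::FETCH_ASSOC);
```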
 

IndigoIdentity

Expert Member
Joined
May 10, 2010
Messages
1,964
You have clients that are each invoking a request to some external API, and the server is dealing with each of these requests. You could force the client to make the request to the Google API directly, could you not?

Take a look here too: https://codelabs.developers.google.com/codelabs/lovefield/index.html?index=../../index#0

Thing is, I'm unsure of how this would work (if it even does) when it comes to multiple users accessing the data. I figure it to be something more like a cookie, but I may be wrong? It still seems like it would bring benefits in terms of offline data storage, and that link deals with monitoring stock data, so it's applicable...

However, without creating a client-side application, and working with what you have got, you'd still need to cache the data on the server side...

So, instead of the website querying the Google API directly for each request, implement a cache for the results, so that when your server gets 10 clients all making the same query at once, it fetches and stores the data once, realises that the other 9 are asking for the exact same thing, and returns the cached result instead of querying the Google API again.

Where does it store that data? Who cares; SQLite is fine if it's just a small thing, but don't query the Google API 100 times is my point... I think that is the problem you're facing.

That sounds to me like a scalable solution: you could have many users making use of this, and it's not going to get slower and slower each time someone else joins in.
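The fetch-once-serve-many idea above can be sketched as a TTL check on the cache file: if the file is younger than the TTL, serve it; otherwise refetch and rewrite it, so a burst of simultaneous visitors costs at most one upstream call per window. fetchFromApi() below is a stand-in for the real Google Finance call, and the names are hypothetical:

```php
<?php
// Stand-in for the real API call (returns placeholder data).
function fetchFromApi() {
    return array('updated' => time(), 'quotes' => array('JSE:WHL' => 71.50));
}

// Serve from the cache file while it is fresher than $ttl seconds,
// otherwise refetch once and rewrite the file.
function cachedQuotes($cacheFile, $ttl = 600) {
    if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return json_decode(file_get_contents($cacheFile), true);
    }
    $data = fetchFromApi();
    file_put_contents($cacheFile, json_encode($data), LOCK_EX);
    return $data;
}
```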
 