#StackBounty: #php #arrays #json #laravel Reading big arrays from big json file in php

Bounty: 500

I know my question has a lot of answers on the internet, but it seems I can't find a good one, so I'll try to explain what I have and hope for the best.

What I'm trying to do is read a big JSON file that might have a more complex structure ("nested objects with big arrays") than this, but as a simple example:

{
  "data": {
    "time": [
      1,
      2,
      3,
      4,
      5,
       ...
    ],
    "values": [
      1,
      2,
      3,
      4,
      6,
       ...
    ]
  }
}

This file might be 200 MB or more, and I'm using file_get_contents() and json_decode() to read the data from the file.

Then I put the result in a variable and loop over the time array, taking each time value together with its index to get the corresponding entry from the values array by index, and save the time and value in the database. But this takes a lot of CPU and memory. Is there a better way to do this?

Better functions to use, a better JSON structure, or maybe a better data format than JSON?

My code:

$data = json_decode(file_get_contents(storage_path("test/ts/ts_big_data.json")), true);

foreach ($data["time"] as $timeIndex => $timeValue) {
    saveInDataBase($timeValue, $data["values"][$timeIndex]);
}
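The root problem with the code above is that json_decode() materializes the entire 200 MB document in memory before the loop even starts. A streaming parser avoids that. As a sketch (not the only option), the third-party halaxa/json-machine package (`composer require halaxa/json-machine`) can iterate each array lazily via a JSON Pointer, holding only one element in memory at a time; the parallel pairing below via MultipleIterator is an assumption about how to line up the two arrays, and opens the file twice:

```php
<?php
// Sketch: stream both arrays with halaxa/json-machine instead of json_decode().
// Each Items::fromFile() call opens its own read stream on the file and
// yields one array element at a time, so memory stays roughly constant.
use JsonMachine\Items;

$path = storage_path("test/ts/ts_big_data.json");

$times  = Items::fromFile($path, ['pointer' => '/data/time']);
$values = Items::fromFile($path, ['pointer' => '/data/values']);

// Advance both lazy iterators in lockstep to pair time[i] with values[i].
$pairs = new MultipleIterator(MultipleIterator::MIT_NEED_ALL);
$pairs->attachIterator($times->getIterator());
$pairs->attachIterator($values->getIterator());

foreach ($pairs as [$time, $value]) {
    saveInDataBase($time, $value);
}
```

Batching the saveInDataBase() calls into multi-row inserts (e.g. every 1,000 pairs) would also cut the database round-trip overhead considerably.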

Thanks in advance for any help.
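On the "better JSON structure" part of the question: if you control how the file is produced, NDJSON (one small JSON record per line) lets plain fgets() plus json_decode() process it with constant memory and no extra libraries. The record shape below is an assumption; the example simulates the file with an in-memory stream so it is self-contained:

```php
<?php
// Sketch: NDJSON keeps each record on its own line, so only one line is
// ever decoded at a time. php://temp stands in for the real file here.
$fh = fopen('php://temp', 'r+');
fwrite($fh, '{"time":1,"value":10}' . "\n");
fwrite($fh, '{"time":2,"value":20}' . "\n");
rewind($fh);

while (($line = fgets($fh)) !== false) {
    $row = json_decode($line, true);
    // saveInDataBase($row['time'], $row['value']); // one row at a time
    echo $row['time'], ' => ', $row['value'], "\n";
}
fclose($fh);
```

This trades the nested structure for a flat, append-friendly format, which also makes the export resumable if the import is interrupted partway through.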

Update 06/29/2020:

I have another, more complex JSON structure example:

{
      "data": {
        "set_1": {
          "sub_set_1": {
            "info_1": {
              "details_1": {
                "data_1": [1,2,3,4,5,...],
                "data_2": [1,2,3,4,5,...],
                "data_3": [1,2,3,4,5,...],
                "data_4": [1,2,3,4,5,...],
                "data_5": 10254552
              },
              "details_2": [
                [1,2,3,4,5,...],
                [1,2,3,4,5,...],
            [1,2,3,4,5,...]
              ]
            },
            "info_2": {
              "details_1": {
                "data_1": {
                  "arr_1": [1,2,3,4,5,...],
                  "arr_2": [1,2,3,4,5,...]
                },
                "data_2": {
                 "arr_1": [1,2,3,4,5,...],
                  "arr_2": [1,2,3,4,5,...]
                },
                "data_5": {
                  "text": "some text"
                }
              },
              "details_2": [1,2,3,4,5,...]
            }
          }, ...
        }, ...
      }
    } 

The file size might be around 500 MB or more, and the arrays inside this JSON file might hold around 100 MB of data or more.

And my question is: how can I get any piece of this data and navigate between nodes in the most efficient way, without using much RAM and CPU? I can't simply read the file line by line, because I need to be able to get any piece of the data when I have to.

Is Python, for example, more suitable for handling this big data more efficiently than PHP?
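For the nested structure, the language probably matters less than the parsing model: Python's streaming parsers (such as the third-party ijson) and PHP's (such as the third-party halaxa/json-machine assumed here) both work the same way, skipping past everything except the subtree you ask for. A hedged sketch, selecting one deep node from the updated example with a JSON Pointer:

```php
<?php
// Sketch, assuming the halaxa/json-machine package: a JSON Pointer selects
// one deep node, so only that subtree is decoded -- the parser scans the
// rest of the 500 MB file as a stream without building it in memory.
use JsonMachine\Items;

$items = Items::fromFile(
    storage_path("test/ts/ts_big_data.json"),
    ['pointer' => '/data/set_1/sub_set_1/info_1/details_1/data_1']
);

foreach ($items as $i => $number) {
    // Each element of data_1 arrives one at a time.
    echo $i, ': ', $number, "\n";
}
```

Each such lookup still costs a sequential scan up to the node, so if you need random access to many nodes repeatedly, importing the file once into a database (or splitting it into one file per node) will beat re-parsing the JSON every time, in either language.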

Please, if you can provide a detailed answer, I think it will be of much help for everyone who is looking to do this big-data stuff with PHP.

