Database can't access data after inserting a field???

Hello, I've been programming with PHP for a long time, and this is one of the weirdest things I've ever seen.

I have a big database and a website that displays this data, collected with PHP via an HTML form.

I have been inserting data into the database for 10 months straight, 10,000+ records.

Everything was fine until yesterday. If I insert any data into one specific table, no matter what, the next time I try to display the data on the website it shows an error (the else {} branch).

If I remove the data I just inserted, everything works again. My head is going to explode looking for a solution; maybe it's the easiest thing and I'm just blind and can't see it.

The code itself works perfectly: as long as I don't add any more data, the website and the PHP access work 100%.

Are there any data usage restrictions? That table has more than 6,000 fields.

Any help or suggestions greatly appreciated ^^ thanks!!!

Comments

  • @Snakone said:
    Hello, I've been programming with PHP for a long time, and this is one of the weirdest things I've ever seen.

    I have a big database and a website that displays this data, collected with PHP via an HTML form.

    I have been inserting data into the database for 10 months straight, 10,000+ records.

    Everything was fine until yesterday. If I insert any data into one specific table, no matter what, the next time I try to display the data on the website it shows an error (the else {} branch).

    If I remove the data I just inserted, everything works again. My head is going to explode looking for a solution; maybe it's the easiest thing and I'm just blind and can't see it.

    The code itself works perfectly: as long as I don't add any more data, the website and the PHP access work 100%.

    Are there any data usage restrictions? That table has more than 6,000 fields.

    Any help or suggestions greatly appreciated ^^ thanks!!!

    Just to clarify, what do you mean by fields? Does your table have 6,000 columns or 6,000 rows? 6,000 rows should be fine, but 6,000 columns would run you into MySQL's hard limits (at most 4,096 columns per table, and a 65,535-byte row size).

    You say you get a PHP error. Can you retrieve the error message from the MySQL query in your code, to rule out a code issue and figure out why the query is rejected?

  • Oh yeah, rows, my bad :dizzy:

    There's no SQL query error, apparently; I've run it manually in phpMyAdmin and it works.

    if ($row = mysql_fetch_array($result)) {
        // code that displays the data
    } else {
        // what I really get
    }

  • Oh, I found this after adding:

    error_reporting(E_ALL);
    ini_set('display_errors', 1);

    Warning: mysql_fetch_array() expects parameter 1 to be resource, boolean given in /home/vol2_2

    EDIT: I took the SQL query behind $result, ran it manually in phpMyAdmin, and I don't get this error.

  • @Snakone said:
    Oh, I found this after adding:

    error_reporting(E_ALL);
    ini_set('display_errors', 1);

    Warning: mysql_fetch_array() expects parameter 1 to be resource, boolean given in /home/vol2_2

    EDIT: I took the SQL query behind $result, ran it manually in phpMyAdmin, and I don't get this error.

    That error suggests $result is FALSE, which usually means the query failed. Can you check what error message the connection reports with var_dump(mysql_error($dbConnection));?
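
    A minimal sketch of that check, assuming the legacy mysql_* extension this thread uses (removed in PHP 7) and placeholder connection/table names:

    <?php
    // Placeholder credentials and table name, for illustration only.
    $dbConnection = mysql_connect('localhost', 'db_user', 'db_password');
    mysql_select_db('my_database', $dbConnection);

    $result = mysql_query('SELECT * FROM my_table', $dbConnection);

    if ($result === false) {
        // mysql_query() returns FALSE when the query fails, so passing
        // $result straight to mysql_fetch_array() triggers the
        // "expects parameter 1 to be resource, boolean given" warning.
        var_dump(mysql_error($dbConnection));
    } else {
        while ($row = mysql_fetch_array($result)) {
            // display one row of data
        }
    }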

  • The SELECT would examine more than MAX_JOIN_SIZE rows; check your WHERE and use SET SQL_BIG_SELECTS=1 or SET MAX_JOIN_SIZE=# if the SELECT is okay

    I knew it, it was some max size problem apparently???

  • @Snakone said:
    The SELECT would examine more than MAX_JOIN_SIZE rows; check your WHERE and use SET SQL_BIG_SELECTS=1 or SET MAX_JOIN_SIZE=# if the SELECT is okay

    I knew it, it was some max size problem apparently???

    MAX_JOIN_SIZE is a setting which limits how many rows can be compared in a single SELECT query across multiple tables. This limit is set to 4 million rows on InfinityFree. That means that your query would check over 4 million rows! Running queries that big can put substantial load on the database servers.

    So, I suppose you could call it a max size problem. However, the problem is not the size of the database, but the number of rows examined by the query. You can have multiple tables with 10 million rows each without any problems, just not when you join all the rows in those tables and return them in one go.
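
    If a SELECT that big really is intended, the workarounds named in the error message are session variables; a sketch (whether a shared host permits raising them is another matter):

    -- Run on the same connection, before the big SELECT.
    SET SQL_BIG_SELECTS = 1;   -- disables the MAX_JOIN_SIZE check for this session
    -- or raise the limit to an explicit row estimate instead:
    SET MAX_JOIN_SIZE = 8000000;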

    If your script does require you to pull over 4 million result rows, you may want to use LIMIT and OFFSET clauses to limit the number of results per query.
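
    As an illustration, a sketch of paging through results with the legacy mysql_* API (the table and id column are invented, and the error check from the earlier snippet is omitted for brevity):

    <?php
    $pageSize = 1000;  // rows per query; tune up or down as needed
    $offset   = 0;

    do {
        $sql = sprintf('SELECT * FROM my_table ORDER BY id LIMIT %d OFFSET %d',
                       $pageSize, $offset);
        $result   = mysql_query($sql, $dbConnection);
        $rowCount = mysql_num_rows($result);

        while ($row = mysql_fetch_array($result)) {
            // process one row
        }

        $offset += $pageSize;
    } while ($rowCount === $pageSize);  // a short page means we reached the end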

    If your script does not retrieve that many rows, some of your query clauses may inadvertently be scanning the entire table. If that's the case, you can use EXPLAIN queries to figure out which tables are being scanned and either restructure your query or add indexes to the table to optimize it.
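
    For example (invented table and column names; substitute your own query):

    -- Prefix the query with EXPLAIN in phpMyAdmin to see which tables are
    -- scanned in full and which keys are used.
    EXPLAIN SELECT a.*, b.*
    FROM table_a AS a
    INNER JOIN table_b AS b ON b.a_id = a.id
    WHERE a.created_at >= '2019-01-01';

    -- If a table shows no usable key, an index on the join or filter
    -- column usually removes the full scan:
    CREATE INDEX idx_b_a_id ON table_b (a_id);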

  • As a first fix, everything is going fine!!! Many thanks for your support :)

    I have one table with 600 rows and I join it with another with 6,000.

    What do you think would be a good LIMIT to set?

    EDIT: BTW, I'm really happy with your service so far: bandwidth, speed, support, PHP, no ads, etc.

    If my project keeps going as well as it has so far, I'm thinking of upgrading my account soon. I need an SSL certificate, and I've never done something like that before, but you guys rock!

  • Somehow I can't re-edit the post, sorry about that.

    I got this when I ran EXPLAIN on the query:

  • @Snakone said:
    As a first fix, everything is going fine!!! Many thanks for your support :)

    I have one table with 600 rows and I join it with another with 6,000.

    What do you think would be a good LIMIT to set?

    EDIT: BTW, I'm really happy with your service so far: bandwidth, speed, support, PHP, no ads, etc.

    If my project keeps going as well as it has so far, I'm thinking of upgrading my account soon. I need an SSL certificate, and I've never done something like that before, but you guys rock!

    I don't really know what a good LIMIT size would be. You could start with 1000 rows and increase or decrease it from there as necessary.

    @Snakone said:
    Somehow I can't re-edit the post, sorry about that.

    I got this when I ran EXPLAIN on the query:

    That doesn't tell us that much without knowing the corresponding query text or database schema.

    You may want to search the web for guides on how to make sense of the EXPLAIN results and use it to optimize your query. This is an article I found which looks pretty detailed: https://www.sitepoint.com/using-explain-to-write-better-mysql-queries/

    But what you can see in the results is that just over 6,000 rows are checked, which is nowhere near the 4 million rows that caused the error. Are you sure you are running the exact same query with the same parameters as the one your script crashed on before?

  • Yes, it's the same! It was solved just by putting LIMIT at the end of the query.

    Yes, it has 6,000 rows, but with the WHERE clause the result is at most 30 or 40 rows. Anyway, the server does the whole inner join before applying the WHERE, right? Or that's what I remember :D

  • @Snakone said:
    Yes, it's the same! It was solved just by putting LIMIT at the end of the query.

    So in your script you only get 30-40 results, but it fails unless you add a LIMIT clause that is far, far higher than that? That's very odd.

    @Snakone said:
    Yes, it has 6,000 rows, but with the WHERE clause the result is at most 30 or 40 rows. Anyway, the server does the whole inner join before applying the WHERE, right? Or that's what I remember :D

    Well, the EXPLAIN results say that the 5,633-row select is "Using where; Using join buffer", which I'd say confirms what you remember.

    However, scanning 6,000 rows for only a few dozen results does underline that you clearly need to add some good indexes to your table.
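
    As a sketch, assuming the WHERE clause filters on a hypothetical date column and the join matches on a hypothetical foreign key (replace these with the columns your query actually uses):

    -- Index the column(s) your WHERE clause filters on...
    CREATE INDEX idx_small_filter ON small_table (event_date);
    -- ...and the column the join matches on.
    CREATE INDEX idx_big_join ON big_table (small_table_id);

    -- Re-running EXPLAIN afterwards should show these indexes in the "key"
    -- column instead of a full scan of all 6,000 rows.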
