EDIT
I have conceded a PostgreSQL profiling error in my answer below. I am updating this question to reflect the comparison between MyISAM and InnoDB.
Hello,
I ran a test against MySQL InnoDB, MyISAM, and PostgreSQL to see how well each of these engines performs a full table scan, to get a sense of what response times to expect when one inevitably has to happen.
Tests were conducted on an Intel Core 2 Quad Q6600 @ 2.4 GHz with 4 GB of RAM and a 7200 RPM hard drive with a 16 MB cache.
The MySQL version was 5.0.67-community-nt-log (32-bit); the PGSQL version was 8.4.
I wrote a small script to generate 5 million rows of data in a four-column table. These are the CREATE TABLE statements used for MySQL and PGSQL:
- InnoDB
CREATE TABLE sample_innodb (
id integer unsigned not null,
vc1 varchar(200) not null,
vc2 varchar(200) not null,
vc3 varchar(200) not null
) ENGINE=InnoDB;
- MyISAM
CREATE TABLE sample_isam (
id integer unsigned not null,
vc1 varchar(200) not null,
vc2 varchar(200) not null,
vc3 varchar(200) not null
) ENGINE=MyISAM;
- PostgreSQL
create table sample_pgsql (
id integer not null,
vc1 varchar(200) not null,
vc2 varchar(200) not null,
vc3 varchar(200) not null
);
This is the script that I used to create data for these tables:
var chars = '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz'.split('');

function randomString(length) {
    var str = '';
    for (var i = 0; i < length; i++) {
        str += chars[Math.floor(Math.random() * chars.length)];
    }
    return str;
}

function genrow(idv, vcv1, vcv2, vcv3) {
    return idv + "," + vcv1 + "," + vcv2 + "," + vcv3;
}

function gentable(numrows) {
    for (var i = 0; i < numrows; i++) {
        var row = genrow(i,
                         randomString(10),
                         randomString(20),
                         randomString(30));
        WScript.Echo(row);
    }
}

gentable(5000000);
I ran this script on Windows using the command:
cscript.exe /nologo test.js > data.csv
You can load this data into MySQL with the following commands:
LOAD DATA LOCAL INFILE 'data.csv'
INTO TABLE sample_innodb
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(id, vc1, vc2, vc3);
LOAD DATA LOCAL INFILE 'data.csv'
INTO TABLE sample_isam
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(id, vc1, vc2, vc3);
You can load data into PGSQL with this command:
copy sample_pgsql (id, vc1, vc2, vc3) from 'data.csv' with delimiter ','
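Note that a server-side COPY reads the file on the database server and (in 8.4) requires superuser rights, so the relative path above must be resolvable by the server process. From psql, the client-side \copy variant reads the file from the machine running the client instead — a sketch of the alternative invocation:

```sql
-- \copy is a psql meta-command: the file path is resolved on the client machine.
\copy sample_pgsql (id, vc1, vc2, vc3) from 'data.csv' with delimiter ','
```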
I used the following query as the benchmark, written to force a worst-case full table scan:

MySQL:
select count(*) from [table]
where vc1 like '%blah0%' and vc2 like '%blah1%' and vc3 like '%blah2%';

PostgreSQL:
select count(*) from [table]
where vc1 ilike '%blah0%' and vc2 ilike '%blah1%' and vc3 ilike '%blah2%';
I ran each query several times and averaged the completion times, discarding the first run so that everything would be loaded into memory.
The results were as follows:
- InnoDB - 8.56s
- MyISAM - 1.84s
- PGSQL - 8.4s
Question
Why is InnoDB so much slower than MyISAM on this full table scan, and is there a way to improve it? And is PostgreSQL really this slow here, or is something wrong with my setup? If there is a flaw in my methodology, please point it out.
For reference, here are the configuration settings used for MySQL and PGSQL:
MYSQL CONFIG
[client]
port=3306
[mysql]
default-character-set=utf8
[mysqld]
port=3306
basedir="C:/Program Files/MySQL/MySQL Server 5.0/"
datadir="C:/Program Files/MySQL/MySQL Server 5.0/Data/"
default-character-set=utf8
default-storage-engine=INNODB
log="c:/logs/mysql/mysqld.log"
sql-mode="STRICT_TRANS_TABLES,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION"
max_connections=700
query_cache_size=0M
table_cache=1400
tmp_table_size=16M
thread_cache_size=34
myisam_max_sort_file_size=100G
myisam_sort_buffer_size=8M
key_buffer_size=200M
read_buffer_size=64K
read_rnd_buffer_size=256K
sort_buffer_size=208K
innodb_additional_mem_pool_size=2M
innodb_flush_log_at_trx_commit=1
innodb_log_buffer_size=1M
innodb_buffer_pool_size=200M
innodb_log_file_size=18M
innodb_thread_concurrency=10
PGSQL CONFIG
listen_addresses = '*'
port = 5432
max_connections = 100
shared_buffers = 32MB
temp_buffers = 12MB
maintenance_work_mem = 32MB
log_destination = 'stderr'
logging_collector = on
log_line_prefix = '%t'
datestyle = 'iso, mdy'
lc_messages = 'English_United States.1252'
lc_monetary = 'English_United States.1252'
lc_numeric = 'English_United States.1252'
lc_time = 'English_United States.1252'
default_text_search_config = 'pg_catalog.english'
In case it is useful, here is the SHOW TABLE STATUS \G output for the two MySQL tables:
*************************** 1. row ***************************
Name: sample_innodb
Engine: InnoDB
Version: 10
Row_format: Compact
Rows: 5000205
Avg_row_length: 100
Data_length: 500154368
Max_data_length: 0
Index_length: 149700608
Data_free: 0
Auto_increment: NULL
Create_time: 2010-02-02 17:27:50
Update_time: NULL
Check_time: NULL
Collation: utf8_general_ci
Checksum: NULL
Create_options:
Comment: InnoDB free: 497664 kB
*************************** 2. row ***************************
Name: sample_isam
Engine: MyISAM
Version: 10
Row_format: Dynamic
Rows: 5000000
Avg_row_length: 72
Data_length: 360006508
Max_data_length: 281474976710655
Index_length: 1024
Data_free: 0
Auto_increment: NULL
Create_time: 2010-02-02 17:27:50
Update_time: 2010-02-02 17:37:23
Check_time: NULL
Collation: utf8_general_ci
Checksum: NULL
Create_options:
Comment: