I’m not able to reproduce this at all. I’m seeing the entire operation take about 7.5-8.5s total, i.e. roughly 0.7-0.8ms per transaction. Here’s the entire script I’m running:
import psycopg2
import time
import uuid
conn = psycopg2.connect(user='root', host='localhost', port=26257)
conn.set_session(autocommit=True)
cur = conn.cursor()
cur.execute("CREATE DATABASE IF NOT EXISTS test")
cur.execute('''
    DROP TABLE IF EXISTS test.transaction;
    CREATE TABLE IF NOT EXISTS test.transaction (
        id STRING(255) PRIMARY KEY,
        address STRING(255),
        arrival_time BIGINT NOT NULL,
        bundle STRING(255),
        current_index BIGINT NOT NULL,
        hash STRING(255),
        last_index BIGINT NOT NULL,
        last_modified_date_millis BIGINT NOT NULL,
        tag STRING(255),
        timestamp BIGINT NOT NULL,
        value REAL NOT NULL,
        INDEX idx_transaction_address (id, address),
        INDEX idx_transaction_bundle (id, bundle)
    )
''')
start = time.time()
print("Start time was {0}".format(start))
for i in range(10000):
    id = str(uuid.uuid4())  # For testing UUID performance
    #id = str(i)  # For testing serial integers and testing for transaction contention.
    cur.execute('''
        UPSERT INTO test.transaction
        VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)''',
        (id, 'abc', 123, 'abc', 123, 'abc', 123, 123, 'abc', 123, 123.456))
end = time.time()
print("End time was {0}. Total time was {1}. Average time was {2}".format(end, end - start, (end - start) / 10000))
Could you try executing that and see if it delivers similar results for you? That’ll at least tell us whether the latency is caused by the application or by something about the node itself.
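If the overall average looks fine but things still feel slow, it can also help to record a latency per statement and look at the tail rather than the mean, since a few slow statements can hide behind a healthy average. Here’s a minimal sketch of that idea; `time_each` and `summarize` are hypothetical helpers I’m making up for illustration, and you’d pass in a closure over the `cur.execute` call from the script above:

```python
import time
import statistics

def time_each(fn, n):
    """Call fn() n times, returning a list of per-call latencies in milliseconds."""
    latencies = []
    for _ in range(n):
        t0 = time.perf_counter()
        fn()
        latencies.append((time.perf_counter() - t0) * 1000.0)
    return latencies

def summarize(latencies):
    """Mean, median, 99th percentile, and max; a large p99/max vs. the mean
    suggests intermittent stalls (e.g. on the node) rather than uniform
    client-side overhead."""
    ordered = sorted(latencies)
    return {
        "mean_ms": statistics.mean(ordered),
        "p50_ms": ordered[len(ordered) // 2],
        "p99_ms": ordered[int(len(ordered) * 0.99)],
        "max_ms": ordered[-1],
    }
```

Then something like `summarize(time_each(lambda: cur.execute(...), 10000))` would tell you whether the latency is uniformly ~0.8ms or whether a handful of outliers are dragging the total up.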