Compare commits


59 Commits

SHA1 Message Date
ea4efc5a2b Updated server code. 2018-06-07 21:36:06 +02:00
26c0b469eb restore restore_model 2018-05-22 20:49:10 +02:00
f170bad9b1 tesauro fat and diffs in values 2018-05-22 15:39:14 +02:00
6e061171da rm TODO 2018-05-22 15:38:04 +02:00
40c228ef01 pubeval tests 2018-05-22 15:36:23 +02:00
c2c6c89e9f Merge branch 'experimentation' into 'master'
Experimentation

See merge request Pownie/backgammon!8
2018-05-22 13:16:10 +00:00
b7708b3675 train-evaluate-save 2018-05-22 15:15:36 +02:00
bad870c27a update 0-ply-tests 2018-05-22 15:15:15 +02:00
653d6e30a8 add missing comma 2018-05-22 15:12:47 +02:00
7e51b44e33 Merge branch 'experimentation' into 'master'
tesauro fat and diffs in values

See merge request Pownie/backgammon!7
2018-05-22 13:12:10 +00:00
1fd6c35baa Merge branch 'master' into 'experimentation'
# Conflicts:
#   main.py
2018-05-22 13:11:43 +00:00
d426c1c3b5 tesauro fat and diffs in values 2018-05-22 15:10:41 +02:00
5ab144cffc add git commit status to all logs 2018-05-22 14:44:13 +02:00
cef8e54709 Merge branch 'master' of gitfub.space:Pownie/backgammon 2018-05-22 14:37:46 +02:00
2efbc446f2 log git commit status in evaluation logs 2018-05-22 14:37:27 +02:00
c54f7aca24 Merge branch 'experimentation' into 'master'
Experimentation

See merge request Pownie/backgammon!6
2018-05-22 12:36:37 +00:00
c31bc39780 More server 2018-05-22 00:26:32 +02:00
6133cb439f Merge remote-tracking branch 'origin/experimentation' into experimentation 2018-05-20 20:15:57 +02:00
5acd79b6da Slight modification to move calculation 2018-05-20 19:43:28 +02:00
b11e783b30 add 0-ply-tests 2018-05-20 18:50:28 +02:00
f834b10e02 remove unnecessary print 2018-05-20 16:52:05 +02:00
72f01a2a2d remove dependency on yaml 2018-05-20 16:03:58 +02:00
d14e6c5994 Everything might work, except for quad, that might be bugged. 2018-05-20 00:38:13 +02:00
a266293ecd Stuff is happening, moving is better! 2018-05-19 22:01:55 +02:00
e9a46c79df server and stuff 2018-05-19 14:12:13 +02:00
816cdfae00 fix and clean 2018-05-18 14:55:10 +02:00
ff9664eb38 Merge branch 'eager_eval' into 'master'
Eager eval

See merge request Pownie/backgammon!5
2018-05-18 12:06:12 +00:00
3e379b40c4 Accidentally added a '5' in the middle of a variable. 2018-05-16 00:20:54 +02:00
90fad334b9 More optimizations. 2018-05-15 23:37:35 +02:00
a77c13a0a4 1-ply runs even faster. 2018-05-15 19:29:27 +02:00
260c32d909 oiuhhiu 2018-05-15 18:16:44 +02:00
00974b0f11 Added '--play' flag, so you can now play against the ai. 2018-05-14 13:07:48 +02:00
2c02689577 Merge remote-tracking branch 'origin/eager_eval' into eager_eval 2018-05-13 23:55:02 +02:00
926a331df0 Some flags from main.py is gone, rolls now allow a face_value of 0 yet
again and it is possible to play against the ai. There is no flag
for this yet, so this has to be added.
2018-05-13 23:54:13 +02:00
d932663519 add explanation of ply speedup 2018-05-13 22:26:24 +02:00
2312c9cb2a Merge branch 'eager_eval' of gitfub.space:Pownie/backgammon into eager_eval 2018-05-12 15:19:12 +02:00
9f1bd56c0a fix bear_off bug; addtional tests and additional fixes 2018-05-12 15:18:52 +02:00
ba4ef86bb5 Board rep can now be inferred from file after being given once.
We can also evaluate multiple times by using the flag "--repeat-eval".
The flag defaults to 1, if not provided.
2018-05-12 12:14:47 +02:00
c3f5e909d6 flip is back 2018-05-11 21:47:48 +02:00
1aa9cf705f quack without leaks 2018-05-11 21:24:10 +02:00
383dd7aa4b code works again; quack gave ~3 times improvement for calc_moves 2018-05-11 20:13:43 +02:00
93188fe06b more quack for board 2018-05-11 20:07:27 +02:00
ffbc98e1a2 quack kind of works 2018-05-11 19:00:39 +02:00
03e61a59cf quack 2018-05-11 17:29:22 +02:00
93224864a4 More comments, backprop have been somewhat tested in the eager_main.py
and normal_main.py.
2018-05-11 13:35:01 +02:00
504308a9af Yet another input argument, "--ply", 0 for no look-ahead, 1 for a single
look-ahead.
2018-05-10 23:22:41 +02:00
3b57c10b5a Saves calling tf.reduce_mean on all values once. 2018-05-10 22:57:27 +02:00
4fa10861bb update TF dependency to 1.8.0 2018-05-10 19:27:51 +02:00
6131d5b5f4 Added comments for Christoffer! 2018-05-10 19:25:28 +02:00
1aedc23de1 1-ply now works again. 2018-05-10 19:13:18 +02:00
2d84cd5a0b 1-ply now works again. 2018-05-10 19:06:53 +02:00
396d5b036d All values for boards and all rolls can now be calculated 2018-05-10 18:41:21 +02:00
4efb229d34 Added a lot of comments 2018-05-10 15:28:33 +02:00
f2a67ca92e All board reps should now work as input. 2018-05-10 10:49:25 +02:00
9cfdd7e2b2 Added a verbosity flag, --verbose, which allows for printing of
variables and such.
2018-05-10 10:39:22 +02:00
6429e0732c We should now be able to both train and eval as per usual.
I've added a file "global_step", which works as the new global_step
counter, so we can use it for exp_decay.
2018-05-09 23:15:35 +02:00
cb7e7b519c Getting closer to functionality. We're capable of evaluating moves
and a rework of global_step has begun, such that we now use
episode_count as a way of calculating exp_decay, which have been
implemented as a function.
2018-05-09 22:22:12 +02:00
9a2d87516e Ongoing rewrite of network to use an eager model. We're now capable of
evaluating a list of states with network.py. We can also save and
restore models.
2018-05-09 00:33:05 +02:00
7b308be4e2 Different implementations of different speed 2018-05-07 22:24:47 +02:00
345 changed files with 1410 additions and 1575 deletions

app.py

@@ -2,27 +2,11 @@ from flask import Flask, request, jsonify
from flask_json import FlaskJSON, as_json_p
from flask_cors import CORS
from board import Board
import tensorflow as tf
from eval import Eval
import argparse
import main
import random
from network import Network
parser = argparse.ArgumentParser(description="Backgammon games")
parser.add_argument('--model', action='store', dest='model',
default='player_testings',
help='name of Tensorflow model to use')
parser.add_argument('--board-rep', action='store', dest='board_rep',
default='tesauro',
help='name of board representation to use as input to neural network')
args = parser.parse_args()
app = Flask(__name__)
@@ -33,14 +17,13 @@ json = FlaskJSON(app)
CORS(app)
config = main.config.copy()
config['model'] = args.model
config['board_representation'] = args.board_rep
config['model'] = "player_testings"
config['ply'] = "0"
config['board_representation'] = 'tesauro'
network = Network(config, config['model'])
sess = tf.Session()
sess.run(tf.global_variables_initializer())
network.restore_model(sess)
network.restore_model()
def calc_move_sets(from_board, roll, player):
board = from_board
@@ -57,8 +40,8 @@ def calc_move_sets(from_board, roll, player):
def tmp_name(from_board, to_board, roll, player, total_moves, is_quad=False):
sets = calc_move_sets(from_board, roll, player)
return_board = from_board
# print("To board:\n",to_board)
# print("All sets:\n",sets)
print("To board:\n",to_board)
print("All sets:\n",sets)
for idx, board_set in enumerate(sets):
board_set[0] = list(board_set[0])
# print(to_board)
@@ -102,32 +85,6 @@ def check_move(prev, curr):
return any(truth_list)
@app.route('/pubeval_move', methods=['POST'])
def pubeval_move():
data = request.get_json(force=True)
board = [int(x) for x in data['board'].split(',')]
player = int(data['player'])
roll = [int(x) for x in data['roll'].split(',')]
board, value = Eval.make_pubeval_move(tuple(board), player, roll)
print("Doing pubeval move")
return ",".join([str(x) for x in list(board)])
@app.route('/network_move', methods=['POST'])
def network_move():
data = request.get_json(force=True)
board = [int(x) for x in data['board'].split(',')]
player = int(data['player'])
roll = [int(x) for x in data['roll'].split(',')]
board, value = network.make_move(sess, tuple(board), roll, player)
print("Doing network move")
return ",".join([str(x) for x in list(board)])
@app.route('/bot_move', methods=['POST'])
def bot_move():
@@ -141,7 +98,7 @@ def bot_move():
if use_pubeval:
board, value = Eval.make_pubeval_move(tuple(board), 1, roll)
else:
board, _ = network.make_move(sess, tuple(board), roll, 1)
board, _ = network.make_move(tuple(board), roll, 1)
# print("Board!:",board)
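The app.py hunks above drop the argparse flags and the TensorFlow session in favour of hard-coded overrides on top of `main.config`, and `restore_model()` no longer takes a session. The override step can be sketched as follows (key names taken from the diff; `base_config` is a hypothetical stand-in for `main.config`):

```python
def server_config(base):
    """Copy the base config and pin the values app.py now hard-codes."""
    config = base.copy()
    config['model'] = 'player_testings'
    config['ply'] = '0'
    config['board_representation'] = 'tesauro'
    return config

# Usage: config = server_config(main.config), then Network(config, config['model'])
```

Copying before mutating keeps `main.config` usable by other entry points that still read the defaults.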

bin/0-ply-tests.rb (new file)

@@ -0,0 +1,78 @@
def run_stuff(board_rep, model_name, ply)
epi_count = 0
system("python3 main.py --train --model #{model_name} --board-rep #{board_rep} --episodes 1 --ply #{ply}")
while epi_count < 200000 do
system("python3 main.py --eval --model #{model_name} --eval-methods dumbeval --episodes 250 --ply #{ply} --repeat-eval 3")
system("python3 main.py --eval --model #{model_name} --eval-methods pubeval --episodes 250 --ply #{ply} --repeat-eval 3")
system("python3 main.py --train --model #{model_name} --episodes 2000 --ply #{ply}")
epi_count += 2000
end
end
### ///////////////////////////////////////////////////////////////
# QUACK TESTINGS
### ///////////////////////////////////////////////////////////////
board_rep = "quack"
model_name = "quack_test_0_ply"
ply = 0
run_stuff(board_rep, model_name, ply)
# board_rep = "quack"
# model_name = "quack_test_1_ply"
# ply = 1
# run_stuff(board_rep, model_name, ply)
### ///////////////////////////////////////////////////////////////
# QUACK-FAT TESTING
### ///////////////////////////////////////////////////////////////
board_rep = "quack-fat"
model_name = "quack-fat_test_0_ply"
ply = 0
run_stuff(board_rep, model_name, ply)
# board_rep = "quack-fat"
# model_name = "quack-fat_test_1_ply"
# ply = 1
# run_stuff(board_rep, model_name, ply)
### ///////////////////////////////////////////////////////////////
# QUACK-NORM TESTING
### ///////////////////////////////////////////////////////////////
board_rep = "quack-norm"
model_name = "quack-norm_test_0_ply"
ply = 0
run_stuff(board_rep, model_name, ply)
# board_rep = "quack-norm"
# model_name = "quack-norm_test_1_ply"
# ply = 1
# run_stuff(board_rep, model_name, ply)
### ///////////////////////////////////////////////////////////////
# TESAURO TESTING
### ///////////////////////////////////////////////////////////////
board_rep = "tesauro"
model_name = "tesauro_test_0_ply"
ply = 0
run_stuff(board_rep, model_name, ply)
# board_rep = "tesauro"
# model_name = "tesauro_test_1_ply"
# ply = 1
# run_stuff(board_rep, model_name, ply)


@@ -1,90 +0,0 @@
def run_stuff(board_rep, model_name)
epi_count = 0
system("python3 main.py --train --model #{model_name} --board-rep #{board_rep} --episodes 1 --force-creation")
while epi_count < 200000 do
for _ in (1..3) do
system("python3 main.py --eval --model #{model_name} --board-rep #{board_rep} --eval-methods dumbeval --episodes 250")
end
for _ in (1..3) do
system("python3 main.py --eval --model #{model_name} --board-rep #{board_rep} --eval-methods pubeval --episodes 250")
end
system("python3 main.py --train --model #{model_name} --board-rep #{board_rep} --episodes 2000")
epi_count += 2000
end
end
### ///////////////////////////////////////////////////////////////
# QUACK TESTINGS
### ///////////////////////////////////////////////////////////////
board_rep = "quack"
model_name = "quack_test_0_ply"
#run_stuff(board_rep, model_name)
#board_rep = "quack"
#model_name = "quack_test_1_ply"
#
#run_stuff(board_rep, model_name)
### ///////////////////////////////////////////////////////////////
# QUACK-FAT TESTING
### ///////////////////////////////////////////////////////////////
board_rep = "quack-fat"
model_name = "quack-fat_test_0_ply"
#run_stuff(board_rep, model_name)
#board_rep = "quack-fat"
#model_name = "quack-fat_test_1_ply"
#
#run_stuff(board_rep, model_name)
### ///////////////////////////////////////////////////////////////
# QUACK-NORM TESTING
### ///////////////////////////////////////////////////////////////
board_rep = "quack-norm"
model_name = "quack-norm_test_0_ply"
#run_stuff(board_rep, model_name)
#board_rep = "quack-norm"
#model_name = "quack-norm_test_1_ply"
#
#run_stuff(board_rep, model_name)
### ///////////////////////////////////////////////////////////////
# TESAURO TESTING
### ///////////////////////////////////////////////////////////////
board_rep = "tesauro"
model_name = "tesauro_test3_0_ply"
run_stuff(board_rep, model_name)
#board_rep = "tesauro"
#model_name = "tesauro_test_1_ply"
#
#run_stuff(board_rep, model_name)


@@ -1,30 +1,30 @@
#!/usr/bin/env ruby
MODELS_DIR = 'models'
def save(model_name)
require 'date'
models_dir = 'models'
model_path = File.join(models_dir, model_name)
if not File.exists? model_path then
return false
end
model_path = File.join(MODELS_DIR, model_name)
episode_count = (File.read File.join(model_path, 'episodes_trained')).to_i
puts "Found model #{model_name} with episodes #{episode_count} trained!"
file_name = "model-#{model_name}-#{episode_count}-#{Time.now.strftime('%Y%m%d-%H%M%S')}.tar.gz"
save_path = File.join(models_dir, 'saves', file_name)
save_path = File.join(MODELS_DIR, 'saves', file_name)
puts "Saving to #{save_path}"
system("tar", "-cvzf", save_path, "-C", models_dir, model_name)
return true
system("tar", "-cvzf", save_path, "-C", MODELS_DIR, model_name)
end
def train(model, episodes)
system("python3", "main.py", "--train", "--model", model, "--episodes", episodes.to_s)
end
def force_train(model, episodes)
system("python3", "main.py", "--train", "--force-creation", "--model", model, "--episodes", episodes.to_s)
end
def evaluate(model, episodes, method)
system("python3", "main.py", "--eval" , "--model", model, "--episodes", episodes.to_s, "--eval-methods", method)
end
@@ -33,11 +33,9 @@ model = ARGV[0]
if model.nil? then raise "no model specified" end
while true do
if not File.exists? File.join(MODELS_DIR, model) then
force_train model, 10
save model
train model, 1000
save model
train model, 1000
3.times do
evaluate model, 250, "pubeval"
end
@@ -45,3 +43,27 @@ while true do
evaluate model, 250, "dumbeval"
end
end
# while true do
# save model
# train model, 1000
# save model
# train model, 1000
# 3.times do
# evaluate model, 250, "pubeval"
# end
# 3.times do
# evaluate model, 250, "dumbeval"
# end
# end
while true do
save model
train model, 500
5.times do
evaluate model, 250, "pubeval"
end
5.times do
evaluate model, 250, "dumbeval"
end
end

board.py

@@ -1,3 +1,4 @@
import quack
import numpy as np
import itertools
@@ -12,15 +13,9 @@ class Board:
@staticmethod
def idxs_with_checkers_of_player(board, player):
idxs = []
for idx, checker_count in enumerate(board):
if checker_count * player >= 1:
idxs.append(idx)
return idxs
return quack.idxs_with_checkers_of_player(board, player)
# TODO: Write a test for this
# TODO: Make sure that the bars fit, 0 represents the -1 player and 25 represents the 1 player
# index 26 is player 1 home, index 27 is player -1 home
@staticmethod
def board_features_to_pubeval(board, player):
@@ -40,19 +35,19 @@
def board_features_quack(board, player):
board = list(board)
board += ([1, 0] if np.sign(player) > 0 else [0, 1])
return np.array(board).reshape(1, -1)
return np.array(board).reshape(1,28)
# quack-fat
@staticmethod
def board_features_quack_fat(board, player):
board = list(board)
positives = [x if x > 0 else 0 for x in board]
negatives = [x if x < 0 else 0 for x in board]
board.append( 15 - sum(positives))
board.append(-15 - sum(negatives))
board += ([1, 0] if np.sign(player) > 0 else [0, 1])
return np.array(board).reshape(1,-1)
return np.array(quack.board_features_quack_fat(board,player)).reshape(1,30)
# board = list(board)
# positives = [x if x > 0 else 0 for x in board]
# negatives = [x if x < 0 else 0 for x in board]
# board.append( 15 - sum(positives))
# board.append(-15 - sum(negatives))
# board += ([1, 0] if np.sign(player) > 0 else [0, 1])
# return np.array(board).reshape(1,30)
# quack-fatter
@staticmethod
@@ -68,7 +63,7 @@ class Board:
board.append(15 - sum(positives))
board.append(-15 - sum(negatives))
board += ([1, 0] if np.sign(player) > 0 else [0, 1])
return np.array(board).reshape(1, -1)
return np.array(board).reshape(1, 30)
# tesauro
@staticmethod
@@ -97,35 +92,47 @@ class Board:
board_rep += bar_trans(board, player)
board_rep += (15 - Board.num_of_checkers_for_player(board, player),)
board_rep += ([1,0] if cur_player == 1 else [1,0])
board_rep += ([1, 0] if cur_player == 1 else [0, 1])
return np.array(board_rep).reshape(1, 198)
return np.array(board_rep).reshape(1,198)
@staticmethod
def board_features_tesauro_fat(board, cur_player):
def ordinary_trans(val, player):
abs_val = val*player
if abs_val <= 0:
return (0, 0, 0, 0, 0, 0, 0, 0, 0)
return (0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
elif abs_val == 1:
return (1, 0, 0, 0, 0, 0, 0, 0, 0)
return (1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
elif abs_val == 2:
return (1, 1, 0, 0, 0, 0, 0, 0, 0)
return (1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
elif abs_val == 3:
return (1, 1, 1, 0, 0, 0, 0, 0, 0)
return (1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
elif abs_val == 4:
return (1, 1, 1, 1, 0, 0, 0, 0, 0)
return (1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
elif abs_val == 5:
return (1, 1, 1, 1, 1, 0, 0, 0, 0)
return (1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
elif abs_val == 6:
return (1, 1, 1, 1, 1, 1, 0, 0, 0)
return (1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0)
elif abs_val == 7:
return (1, 1, 1, 1, 1, 1, 1, 0, 0)
return (1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0)
elif abs_val == 8:
return (1, 1, 1, 1, 1, 1, 1, 1, 0)
else:
return (1, 1, 1, 1, 1, 1, 1, 1, (abs_val - 9) / 2)
return (1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0)
elif abs_val == 9:
return (1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0)
elif abs_val == 10:
return (1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0)
elif abs_val == 11:
return (1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0)
elif abs_val == 12:
return (1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0, 0)
elif abs_val == 13:
return (1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0)
elif abs_val == 14:
return (1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0)
elif abs_val == 15:
return (1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1)
def bar_trans(board, player):
if player == 1: return (abs(board[0]/2),)
@@ -138,7 +145,7 @@ class Board:
board_rep += bar_trans(board, player)
board_rep += (15 - Board.num_of_checkers_for_player(board, player),)
board_rep += ([1, 0] if cur_player == 1 else [1,0])
board_rep += ([1, 0] if cur_player == 1 else [0, 1])
return np.array(board_rep).reshape(1, len(board_rep))
@@ -165,105 +172,15 @@ class Board:
# Calculate how many pieces there must be in the home state and divide it by 15
features.append((15 - sum) / 15)
features += ([1,0] if np.sign(cur_player) > 0 else [0,1])
test = np.array(features).reshape(1,-1)
test = np.array(features)
#print("TEST:",test)
return test
return test.reshape(1,198)
@staticmethod
def is_move_valid(board, player, face_value, move):
if face_value == 0:
return True
else:
def sign(a):
return (a > 0) - (a < 0)
from_idx = move[0]
to_idx = move[1]
to_state = None
from_state = board[from_idx]
delta = to_idx - from_idx
direction = sign(delta)
bearing_off = None
# FIXME: Use get instead of array-like indexing
if to_idx >= 1 and to_idx <= 24:
to_state = board[to_idx]
bearing_off = False
else: # Bearing off
to_state = 0
bearing_off = True
# print("_"*20)
# print("board:", board)
# print("to_idx:", to_idx, "board[to_idx]:", board[to_idx], "to_state:", to_state)
# print("+"*20)
def is_forward_move():
return direction == player
def face_value_match_move_length():
return abs(delta) == face_value
def bear_in_if_checker_on_bar():
if player == 1:
bar = 0
else:
bar = 25
bar_state = board[bar]
if bar_state != 0:
return from_idx == bar
else:
return True
def checkers_at_from_idx():
return sign(from_state) == player
def no_block_at_to_idx():
if -sign(to_state) == player:
return abs(to_state) == 1
else:
return True
def can_bear_off():
checker_idxs = Board.idxs_with_checkers_of_player(board, player)
def moving_directly_off():
if player == 1:
return to_idx == 25;
if player == -1:
return to_idx == 0;
def is_moving_backmost_checker():
if player == 1:
return all([(idx >= from_idx) for idx in checker_idxs])
else:
return all([(idx <= from_idx) for idx in checker_idxs])
def all_checkers_in_last_quadrant():
if player == 1:
return all([(idx >= 19) for idx in checker_idxs])
else:
return all([(idx <= 6) for idx in checker_idxs])
return all([ moving_directly_off() or is_moving_backmost_checker(),
all_checkers_in_last_quadrant() ])
# TODO: add switch here instead of wonky ternary in all
# print("is_forward:",is_forward_move())
# print("face_value:",face_value_match_move_length())
# print("Checkes_at_from:",checkers_at_from_idx())
# print("no_block:",no_block_at_to_idx())
return all([ is_forward_move(),
face_value_match_move_length(),
bear_in_if_checker_on_bar(),
checkers_at_from_idx(),
no_block_at_to_idx(),
can_bear_off() if bearing_off else True ])
return quack.is_move_valid(board, player, face_value, move)
@staticmethod
def any_move_valid(board, player, roll):
@@ -303,40 +220,37 @@ class Board:
@staticmethod
def apply_moves_to_board(board, player, moves):
for move in moves:
from_idx, to_idx = move.split("/")
board[int(from_idx)] -= int(player)
board[int(to_idx)] += int(player)
return board
def apply_moves_to_board(board, player, move):
from_idx = move[0]
to_idx = move[1]
board = list(board)
board[from_idx] -= player
if (to_idx < 1 or to_idx > 24):
return
if (board[to_idx] * player == -1):
if (player == 1):
board[25] -= player
else:
board[0] -= player
board[to_idx] = 0
board[to_idx] += player
return tuple(board)
@staticmethod
def calculate_legal_states(board, player, roll):
# Find all points with checkers on them belonging to the player
# Iterate through each index and check if it's a possible move given the roll
# TODO: make sure that it is not possible to do nothing on first part of
# turn and then do something with the second die
def calc_moves(board, face_value):
idxs_with_checkers = Board.idxs_with_checkers_of_player(board, player)
if len(idxs_with_checkers) == 0:
if face_value == 0:
return [board]
boards = [(Board.do_move(board,
player,
(idx, idx + (face_value * player)))
if Board.is_move_valid(board,
player,
face_value,
(idx, idx + (face_value * player)))
else None)
for idx in idxs_with_checkers]
# print("pls:",boards)
board_list = list(filter(None, boards)) # Remove None-values
# if len(board_list) == 0:
# return [board]
# print("board list:", board_list)
return board_list
return quack.calc_moves(board, player, face_value)
# Problem with cal_moves: Method can return empty list (should always contain at least same board).
# *Update*: Seems to be fixed.
@@ -350,23 +264,17 @@ class Board:
if not Board.any_move_valid(board, player, roll):
return { board }
dice_permutations = list(itertools.permutations(roll)) if roll[0] != roll[1] else [[roll[0]]*4]
#print("Permuts:",dice_permutations)
# print("Dice permuts:",dice_permutations)
for roll in dice_permutations:
# Calculate boards resulting from first move
#print("initial board: ", board)
#print("roll:", roll)
boards = calc_moves(board, roll[0])
#print("boards after first die: ", boards)
for die in roll[1:]:
# Calculate boards resulting from second move
nested_boards = [calc_moves(board, die) for board in boards]
#print("nested boards: ", nested_boards)
boards = [board for boards in nested_boards for board in boards]
# What the fuck
#for board in boards:
# print(board)
# print("type__:",type(board))
# Add resulting unique boards to set of legal boards resulting from roll
#print("printing boards from calculate_legal_states: ", boards)
@@ -395,9 +303,9 @@ class Board:
return """
13 14 15 16 17 18 19 20 21 22 23 24
+--------------------------------------------------------------------------+
| {13}| {14}| {15}| {16}| {17}| {18}| bar -1: {25} | {19}| {20}| {21}| {22}| {23}| {24}| end -1: TODO|
| {13}| {14}| {15}| {16}| {17}| {18}| bar -1: {25} | {19}| {20}| {21}| {22}| {23}| {24}| end 1: TODO|
|---|---|---|---|---|---|------------|---|---|---|---|---|---| |
| {12}| {11}| {10}| {9}| {8}| {7}| bar 1: {0} | {6}| {5}| {4}| {3}| {2}| {1}| end 1: TODO|
| {12}| {11}| {10}| {9}| {8}| {7}| bar 1: {0} | {6}| {5}| {4}| {3}| {2}| {1}| end -1: TODO|
+--------------------------------------------------------------------------+
12 11 10 9 8 7 6 5 4 3 2 1
""".format(*temp)
@@ -405,42 +313,8 @@ class Board:
@staticmethod
def do_move(board, player, move):
# Implies that move is valid; make sure to check move validity before calling do_move(...)
def move_to_bar(board, to_idx):
board = list(board)
if player == 1:
board[25] -= player
else:
board[0] -= player
board[to_idx] = 0
return board
return quack.do_move(board, player, move)
# TODO: Moving in from bar is handled by the representation
# TODONE: Handle bearing off
from_idx = move[0]
#print("from_idx: ", from_idx)
to_idx = move[1]
#print("to_idx: ", to_idx)
# pdb.set_trace()
board = list(board) # Make mutable copy of board
# 'Lift' checker
board[from_idx] -= player
# Handle bearing off
if to_idx < 1 or to_idx > 24:
return tuple(board)
# Handle hitting checkers
if board[to_idx] * player == -1:
board = move_to_bar(board, to_idx)
# Put down checker
board[to_idx] += player
return tuple(board)
@staticmethod
def flip(board):
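Several Board methods in the diff above now delegate straight to `quack`, which appears to be a compiled extension module (the "quack gave ~3 times improvement for calc_moves" commit suggests it is a fast C implementation of the hot paths). A common pattern for this kind of migration is an optional import with a pure-Python fallback; the sketch below illustrates it with a deliberately simplified `_py_do_move` stand-in for the removed Python body (hits and bar handling omitted, so it is not the full rule set):

```python
try:
    import quack            # compiled fast path, if the extension is built
    _HAVE_QUACK = True
except ImportError:
    _HAVE_QUACK = False     # fall back to pure Python

def _py_do_move(board, player, move):
    # Simplified stand-in for the old Python implementation:
    # lift the checker from its point and put it down on the target.
    # Bearing off (target outside 1..24) just removes the checker.
    board = list(board)
    from_idx, to_idx = move
    board[from_idx] -= player
    if 1 <= to_idx <= 24:
        board[to_idx] += player
    return tuple(board)

def do_move(board, player, move):
    if _HAVE_QUACK:
        return quack.do_move(board, player, move)
    return _py_do_move(board, player, move)
```

This keeps the module importable on machines where the extension is not compiled, at the cost of the slower path.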

bot.py

@@ -1,24 +1,8 @@
from cup import Cup
from network import Network
from board import Board
import tensorflow as tf
import numpy as np
import random
class Bot:
def __init__(self, sym, config = None, name = "unnamed"):
self.config = config
self.cup = Cup()
def __init__(self, sym):
self.sym = sym
self.graph = tf.Graph()
self.network = Network(config, name)
self.network.restore_model()
def restore_model(self):
with self.graph.as_default():
self.network.restore_model()
def get_session(self):
return self.session
@@ -26,16 +10,60 @@ class Bot:
def get_sym(self):
return self.sym
def get_network(self):
return self.network
# TODO: DEPRECATE
def make_move(self, board, sym, roll):
# print(Board.pretty(board))
legal_moves = Board.calculate_legal_states(board, sym, roll)
moves_and_scores = [ (move, self.network.eval_state(np.array(move).reshape(1,26))) for move in legal_moves ]
scores = [ x[1] for x in moves_and_scores ]
best_move_pair = moves_and_scores[np.array(scores).argmax()]
#print("Found the best state, being:", np.array(move_scores).argmax())
return best_move_pair
def calc_move_sets(self, from_board, roll, player):
board = from_board
sets = []
total = 0
print("board!:",board)
for r in roll:
# print("Value of r:",r)
sets.append([Board.calculate_legal_states(board, player, [r,0]), r])
total += r
sets.append([Board.calculate_legal_states(board, player, [total,0]), total])
return sets
def handle_move(self, from_board, to_board, roll, player):
# print("Cur board:",board)
sets = self.calc_move_sets(from_board, roll, player)
for idx, board_set in enumerate(sets):
board_set[0] = list(board_set[0])
# print("My board_set:",board_set)
if to_board in [list(c) for c in board_set[0]]:
self.total_moves -= board_set[1]
if idx < 2:
# print("Roll object:",self.roll)
self.roll[idx] = 0
else:
self.roll = [0,0]
break
print("Total moves left:",self.total_moves)
def tmp_name(self, from_board, to_board, roll, player, total_moves):
sets = self.calc_move_sets(from_board, roll, player)
return_board = from_board
for idx, board_set in enumerate(sets):
board_set = list(board_set[0])
if to_board in [list(board) for board in board_set]:
total_moves -= board_set[1]
# if it's not the sum of the moves
if idx < 2:
roll[idx] = 0
else:
roll = [0,0]
return_board = to_board
break
return total_moves, roll, return_board
def make_human_move(self, board, player, roll):
total_moves = roll[0] + roll[1]
previous_board = board
while total_moves != 0:
move = input("Pick a move!\n")
to_board = Board.apply_moves_to_board(previous_board, player, move)
total_moves, roll, board = self.tmp_name(board, to_board, roll, player, total_moves)

main.py

@@ -2,6 +2,7 @@ import argparse
import sys
import os
import time
import subprocess
# Parse command line arguments
parser = argparse.ArgumentParser(description="Backgammon games")
@@ -31,17 +32,17 @@ parser.add_argument('--train-perpetually', action='store_true',
help='start new training session as soon as the previous is finished')
parser.add_argument('--list-models', action='store_true',
help='list all known models')
parser.add_argument('--force-creation', action='store_true',
help='force model creation if model does not exist')
parser.add_argument('--board-rep', action='store', dest='board_rep',
default='tesauro',
help='name of board representation to use as input to neural network')
parser.add_argument('--verbose', action='store_true',
help='If set, a lot of stuff will be printed')
parser.add_argument('--ply', action='store', dest='ply', default='0',
help='defines the amount of ply used when deciding what move to make')
parser.add_argument('--repeat-eval', action='store', dest='repeat_eval', default='1',
help='the amount of times the evaluation method should be repeated')
args = parser.parse_args()
if args.model == "baseline_model":
print("Model name 'baseline_model' not allowed")
exit()
config = {
'model': args.model,
@@ -57,8 +58,13 @@ config = {
'model_storage_path': 'models',
'bench_storage_path': 'bench',
'board_representation': args.board_rep,
'global_step': 0,
'verbose': args.verbose,
'ply': args.ply,
'repeat_eval': args.repeat_eval
}
# Create models folder
if not os.path.exists(config['model_storage_path']):
os.makedirs(config['model_storage_path'])
@@ -72,19 +78,20 @@ if not os.path.isdir(model_path()):
if not os.path.isdir(log_path):
os.mkdir(log_path)
# Define helper functions
def log_train_outcome(outcome, diff_in_values, trained_eps = 0, log_path = os.path.join(model_path(), 'logs', "train.log")):
commit = subprocess.run(['git', 'describe', '--first-parent', '--always'], stdout=subprocess.PIPE).stdout.decode('utf-8').rstrip()
format_vars = { 'trained_eps': trained_eps,
'count': len(outcome),
'sum': sum(outcome),
'mean': sum(outcome) / len(outcome),
'time': int(time.time()),
'average_diff_in_vals': diff_in_values/len(outcome)
'average_diff_in_vals': diff_in_values,
'commit': commit
}
with open(log_path, 'a+') as f:
f.write("{time};{trained_eps};{count};{sum};{mean};{average_diff_in_vals}".format(**format_vars) + "\n")
f.write("{time};{trained_eps};{count};{sum};{mean};{average_diff_in_vals};{commit}".format(**format_vars) + "\n")
def log_eval_outcomes(outcomes, trained_eps = 0, log_path = os.path.join(model_path(), 'logs', "eval.log")):
@@ -95,9 +102,12 @@ def log_eval_outcomes(outcomes, trained_eps = 0, log_path = os.path.join(model_p
:param log_path:
:return:
"""
commit = subprocess.run(['git', 'describe', '--first-parent', '--always'], stdout=subprocess.PIPE).stdout.decode('utf-8').rstrip()
for outcome in outcomes:
scores = outcome[1]
format_vars = { 'trained_eps': trained_eps,
format_vars = { 'commit': commit,
'trained_eps': trained_eps,
'method': outcome[0],
'count': len(scores),
'sum': sum(scores),
@@ -105,9 +115,10 @@ def log_eval_outcomes(outcomes, trained_eps = 0, log_path = os.path.join(model_p
'time': int(time.time())
}
with open(log_path, 'a+') as f:
f.write("{time};{method};{trained_eps};{count};{sum};{mean}".format(**format_vars) + "\n")
f.write("{time};{method};{trained_eps};{count};{sum};{mean};{commit}".format(**format_vars) + "\n")
def log_bench_eval_outcomes(outcomes, log_path, index, time, trained_eps = 0):
commit = subprocess.run(['git', 'describe', '--first-parent', '--always'], stdout=subprocess.PIPE).stdout.decode('utf-8').rstrip()
for outcome in outcomes:
scores = outcome[1]
format_vars = { 'trained_eps': trained_eps,
@@ -117,9 +128,28 @@ def log_bench_eval_outcomes(outcomes, log_path, index, time, trained_eps = 0):
'mean': sum(scores) / len(scores),
'time': time,
'index': index,
'commit': commit
}
with open(log_path, 'a+') as f:
f.write("{method};{count};{index};{time};{sum};{mean}".format(**format_vars) + "\n")
f.write("{method};{count};{index};{time};{sum};{mean};{commit}".format(**format_vars) + "\n")
def find_board_rep():
checkpoint_path = os.path.join(config['model_storage_path'], config['model'])
board_rep_path = os.path.join(checkpoint_path, "board_representation")
with open(board_rep_path, 'r') as f:
return f.read()
def board_rep_file_exists():
checkpoint_path = os.path.join(config['model_storage_path'], config['model'])
board_rep_path = os.path.join(checkpoint_path, "board_representation")
return os.path.isfile(board_rep_path)
def create_board_rep():
checkpoint_path = os.path.join(config['model_storage_path'], config['model'])
board_rep_path = os.path.join(checkpoint_path, "board_representation")
with open(board_rep_path, 'a+') as f:
f.write(config['board_representation'])
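The three helpers above share the same path construction; a self-contained sketch of their behavior against a temporary directory (the key names mirror the config, but the values here are stand-ins):

```python
import os
import tempfile

def board_rep_path(storage_path, model):
    """Path of the board_representation marker inside a model checkpoint dir."""
    return os.path.join(storage_path, model, "board_representation")

with tempfile.TemporaryDirectory() as tmp:
    os.makedirs(os.path.join(tmp, "example-model"))
    path = board_rep_path(tmp, "example-model")

    assert not os.path.isfile(path)     # board_rep_file_exists() -> False
    with open(path, 'a+') as f:         # create_board_rep()
        f.write("tesauro")
    with open(path, 'r') as f:          # find_board_rep()
        assert f.read() == "tesauro"
```

Factoring the path out avoids repeating the `os.path.join` pair in each helper, which the diffed code currently does three times.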
# Do actions specified by command-line
if args.list_models:
@@ -143,6 +173,22 @@ if __name__ == "__main__":
# Set up variables
episode_count = config['episode_count']
if config['board_representation'] is None:
if board_rep_file_exists():
config['board_representation'] = find_board_rep()
else:
sys.stderr.write("Was not given a board_rep and was unable to find a board_rep file\n")
exit()
else:
if not board_rep_file_exists():
create_board_rep()
else:
if config['board_representation'] != find_board_rep():
sys.stderr.write("Board representation \"{given}\", does not match one in board_rep file, \"{board_rep}\"\n".
format(given = config['board_representation'], board_rep = find_board_rep()))
exit()
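The branching added above (use the stored representation, create it if absent, or abort on a mismatch) can be distilled into a pure function; a sketch with the file I/O abstracted away, where `resolve_board_rep` is a hypothetical name:

```python
def resolve_board_rep(given, stored):
    """Resolve the board representation from the config value (given) and
    the contents of the board_rep file (stored); either may be None."""
    if given is None:
        if stored is None:
            raise SystemExit("no board_rep given and no board_rep file found")
        return stored
    if stored is None:
        return given  # caller should persist it, as create_board_rep() does
    if given != stored:
        raise SystemExit("board representation {!r} does not match "
                         "board_rep file {!r}".format(given, stored))
    return given
```

For example, `resolve_board_rep(None, "tesauro")` returns `"tesauro"`, matching the first branch of the diffed logic.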
if args.train:
network = Network(config, config['model'])
@@ -157,15 +203,21 @@ if __name__ == "__main__":
if not config['train_perpetually']:
break
elif args.play:
network = Network(config, config['model'])
network.play_against_network()
elif args.eval:
network = Network(config, config['model'])
start_episode = network.episodes_trained
# Evaluation measures are described in `config`
outcomes = network.eval(config['episode_count'])
log_eval_outcomes(outcomes, trained_eps = start_episode)
# elif args.play:
# g.play(episodes = episode_count)
network.restore_model()
for i in range(int(config['repeat_eval'])):
start_episode = network.episodes_trained
# Evaluation measures are described in `config`
outcomes = network.eval(config['episode_count'])
log_eval_outcomes(outcomes, trained_eps = start_episode)
# elif args.play:
# g.play(episodes = episode_count)
elif args.bench_eval_scores:
@@ -187,7 +239,7 @@ if __name__ == "__main__":
episode_counts = [25, 50, 100, 250, 500, 1000, 2500, 5000,
10000, 20000]
def do_eval(sess):
def do_eval():
for eval_method in config['eval_methods']:
result_path = os.path.join(config['bench_storage_path'],
eval_method) + "-{}.log".format(int(time.time()))
@@ -195,8 +247,7 @@ if __name__ == "__main__":
for i in range(sample_count):
start_time = time.time()
# Evaluation measures to be benchmarked are described in `config`
outcomes = network.eval(episode_count = n,
tf_session = sess)
outcomes = network.eval(episode_count = n)
time_diff = time.time() - start_time
log_bench_eval_outcomes(outcomes,
time = time_diff,
@@ -206,8 +257,8 @@ if __name__ == "__main__":
# CMM: oh no
import tensorflow as tf
with tf.Session() as session:
network.restore_model(session)
do_eval(session)
network.restore_model()
do_eval()
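The benchmark loop times each evaluation with `time.time()` around the call; the same pattern isolated as a small helper (the evaluated function here is a stand-in for `network.eval`):

```python
import time

def timed(fn, *args):
    """Run fn(*args) and return (result, elapsed seconds)."""
    start_time = time.time()
    result = fn(*args)
    return result, time.time() - start_time

# Stand-in workload; the benchmark would pass network.eval instead.
outcome, time_diff = timed(sum, range(1000))
print(outcome, time_diff)
```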


@@ -1,2 +0,0 @@
model_checkpoint_path: "model.ckpt-11397483"
all_model_checkpoint_paths: "model.ckpt-11397483"


@@ -1 +0,0 @@
202615


@@ -1,615 +0,0 @@
1528532690;dumbeval;1;250;-332;-1.328
1528532754;dumbeval;1;250;-316;-1.264
1528532816;dumbeval;1;250;-324;-1.296
1528532851;pubeval;1;250;-405;-1.62
1528532886;pubeval;1;250;-384;-1.536
1528532921;pubeval;1;250;-393;-1.572
1528533481;dumbeval;2001;250;94;0.376
1528533510;dumbeval;2001;250;83;0.332
1528533537;dumbeval;2001;250;116;0.464
1528533562;pubeval;2001;250;-91;-0.364
1528533586;pubeval;2001;250;-122;-0.488
1528533611;pubeval;2001;250;-61;-0.244
1528534124;dumbeval;4001;250;241;0.964
1528534150;dumbeval;4001;250;243;0.972
1528534175;dumbeval;4001;250;246;0.984
1528534199;pubeval;4001;250;4;0.016
1528534223;pubeval;4001;250;28;0.112
1528534247;pubeval;4001;250;8;0.032
1528534758;dumbeval;6001;250;270;1.08
1528534783;dumbeval;6001;250;244;0.976
1528534808;dumbeval;6001;250;238;0.952
1528534832;pubeval;6001;250;56;0.224
1528534856;pubeval;6001;250;35;0.14
1528534880;pubeval;6001;250;56;0.224
1528535389;dumbeval;8001;250;250;1.0
1528535415;dumbeval;8001;250;238;0.952
1528535440;dumbeval;8001;250;234;0.936
1528535463;pubeval;8001;250;32;0.128
1528535487;pubeval;8001;250;34;0.136
1528535511;pubeval;8001;250;40;0.16
1528536029;dumbeval;10001;250;290;1.16
1528536056;dumbeval;10001;250;314;1.256
1528536084;dumbeval;10001;250;292;1.168
1528536110;pubeval;10001;250;100;0.4
1528536136;pubeval;10001;250;88;0.352
1528536162;pubeval;10001;250;63;0.252
1528536683;dumbeval;12001;250;323;1.292
1528536713;dumbeval;12001;250;291;1.164
1528536742;dumbeval;12001;250;338;1.352
1528536770;pubeval;12001;250;99;0.396
1528536797;pubeval;12001;250;89;0.356
1528536825;pubeval;12001;250;57;0.228
1528537350;dumbeval;14001;250;316;1.264
1528537378;dumbeval;14001;250;284;1.136
1528537406;dumbeval;14001;250;301;1.204
1528537431;pubeval;14001;250;103;0.412
1528537457;pubeval;14001;250;112;0.448
1528537483;pubeval;14001;250;80;0.32
1528538031;dumbeval;16001;250;322;1.288
1528538060;dumbeval;16001;250;314;1.256
1528538090;dumbeval;16001;250;331;1.324
1528538116;pubeval;16001;250;134;0.536
1528538144;pubeval;16001;250;119;0.476
1528538172;pubeval;16001;250;114;0.456
1528538718;dumbeval;18001;250;329;1.316
1528538750;dumbeval;18001;250;358;1.432
1528538780;dumbeval;18001;250;349;1.396
1528538807;pubeval;18001;250;120;0.48
1528538836;pubeval;18001;250;173;0.692
1528538865;pubeval;18001;250;148;0.592
1528539432;dumbeval;20001;250;370;1.48
1528539465;dumbeval;20001;250;361;1.444
1528539497;dumbeval;20001;250;338;1.352
1528539527;pubeval;20001;250;146;0.584
1528539556;pubeval;20001;250;155;0.62
1528539586;pubeval;20001;250;137;0.548
1528540163;dumbeval;22001;250;371;1.484
1528540195;dumbeval;22001;250;359;1.436
1528540227;dumbeval;22001;250;371;1.484
1528540256;pubeval;22001;250;152;0.608
1528540285;pubeval;22001;250;157;0.628
1528540316;pubeval;22001;250;125;0.5
1528540938;dumbeval;24001;250;404;1.616
1528540973;dumbeval;24001;250;416;1.664
1528541010;dumbeval;24001;250;414;1.656
1528541044;pubeval;24001;250;204;0.816
1528541077;pubeval;24001;250;176;0.704
1528541111;pubeval;24001;250;175;0.7
1528541771;dumbeval;26001;250;399;1.596
1528541806;dumbeval;26001;250;385;1.54
1528541843;dumbeval;26001;250;414;1.656
1528541877;pubeval;26001;250;144;0.576
1528541910;pubeval;26001;250;138;0.552
1528541944;pubeval;26001;250;174;0.696
1528542626;dumbeval;28001;250;408;1.632
1528542663;dumbeval;28001;250;398;1.592
1528542700;dumbeval;28001;250;394;1.576
1528542733;pubeval;28001;250;167;0.668
1528542766;pubeval;28001;250;176;0.704
1528542799;pubeval;28001;250;171;0.684
1528543480;dumbeval;30001;250;399;1.596
1528543516;dumbeval;30001;250;408;1.632
1528543551;dumbeval;30001;250;379;1.516
1528543583;pubeval;30001;250;199;0.796
1528543615;pubeval;30001;250;169;0.676
1528543648;pubeval;30001;250;161;0.644
1528544301;dumbeval;32001;250;374;1.496
1528544337;dumbeval;32001;250;385;1.54
1528544374;dumbeval;32001;250;376;1.504
1528544407;pubeval;32001;250;202;0.808
1528544439;pubeval;32001;250;173;0.692
1528544472;pubeval;32001;250;147;0.588
1528545140;dumbeval;34001;250;418;1.672
1528545180;dumbeval;34001;250;432;1.728
1528545218;dumbeval;34001;250;423;1.692
1528545252;pubeval;34001;250;185;0.74
1528545285;pubeval;34001;250;181;0.724
1528545318;pubeval;34001;250;189;0.756
1528545977;dumbeval;36001;250;427;1.708
1528546016;dumbeval;36001;250;415;1.66
1528546056;dumbeval;36001;250;449;1.796
1528546090;pubeval;36001;250;168;0.672
1528546123;pubeval;36001;250;174;0.696
1528546159;pubeval;36001;250;195;0.78
1528546826;dumbeval;38001;250;434;1.736
1528546867;dumbeval;38001;250;431;1.724
1528546909;dumbeval;38001;250;420;1.68
1528546946;pubeval;38001;250;163;0.652
1528546982;pubeval;38001;250;144;0.576
1528547019;pubeval;38001;250;152;0.608
1528547711;dumbeval;40001;250;412;1.648
1528547752;dumbeval;40001;250;436;1.744
1528547794;dumbeval;40001;250;419;1.676
1528547829;pubeval;40001;250;174;0.696
1528547866;pubeval;40001;250;193;0.772
1528547901;pubeval;40001;250;123;0.492
1528548587;dumbeval;42001;250;427;1.708
1528548629;dumbeval;42001;250;440;1.76
1528548671;dumbeval;42001;250;445;1.78
1528548707;pubeval;42001;250;208;0.832
1528548743;pubeval;42001;250;182;0.728
1528548778;pubeval;42001;250;189;0.756
1528549493;dumbeval;44001;250;430;1.72
1528549536;dumbeval;44001;250;423;1.692
1528549580;dumbeval;44001;250;432;1.728
1528549616;pubeval;44001;250;138;0.552
1528549651;pubeval;44001;250;172;0.688
1528549687;pubeval;44001;250;152;0.608
1528550418;dumbeval;46001;250;457;1.828
1528550458;dumbeval;46001;250;449;1.796
1528550504;dumbeval;46001;250;445;1.78
1528550539;pubeval;46001;250;232;0.928
1528550574;pubeval;46001;250;205;0.82
1528550609;pubeval;46001;250;189;0.756
1528551309;dumbeval;48001;250;434;1.736
1528551348;dumbeval;48001;250;422;1.688
1528551390;dumbeval;48001;250;431;1.724
1528551424;pubeval;48001;250;173;0.692
1528551459;pubeval;48001;250;174;0.696
1528551493;pubeval;48001;250;174;0.696
1528552202;dumbeval;50001;250;446;1.784
1528552245;dumbeval;50001;250;434;1.736
1528552288;dumbeval;50001;250;452;1.808
1528552324;pubeval;50001;250;193;0.772
1528552360;pubeval;50001;250;194;0.776
1528552397;pubeval;50001;250;139;0.556
1528553100;dumbeval;52001;250;444;1.776
1528553148;dumbeval;52001;250;440;1.76
1528553194;dumbeval;52001;250;444;1.776
1528553231;pubeval;52001;250;170;0.68
1528553269;pubeval;52001;250;196;0.784
1528553305;pubeval;52001;250;172;0.688
1528554021;dumbeval;54001;250;434;1.736
1528554065;dumbeval;54001;250;435;1.74
1528554109;dumbeval;54001;250;437;1.748
1528554144;pubeval;54001;250;175;0.7
1528554178;pubeval;54001;250;146;0.584
1528554214;pubeval;54001;250;175;0.7
1528554922;dumbeval;56001;250;452;1.808
1528554967;dumbeval;56001;250;450;1.8
1528555011;dumbeval;56001;250;456;1.824
1528555046;pubeval;56001;250;169;0.676
1528555083;pubeval;56001;250;156;0.624
1528555120;pubeval;56001;250;185;0.74
1528555817;dumbeval;58001;250;437;1.748
1528555860;dumbeval;58001;250;445;1.78
1528555904;dumbeval;58001;250;451;1.804
1528555940;pubeval;58001;250;193;0.772
1528555975;pubeval;58001;250;186;0.744
1528556011;pubeval;58001;250;156;0.624
1528556714;dumbeval;60001;250;446;1.784
1528556756;dumbeval;60001;250;454;1.816
1528556798;dumbeval;60001;250;436;1.744
1528556832;pubeval;60001;250;197;0.788
1528556867;pubeval;60001;250;175;0.7
1528556901;pubeval;60001;250;186;0.744
1528557607;dumbeval;62001;250;452;1.808
1528557648;dumbeval;62001;250;449;1.796
1528557692;dumbeval;62001;250;448;1.792
1528557726;pubeval;62001;250;201;0.804
1528557761;pubeval;62001;250;164;0.656
1528557797;pubeval;62001;250;226;0.904
1528558492;dumbeval;64001;250;441;1.764
1528558535;dumbeval;64001;250;445;1.78
1528558579;dumbeval;64001;250;461;1.844
1528558614;pubeval;64001;250;224;0.896
1528558649;pubeval;64001;250;193;0.772
1528558681;pubeval;64001;250;174;0.696
1528559382;dumbeval;66001;250;441;1.764
1528559425;dumbeval;66001;250;448;1.792
1528559470;dumbeval;66001;250;450;1.8
1528559503;pubeval;66001;250;195;0.78
1528559538;pubeval;66001;250;168;0.672
1528559572;pubeval;66001;250;206;0.824
1528560256;dumbeval;68001;250;437;1.748
1528560301;dumbeval;68001;250;456;1.824
1528560344;dumbeval;68001;250;458;1.832
1528560379;pubeval;68001;250;186;0.744
1528560412;pubeval;68001;250;179;0.716
1528560448;pubeval;68001;250;214;0.856
1528561158;dumbeval;70001;250;450;1.8
1528561201;dumbeval;70001;250;433;1.732
1528561245;dumbeval;70001;250;441;1.764
1528561283;pubeval;70001;250;172;0.688
1528561319;pubeval;70001;250;221;0.884
1528561356;pubeval;70001;250;171;0.684
1528562042;dumbeval;72001;250;446;1.784
1528562084;dumbeval;72001;250;434;1.736
1528562126;dumbeval;72001;250;455;1.82
1528562160;pubeval;72001;250;177;0.708
1528562194;pubeval;72001;250;182;0.728
1528562228;pubeval;72001;250;213;0.852
1528562926;dumbeval;74001;250;443;1.772
1528562970;dumbeval;74001;250;456;1.824
1528563019;dumbeval;74001;250;441;1.764
1528563059;pubeval;74001;250;162;0.648
1528563096;pubeval;74001;250;185;0.74
1528563133;pubeval;74001;250;199;0.796
1528563853;dumbeval;76001;250;449;1.796
1528563900;dumbeval;76001;250;449;1.796
1528563945;dumbeval;76001;250;438;1.752
1528563981;pubeval;76001;250;197;0.788
1528564017;pubeval;76001;250;187;0.748
1528564053;pubeval;76001;250;181;0.724
1528564751;dumbeval;78001;250;447;1.788
1528564794;dumbeval;78001;250;433;1.732
1528564839;dumbeval;78001;250;451;1.804
1528564871;pubeval;78001;250;242;0.968
1528564906;pubeval;78001;250;215;0.86
1528564940;pubeval;78001;250;211;0.844
1528565634;dumbeval;80001;250;451;1.804
1528565678;dumbeval;80001;250;445;1.78
1528565731;dumbeval;80001;250;443;1.772
1528565767;pubeval;80001;250;183;0.732
1528565804;pubeval;80001;250;221;0.884
1528565843;pubeval;80001;250;196;0.784
1528566622;dumbeval;82001;250;449;1.796
1528566668;dumbeval;82001;250;449;1.796
1528566716;dumbeval;82001;250;440;1.76
1528566753;pubeval;82001;250;179;0.716
1528566790;pubeval;82001;250;186;0.744
1528566827;pubeval;82001;250;178;0.712
1528567607;dumbeval;84001;250;451;1.804
1528567657;dumbeval;84001;250;452;1.808
1528567706;dumbeval;84001;250;437;1.748
1528567744;pubeval;84001;250;184;0.736
1528567784;pubeval;84001;250;208;0.832
1528567825;pubeval;84001;250;178;0.712
1528568626;dumbeval;86001;250;427;1.708
1528568675;dumbeval;86001;250;424;1.696
1528568723;dumbeval;86001;250;423;1.692
1528568761;pubeval;86001;250;170;0.68
1528568802;pubeval;86001;250;165;0.66
1528568840;pubeval;86001;250;187;0.748
1528569620;dumbeval;88001;250;451;1.804
1528569667;dumbeval;88001;250;452;1.808
1528569716;dumbeval;88001;250;433;1.732
1528569756;pubeval;88001;250;220;0.88
1528569796;pubeval;88001;250;213;0.852
1528569834;pubeval;88001;250;184;0.736
1528570567;dumbeval;90001;250;449;1.796
1528570617;dumbeval;90001;250;430;1.72
1528570665;dumbeval;90001;250;443;1.772
1528570702;pubeval;90001;250;219;0.876
1528570738;pubeval;90001;250;199;0.796
1528570776;pubeval;90001;250;230;0.92
1528571521;dumbeval;92001;250;456;1.824
1528571571;dumbeval;92001;250;459;1.836
1528571619;dumbeval;92001;250;438;1.752
1528571658;pubeval;92001;250;205;0.82
1528571697;pubeval;92001;250;167;0.668
1528571735;pubeval;92001;250;186;0.744
1528572464;dumbeval;94001;250;438;1.752
1528572506;dumbeval;94001;250;435;1.74
1528572553;dumbeval;94001;250;444;1.776
1528572588;pubeval;94001;250;208;0.832
1528572624;pubeval;94001;250;206;0.824
1528572659;pubeval;94001;250;161;0.644
1528573338;dumbeval;96001;250;459;1.836
1528573379;dumbeval;96001;250;436;1.744
1528573423;dumbeval;96001;250;449;1.796
1528573459;pubeval;96001;250;236;0.944
1528573493;pubeval;96001;250;250;1.0
1528573527;pubeval;96001;250;207;0.828
1528574257;dumbeval;98001;250;450;1.8
1528574308;dumbeval;98001;250;448;1.792
1528574360;dumbeval;98001;250;448;1.792
1528574402;pubeval;98001;250;206;0.824
1528574443;pubeval;98001;250;210;0.84
1528574486;pubeval;98001;250;191;0.764
1528575288;dumbeval;100001;250;430;1.72
1528575331;dumbeval;100001;250;435;1.74
1528575375;dumbeval;100001;250;449;1.796
1528575414;pubeval;100001;250;212;0.848
1528575453;pubeval;100001;250;183;0.732
1528575489;pubeval;100001;250;181;0.724
1528576174;dumbeval;102001;250;427;1.708
1528576220;dumbeval;102001;250;422;1.688
1528576267;dumbeval;102001;250;431;1.724
1528576309;pubeval;102001;250;198;0.792
1528576345;pubeval;102001;250;181;0.724
1528576383;pubeval;102001;250;170;0.68
1528577157;dumbeval;104001;250;429;1.716
1528577200;dumbeval;104001;250;430;1.72
1528577241;dumbeval;104001;250;432;1.728
1528577277;pubeval;104001;250;189;0.756
1528577311;pubeval;104001;250;174;0.696
1528577347;pubeval;104001;250;203;0.812
1528578044;dumbeval;106001;250;451;1.804
1528578088;dumbeval;106001;250;442;1.768
1528578133;dumbeval;106001;250;426;1.704
1528578170;pubeval;106001;250;211;0.844
1528578207;pubeval;106001;250;168;0.672
1528578243;pubeval;106001;250;169;0.676
1528578940;dumbeval;108001;250;456;1.824
1528578985;dumbeval;108001;250;439;1.756
1528579030;dumbeval;108001;250;442;1.768
1528579068;pubeval;108001;250;180;0.72
1528579104;pubeval;108001;250;173;0.692
1528579141;pubeval;108001;250;216;0.864
1528579824;dumbeval;110001;250;433;1.732
1528579867;dumbeval;110001;250;434;1.736
1528579910;dumbeval;110001;250;445;1.78
1528579945;pubeval;110001;250;208;0.832
1528579981;pubeval;110001;250;190;0.76
1528580018;pubeval;110001;250;169;0.676
1528580691;dumbeval;112001;250;434;1.736
1528580735;dumbeval;112001;250;440;1.76
1528580779;dumbeval;112001;250;435;1.74
1528580815;pubeval;112001;250;179;0.716
1528580851;pubeval;112001;250;200;0.8
1528580886;pubeval;112001;250;197;0.788
1528581575;dumbeval;114001;250;444;1.776
1528581619;dumbeval;114001;250;430;1.72
1528581660;dumbeval;114001;250;422;1.688
1528581697;pubeval;114001;250;188;0.752
1528581731;pubeval;114001;250;194;0.776
1528581767;pubeval;114001;250;211;0.844
1528582462;dumbeval;116001;250;432;1.728
1528582508;dumbeval;116001;250;439;1.756
1528582556;dumbeval;116001;250;436;1.744
1528582594;pubeval;116001;250;195;0.78
1528582631;pubeval;116001;250;194;0.776
1528582667;pubeval;116001;250;184;0.736
1528583376;dumbeval;118001;250;426;1.704
1528583419;dumbeval;118001;250;442;1.768
1528583466;dumbeval;118001;250;424;1.696
1528583502;pubeval;118001;250;192;0.768
1528583538;pubeval;118001;250;195;0.78
1528583573;pubeval;118001;250;189;0.756
1528584264;dumbeval;120001;250;435;1.74
1528584308;dumbeval;120001;250;440;1.76
1528584351;dumbeval;120001;250;433;1.732
1528584387;pubeval;120001;250;183;0.732
1528584422;pubeval;120001;250;182;0.728
1528584459;pubeval;120001;250;234;0.936
1528585138;dumbeval;122001;250;456;1.824
1528585183;dumbeval;122001;250;440;1.76
1528585226;dumbeval;122001;250;455;1.82
1528585263;pubeval;122001;250;186;0.744
1528585301;pubeval;122001;250;187;0.748
1528585339;pubeval;122001;250;215;0.86
1528586022;dumbeval;124001;250;436;1.744
1528586068;dumbeval;124001;250;432;1.728
1528586114;dumbeval;124001;250;447;1.788
1528589963;dumbeval;124002;250;440;1.76
1528590014;dumbeval;124002;250;441;1.764
1528590068;dumbeval;124002;250;461;1.844
1528590109;pubeval;124002;250;188;0.752
1528590147;pubeval;124002;250;185;0.74
1528590188;pubeval;124002;250;191;0.764
1528590916;dumbeval;126002;250;456;1.824
1528590961;dumbeval;126002;250;422;1.688
1528591007;dumbeval;126002;250;419;1.676
1528591044;pubeval;126002;250;182;0.728
1528591082;pubeval;126002;250;209;0.836
1528591120;pubeval;126002;250;173;0.692
1528591832;dumbeval;128002;250;441;1.764
1528591877;dumbeval;128002;250;431;1.724
1528591921;dumbeval;128002;250;433;1.732
1528591958;pubeval;128002;250;194;0.776
1528591997;pubeval;128002;250;229;0.916
1528592036;pubeval;128002;250;198;0.792
1528592764;dumbeval;130002;250;435;1.74
1528592808;dumbeval;130002;250;441;1.764
1528592852;dumbeval;130002;250;432;1.728
1528592889;pubeval;130002;250;200;0.8
1528592925;pubeval;130002;250;199;0.796
1528592962;pubeval;130002;250;234;0.936
1528593686;dumbeval;132002;250;443;1.772
1528593730;dumbeval;132002;250;430;1.72
1528593773;dumbeval;132002;250;423;1.692
1528593811;pubeval;132002;250;189;0.756
1528593850;pubeval;132002;250;231;0.924
1528593889;pubeval;132002;250;225;0.9
1528594607;dumbeval;134002;250;432;1.728
1528594653;dumbeval;134002;250;427;1.708
1528594698;dumbeval;134002;250;451;1.804
1528594733;pubeval;134002;250;151;0.604
1528594769;pubeval;134002;250;196;0.784
1528594806;pubeval;134002;250;207;0.828
1528595525;dumbeval;136002;250;454;1.816
1528595570;dumbeval;136002;250;447;1.788
1528595615;dumbeval;136002;250;437;1.748
1528595652;pubeval;136002;250;230;0.92
1528595689;pubeval;136002;250;239;0.956
1528595726;pubeval;136002;250;211;0.844
1528596455;dumbeval;138002;250;448;1.792
1528596502;dumbeval;138002;250;448;1.792
1528596549;dumbeval;138002;250;444;1.776
1528596587;pubeval;138002;250;221;0.884
1528596627;pubeval;138002;250;211;0.844
1528596667;pubeval;138002;250;182;0.728
1528597400;dumbeval;140002;250;419;1.676
1528597445;dumbeval;140002;250;440;1.76
1528597490;dumbeval;140002;250;445;1.78
1528597527;pubeval;140002;250;182;0.728
1528597565;pubeval;140002;250;206;0.824
1528597602;pubeval;140002;250;181;0.724
1528598343;dumbeval;142002;250;446;1.784
1528598389;dumbeval;142002;250;433;1.732
1528598434;dumbeval;142002;250;442;1.768
1528598470;pubeval;142002;250;185;0.74
1528598507;pubeval;142002;250;190;0.76
1528598545;pubeval;142002;250;191;0.764
1528599253;dumbeval;144002;250;443;1.772
1528599298;dumbeval;144002;250;448;1.792
1528599344;dumbeval;144002;250;441;1.764
1528599381;pubeval;144002;250;186;0.744
1528599420;pubeval;144002;250;214;0.856
1528599459;pubeval;144002;250;199;0.796
1528600193;dumbeval;146002;250;428;1.712
1528600236;dumbeval;146002;250;424;1.696
1528600280;dumbeval;146002;250;446;1.784
1528600315;pubeval;146002;250;208;0.832
1528600353;pubeval;146002;250;184;0.736
1528600389;pubeval;146002;250;233;0.932
1528601105;dumbeval;148002;250;432;1.728
1528601151;dumbeval;148002;250;451;1.804
1528601195;dumbeval;148002;250;449;1.796
1528601233;pubeval;148002;250;229;0.916
1528601271;pubeval;148002;250;189;0.756
1528601308;pubeval;148002;250;214;0.856
1528602011;dumbeval;150002;250;440;1.76
1528602052;dumbeval;150002;250;441;1.764
1528602097;dumbeval;150002;250;434;1.736
1528602133;pubeval;150002;250;153;0.612
1528602171;pubeval;150002;250;188;0.752
1528602208;pubeval;150002;250;179;0.716
1528602922;dumbeval;152002;250;448;1.792
1528602966;dumbeval;152002;250;425;1.7
1528603008;dumbeval;152002;250;425;1.7
1528603043;pubeval;152002;250;206;0.824
1528603081;pubeval;152002;250;177;0.708
1528603117;pubeval;152002;250;206;0.824
1528603829;dumbeval;154002;250;441;1.764
1528603874;dumbeval;154002;250;436;1.744
1528603918;dumbeval;154002;250;441;1.764
1528603956;pubeval;154002;250;166;0.664
1528603994;pubeval;154002;250;198;0.792
1528604032;pubeval;154002;250;193;0.772
1528604756;dumbeval;156002;250;433;1.732
1528604799;dumbeval;156002;250;429;1.716
1528604844;dumbeval;156002;250;428;1.712
1528604880;pubeval;156002;250;180;0.72
1528604918;pubeval;156002;250;216;0.864
1528604956;pubeval;156002;250;198;0.792
1528605662;dumbeval;158002;250;423;1.692
1528605708;dumbeval;158002;250;406;1.624
1528605753;dumbeval;158002;250;436;1.744
1528605792;pubeval;158002;250;214;0.856
1528605829;pubeval;158002;250;190;0.76
1528605866;pubeval;158002;250;174;0.696
1528606583;dumbeval;160002;250;446;1.784
1528606628;dumbeval;160002;250;445;1.78
1528606672;dumbeval;160002;250;449;1.796
1528606710;pubeval;160002;250;200;0.8
1528606748;pubeval;160002;250;177;0.708
1528606786;pubeval;160002;250;202;0.808
1528607505;dumbeval;162002;250;426;1.704
1528607550;dumbeval;162002;250;431;1.724
1528607598;dumbeval;162002;250;438;1.752
1528607637;pubeval;162002;250;197;0.788
1528607673;pubeval;162002;250;192;0.768
1528607712;pubeval;162002;250;186;0.744
1528608435;dumbeval;164002;250;436;1.744
1528608479;dumbeval;164002;250;428;1.712
1528608524;dumbeval;164002;250;419;1.676
1528608562;pubeval;164002;250;156;0.624
1528608600;pubeval;164002;250;171;0.684
1528608638;pubeval;164002;250;181;0.724
1528609360;dumbeval;166002;250;417;1.668
1528609406;dumbeval;166002;250;435;1.74
1528609452;dumbeval;166002;250;439;1.756
1528609487;pubeval;166002;250;225;0.9
1528609524;pubeval;166002;250;204;0.816
1528609561;pubeval;166002;250;200;0.8
1528610271;dumbeval;168002;250;419;1.676
1528610315;dumbeval;168002;250;448;1.792
1528610358;dumbeval;168002;250;436;1.744
1528610394;pubeval;168002;250;222;0.888
1528610429;pubeval;168002;250;211;0.844
1528610466;pubeval;168002;250;198;0.792
1528611167;dumbeval;170002;250;435;1.74
1528611211;dumbeval;170002;250;436;1.744
1528611256;dumbeval;170002;250;435;1.74
1528611293;pubeval;170002;250;187;0.748
1528611330;pubeval;170002;250;206;0.824
1528611367;pubeval;170002;250;171;0.684
1528612079;dumbeval;172002;250;436;1.744
1528612122;dumbeval;172002;250;431;1.724
1528612165;dumbeval;172002;250;428;1.712
1528612203;pubeval;172002;250;188;0.752
1528612241;pubeval;172002;250;216;0.864
1528612278;pubeval;172002;250;207;0.828
1528612989;dumbeval;174002;250;441;1.764
1528613031;dumbeval;174002;250;432;1.728
1528613076;dumbeval;174002;250;430;1.72
1528613113;pubeval;174002;250;217;0.868
1528613150;pubeval;174002;250;172;0.688
1528613188;pubeval;174002;250;167;0.668
1528613882;dumbeval;176002;250;428;1.712
1528613925;dumbeval;176002;250;439;1.756
1528613970;dumbeval;176002;250;454;1.816
1528614010;pubeval;176002;250;198;0.792
1528614047;pubeval;176002;250;189;0.756
1528614085;pubeval;176002;250;178;0.712
1528614814;dumbeval;178002;250;448;1.792
1528614859;dumbeval;178002;250;420;1.68
1528614904;dumbeval;178002;250;435;1.74
1528614940;pubeval;178002;250;231;0.924
1528614978;pubeval;178002;250;176;0.704
1528615015;pubeval;178002;250;237;0.948
1528615734;dumbeval;180002;250;434;1.736
1528615777;dumbeval;180002;250;436;1.744
1528615822;dumbeval;180002;250;446;1.784
1528615859;pubeval;180002;250;194;0.776
1528615898;pubeval;180002;250;169;0.676
1528615936;pubeval;180002;250;174;0.696
1528616646;dumbeval;182002;250;428;1.712
1528616690;dumbeval;182002;250;428;1.712
1528616735;dumbeval;182002;250;432;1.728
1528616773;pubeval;182002;250;172;0.688
1528616812;pubeval;182002;250;231;0.924
1528616849;pubeval;182002;250;201;0.804
1528617565;dumbeval;184002;250;444;1.776
1528617608;dumbeval;184002;250;423;1.692
1528617652;dumbeval;184002;250;434;1.736
1528617689;pubeval;184002;250;175;0.7
1528617727;pubeval;184002;250;185;0.74
1528617765;pubeval;184002;250;210;0.84
1528618483;dumbeval;186002;250;427;1.708
1528618525;dumbeval;186002;250;442;1.768
1528618570;dumbeval;186002;250;428;1.712
1528618607;pubeval;186002;250;180;0.72
1528618643;pubeval;186002;250;224;0.896
1528618678;pubeval;186002;250;174;0.696
1528619394;dumbeval;188002;250;437;1.748
1528619438;dumbeval;188002;250;438;1.752
1528619480;dumbeval;188002;250;436;1.744
1528619519;pubeval;188002;250;176;0.704
1528619557;pubeval;188002;250;177;0.708
1528619595;pubeval;188002;250;219;0.876
1528620338;dumbeval;190002;250;437;1.748
1528620385;dumbeval;190002;250;446;1.784
1528620436;dumbeval;190002;250;425;1.7
1528620478;pubeval;190002;250;206;0.824
1528620517;pubeval;190002;250;222;0.888
1528620556;pubeval;190002;250;196;0.784
1528621307;dumbeval;192003;250;434;1.736
1528621351;dumbeval;192003;250;447;1.788
1528621392;dumbeval;192003;250;441;1.764
1528621429;pubeval;192003;250;187;0.748
1528621467;pubeval;192003;250;221;0.884
1528621505;pubeval;192003;250;205;0.82
1528622213;dumbeval;194003;250;421;1.684
1528622258;dumbeval;194003;250;430;1.72
1528622303;dumbeval;194003;250;435;1.74
1528622341;pubeval;194003;250;199;0.796
1528622378;pubeval;194003;250;170;0.68
1528622416;pubeval;194003;250;170;0.68
1528623126;dumbeval;196003;250;440;1.76
1528623172;dumbeval;196003;250;426;1.704
1528623219;dumbeval;196003;250;444;1.776
1528623258;pubeval;196003;250;199;0.796
1528623295;pubeval;196003;250;203;0.812
1528623334;pubeval;196003;250;178;0.712
1528624038;dumbeval;198003;250;448;1.792
1528624082;dumbeval;198003;250;439;1.756
1528624126;dumbeval;198003;250;446;1.784
1528624166;pubeval;198003;250;164;0.656
1528624203;pubeval;198003;250;192;0.768
1528624240;pubeval;198003;250;191;0.764
1528624964;dumbeval;200003;250;433;1.732
1528625008;dumbeval;200003;250;444;1.776
1528625053;dumbeval;200003;250;422;1.688
1528625090;pubeval;200003;250;196;0.784
1528625129;pubeval;200003;250;209;0.836
1528625167;pubeval;200003;250;182;0.728
1529014470;pubeval;202003;250;200;0.8
1529014688;pubeval;202003;250;198;0.792
1529014798;pubeval;202003;250;-196;-0.784
1529014923;pubeval;202003;250;-237;-0.948
1529015034;pubeval;202003;250;193;0.772
1529015648;pubeval;202003;250;199;0.796


@@ -1,107 +0,0 @@
1528532624;1;1;-2;-2.0;0.45343881845474243
1528533453;2001;2000;-1814;-0.907;0.685711669921875
1528534098;4001;2000;-608;-0.304;0.9842407836914062
1528534732;6001;2000;-503;-0.2515;1.063406005859375
1528535364;8001;2000;-453;-0.2265;1.0802667236328125
1528536002;10001;2000;-44;-0.022;1.09254541015625
1528536653;12001;2000;159;0.0795;1.0855897216796875
1528537323;14001;2000;230;0.115;1.0957244873046874
1528538003;16001;2000;189;0.0945;1.15757861328125
1528538688;18001;2000;376;0.188;1.1618570556640626
1528539399;20001;2000;518;0.259;1.1903975830078124
1528540130;22001;2000;547;0.2735;1.1916396484375
1528540902;24001;2000;237;0.1185;1.3655455322265626
1528541735;26001;2000;95;0.0475;1.4186490478515625
1528542589;28001;2000;212;0.106;1.4446763916015626
1528543444;30001;2000;389;0.1945;1.4773995361328125
1528544265;32001;2000;371;0.1855;1.436327880859375
1528545100;34001;2000;244;0.122;1.4622772216796875
1528545938;36001;2000;182;0.091;1.526433349609375
1528546785;38001;2000;159;0.0795;1.5337244873046876
1528547667;40001;2000;252;0.126;1.5359388427734375
1528548544;42001;2000;188;0.094;1.54842041015625
1528549450;44001;2000;246;0.123;1.614618896484375
1528550374;46001;2000;128;0.064;1.5962698974609375
1528551266;48001;2000;10;0.005;1.6010469970703125
1528552159;50001;2000;-83;-0.0415;1.582731201171875
1528553055;52001;2000;-66;-0.033;1.579044677734375
1528553979;54001;2000;-54;-0.027;1.630927978515625
1528554875;56001;2000;98;0.049;1.557802001953125
1528555773;58001;2000;-101;-0.0505;1.585782470703125
1528556671;60001;2000;-49;-0.0245;1.5916173095703126
1528557562;62001;2000;17;0.0085;1.6063782958984374
1528558447;64001;2000;38;0.019;1.5958663330078124
1528559339;66001;2000;9;0.0045;1.5874405517578125
1528560213;68001;2000;38;0.019;1.582672119140625
1528561114;70001;2000;114;0.057;1.6329345703125
1528562000;72001;2000;224;0.112;1.5919478759765624
1528562881;74001;2000;143;0.0715;1.6054395751953126
1528563806;76001;2000;273;0.1365;1.6047711181640625
1528564707;78001;2000;9;0.0045;1.5738580322265625
1528565591;80001;2000;-70;-0.035;1.6135865478515625
1528566574;82001;2000;59;0.0295;1.5992630615234376
1528567559;84001;2000;-12;-0.006;1.5602725830078126
1528568577;86001;2000;97;0.0485;1.5966063232421874
1528569574;88001;2000;87;0.0435;1.6054110107421875
1528570519;90001;2000;300;0.15;1.582406494140625
1528571469;92001;2000;140;0.07;1.59626611328125
1528572420;94001;2000;171;0.0855;1.610840576171875
1528573295;96001;2000;37;0.0185;1.5797979736328125
1528574204;98001;2000;-105;-0.0525;1.574399169921875
1528575242;100001;2000;-136;-0.068;1.5745897216796876
1528576133;102001;2000;63;0.0315;1.5833463134765624
1528577114;104001;2000;101;0.0505;1.6021649169921874
1528577998;106001;2000;-82;-0.041;1.581091552734375
1528578895;108001;2000;21;0.0105;1.583177734375
1528579781;110001;2000;130;0.065;1.5937142333984375
1528580648;112001;2000;101;0.0505;1.5705904541015625
1528581533;114001;2000;-26;-0.013;1.6102183837890625
1528582417;116001;2000;71;0.0355;1.6006197509765625
1528583331;118001;2000;79;0.0395;1.61836279296875
1528584217;120001;2000;12;0.006;1.5898594970703126
1528585094;122001;2000;115;0.0575;1.5761893310546875
1528585977;124001;2000;3;0.0015;1.567900390625
1528589910;124002;1;1;1.0;2.1990389823913574
1528590871;126002;2000;39;0.0195;1.55269384765625
1528591789;128002;2000;98;0.049;1.5829056396484376
1528592719;130002;2000;75;0.0375;1.584886474609375
1528593642;132002;2000;242;0.121;1.5954697265625
1528594563;134002;2000;81;0.0405;1.577446044921875
1528595479;136002;2000;129;0.0645;1.582875732421875
1528596406;138002;2000;115;0.0575;1.6050557861328125
1528597356;140002;2000;184;0.092;1.589245361328125
1528598297;142002;2000;191;0.0955;1.6228802490234375
1528599207;144002;2000;10;0.005;1.6130411376953124
1528600149;146002;2000;50;0.025;1.6085372314453126
1528601062;148002;2000;11;0.0055;1.59520458984375
1528601967;150002;2000;152;0.076;1.5749661865234375
1528602880;152002;2000;108;0.054;1.6114630126953124
1528603784;154002;2000;305;0.1525;1.5742574462890624
1528604711;156002;2000;42;0.021;1.5828121337890626
1528605617;158002;2000;98;0.049;1.589599609375
1528606539;160002;2000;197;0.0985;1.5930859375
1528607459;162002;2000;280;0.14;1.5736075439453125
1528608390;164002;2000;105;0.0525;1.5925037841796874
1528609316;166002;2000;160;0.08;1.5934769287109376
1528610227;168002;2000;213;0.1065;1.5652880859375
1528611123;170002;2000;6;0.003;1.564364013671875
1528612035;172002;2000;26;0.013;1.5828829345703126
1528612943;174002;2000;50;0.025;1.581922119140625
1528613838;176002;2000;140;0.07;1.589693115234375
1528614766;178002;2000;107;0.0535;1.5945419921875
1528615689;180002;2000;80;0.04;1.5782044677734375
1528616601;182002;2000;19;0.0095;1.574864013671875
1528617520;184002;2000;-73;-0.0365;1.565394287109375
1528618437;186002;2000;25;0.0125;1.6035906982421875
1528619351;188002;2000;132;0.066;1.61888134765625
1528620292;190002;2000;109;0.0545;1.6021832275390624
1528621254;192002;2000;41;0.0205;1.5614134521484375
1528621264;192003;1;1;1.0;1.1434392929077148
1528622166;194003;2000;-20;-0.01;1.5994180908203126
1528623082;196003;2000;73;0.0365;1.577033935546875
1528623993;198003;2000;57;0.0285;1.5527196044921876
1528624919;200003;2000;29;0.0145;1.5869425048828125
1528625838;202003;2000;114;0.057;1.61248779296875
1529017158;202604;1;-2;-2.0;1.1881717443466187
1529017166;202605;1;2;2.0;0.9586520195007324
1529017177;202615;10;1;0.1;2.0276899337768555

Some files were not shown because too many files have changed in this diff.