Description
Expected Behavior
No memory leak
Actual Behavior
Memory usage grows steadily until it stalls the optimization process
Steps to Reproduce
Create the following class:
import gc
import sys

class MyMemoryDumper(object):
    def __init__(self):
        self.every = 2000  # dump a snapshot every 2000 callback invocations
        self.count = 0

    def __call__(self, res):
        if self.count != 0 and self.count % self.every == 0:
            # Record the size of every object the garbage collector tracks
            obj_list = gc.get_objects()
            new_values = {}
            for obj in obj_list:
                size_obj = sys.getsizeof(obj)
                new_values[str(obj)] = size_obj
            # Sort by size, largest first
            new_values = sorted(new_values.items(), key=lambda kv: kv[1], reverse=True)
            print('end')  # set a breakpoint here to inspect new_values
        self.count += 1
Pass an instance in the callback list of forest_minimize (objective and dimensions below stand in for the actual arguments):
result = forest_minimize(
    objective, dimensions,  # placeholders for the actual objective function and search space
    base_estimator=ExtraTreesRegressor(n_estimators=20, min_samples_leaf=2),
    callback=[DeltaXStopper(9e-7), MyMemoryDumper()],
)
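Not part of the original report: as a cross-check under the same callback protocol (skopt invokes each callback once per iteration with the intermediate result), a tracemalloc-based dumper can point at the source lines that hold the memory. TopAllocDumper is a hypothetical name used only for this sketch.

import tracemalloc

tracemalloc.start()

class TopAllocDumper:
    # Illustrative helper, not from the report: prints the top
    # allocation sites every `every` optimizer iterations.
    def __init__(self, every=2000):
        self.every = every
        self.count = 0

    def __call__(self, res):
        self.count += 1
        if self.count % self.every == 0:
            snapshot = tracemalloc.take_snapshot()
            for stat in snapshot.statistics('lineno')[:10]:
                print(stat)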
Additional info
Add a breakpoint at the print('end') line in MyMemoryDumper.__call__ and inspect the collected objects: a large number of DataFrames created by self.run are still alive (see the counting sketch below).
Backtesting==0.3.3
OS: Windows
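To quantify the leak specifically in terms of DataFrames, a small helper can tally the live frames between optimizer iterations. This is an illustrative sketch; count_dataframes is not part of the original report.

import gc
import pandas as pd

def count_dataframes():
    # Tally the DataFrames the GC still tracks and their deep memory usage.
    dfs = [o for o in gc.get_objects() if isinstance(o, pd.DataFrame)]
    total_mb = sum(df.memory_usage(deep=True).sum() for df in dfs) / 1e6
    print(f'{len(dfs)} DataFrames alive, ~{total_mb:.1f} MB')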