Algorithm to maximize delivery profit with mass and cost constraints

#python #algorithm #optimization #mathematical-optimization #mystic

Question:

The title isn't very helpful, because I'm not sure what exactly I'm trying to ask. I'm sure an algorithm for this must exist, but I can't recall it. Note: this is not a homework problem, I finished school a very long time ago.

So, here is the problem:

  • We are in the shipping and trading business, trying to maximize profit
  • We have a list of goods we can load onto a truck. Each item has:
    • A buy price (at the source)
    • A sell price (at the destination)
    • A per-unit mass
    • An upper limit on how many can be purchased
  • Our truck is limited in the mass it can carry
  • We have an upper limit on how much we are allowed to "invest" (spend on goods at the source)
  • We want to maximize the profit for the job (buy at the source, transport, sell at the destination)

If there were only one limit (total mass or total investment), this would be easy, but I'm not sure how to approach it when there are two.

The equation for calculating profit would be:

 profit = ItemA['quantity'] * (ItemA['sell_price'] - ItemA['buy_price']) + ItemB['quantity'] * (ItemB['sell_price'] - ItemB['buy_price']) + ...
 

So I'm trying to choose which items, and what quantity of each, should be purchased to maximize profit.
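Written out as code, that objective plus the two limits gives a feasibility/profit check for any candidate purchase plan. This is only an illustrative sketch; the field and parameter names below are made up, not from the question:

```python
def evaluate(items, quantities, max_mass, max_invest):
    """Return the profit of a purchase plan, or None if it violates a limit.

    items: list of dicts with 'buy_price', 'sell_price', 'mass', 'limit'
    quantities: how many units of each item to buy
    """
    total_mass = sum(q * it['mass'] for q, it in zip(quantities, items))
    total_invest = sum(q * it['buy_price'] for q, it in zip(quantities, items))
    if total_mass > max_mass or total_invest > max_invest:
        return None  # infeasible: exceeds truck capacity or budget
    if any(q < 0 or q > it['limit'] for q, it in zip(quantities, items)):
        return None  # infeasible: violates a per-item purchase limit
    return sum(q * (it['sell_price'] - it['buy_price'])
               for q, it in zip(quantities, items))

items = [
    {'buy_price': 5, 'sell_price': 8, 'mass': 2, 'limit': 100},
    {'buy_price': 6, 'sell_price': 10, 'mass': 4, 'limit': 100},
]
print(evaluate(items, [10, 5], max_mass=1000, max_invest=500))  # -> 50
```

The optimization question is then: over all feasible quantity vectors, which one maximizes this return value.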

Are there any existing, well-known algorithms for solving this? It's probably some kind of mathematical optimization problem? I'm using Python, so I think the mystic package might be a good fit, but I'm not sure how I would set it up.

Comments:

1. This is the bounded knapsack problem. An item's value is sell_price - buy_price. Its weight is the per-unit mass. And you have a limit on the quantity of each item and a limit on the total weight.

2. It's actually a 2-dimensional bounded knapsack, since the effective weight is a 2D vector (mass, buy price) with a limit on the sum in each dimension. Computationally it is presumably much harder to approximate than the traditional 1D knapsack. We need more information about the constraints — how many items, maximum weights/prices — since this is an NP-hard problem. It might also be a better fit for cs.stackexchange.

3. @kcsquared We could cap it at a maximum of 10 distinct items. Weight and price per unit are essentially unbounded; they could range from 0.01 kg to 1000 kg and from $0.01 to $1MM.

4. 10 distinct items? Just throw an integer programming solver at it. I use OR-Tools at work, but you have options.

5. @Erwin-Kalvelagen has an example of a multi-dimensional knapsack model at yetanothermathprogrammingconsultant.blogspot.com/2016/01/…
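Following up on comment 2: when the capacities are modest integers, the 2-dimensional bounded knapsack can be solved exactly by dynamic programming over a (mass used, budget used) state table. This is a sketch with toy data; its runtime is O(n · W · B · q_max), so it does not scale to the "essentially unbounded" weights and prices mentioned in comment 3 — there a MIP solver is the right tool:

```python
def knapsack_2d(items, max_mass, max_budget):
    """Exact DP for the bounded 2D knapsack.

    items: list of (profit_per_unit, mass_per_unit, buy_price, qty_limit)
    best[m][b] = best profit using at most m mass and b budget.
    """
    best = [[0] * (max_budget + 1) for _ in range(max_mass + 1)]
    for profit, mass, cost, limit in items:
        nxt = [row[:] for row in best]  # fresh table so each item group is used once
        for m in range(max_mass + 1):
            for b in range(max_budget + 1):
                for q in range(1, limit + 1):  # try buying q units of this item
                    if q * mass > m or q * cost > b:
                        break
                    cand = best[m - q * mass][b - q * cost] + q * profit
                    if cand > nxt[m][b]:
                        nxt[m][b] = cand
        best = nxt
    return best[max_mass][max_budget]

# (sell - buy, mass per unit, buy price, qty limit) per item -- toy data
items = [(3, 2, 5, 10), (4, 4, 6, 10), (2, 1, 4, 10)]
print(knapsack_2d(items, max_mass=20, max_budget=40))  # -> 24
```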

Answer #1:

You can try the optuna hyperparameter-tuning framework.

Here is sample code you can try. The products are named product1 and so on, and are defined in the parameters.json file. The data values are just guesses.

The study/optimization session is now saved in an sqlite database, which supports interrupt and resume. See the version log in the code.

parameters.json

 {
    "study_name": "st5_tpe",
    "sampler": "tpe",
    "trials": 1000,
    "max_purchase": 7000,
    "min_weight_no_cost": 1000,
    "high_weight_additional_cost": 0.5,
    "trucks": {
        "smalltruck": {
            "maxmass": 1000,
            "cost": 75
        },
        "mediumtruck": {
            "maxmass": 2000,
            "cost": 150
        },
        "bigtruck": {
            "maxmass": 5000,
            "cost": 400
        }
    },
    "products": {
        "product1_qty": {
            "min": 20,
            "max": 100,
            "massperunit": 2,
            "buyprice": 5,
            "sellprice": 8
        },
        "product2_qty": {
            "min": 20,
            "max": 100,
            "massperunit": 4,
            "buyprice": 6,
            "sellprice": 10
        },
        "product3_qty": {
            "min": 20,
            "max": 100,
            "massperunit": 1,
            "buyprice": 4,
            "sellprice": 6
        },
        "product4_qty": {
            "min": 20,
            "max": 100,
            "massperunit": 2,
            "buyprice": 7,
            "sellprice": 10
        },
        "product5_qty": {
            "min": 20,
            "max": 100,
            "massperunit": 2,
            "buyprice": 5,
            "sellprice": 8
        },
        "product6_qty": {
            "min": 20,
            "max": 100,
            "massperunit": 1,
            "buyprice": 5,
            "sellprice": 7
        },
        "product7_qty": {
            "min": 20,
            "max": 100,
            "massperunit": 1,
            "buyprice": 8,
            "sellprice": 12
        }
    }
}

 

Code

 """
shipping_trading.py


version 0.7.0
    * Calculate and show ROI (return of investment) and other info.
    * Add user attribute to get other costs.
    * Raise exception when max_purchase key is missing in parameters.json file.
    * Continue the study even when trucks key is missing in parameters.json file.
    
version 0.6.0
    * Save study/optimization session in sqlite db; with this it now supports interrupt and resume.
      When a study session is interrupted, it can be resumed later using data from the previous session.
    * Add study_name key in parameters.json file. Sqlite db name is based on study_name. If you
      want new study/optimization session, modify the study_name. If you are re-running the
      same study_name, it will run and continue from previous session. Example:
      study_name=st8, sqlite_dbname=mydb_st8.db
      By default study_name is example_study when you remove study_name key in parameters.json file.
    * Remove printing in console on truck info.

version 0.5.0
    * Replace kg with qty in parameters.json file.
    * Add massperunit in the product.
    * Optimize qty not mass.
    * Refactor

version 0.4.0
    * Add truck size optimization. It is constrained by the cost of using the truck as well as the max kg capacity.
      The optimizer may suggest a medium instead of a big truck if profit is higher as big truck is expensive.
      profit = profit - truck_cost - other_costs
    * Modify parameters.json file, trucks key is added.

version 0.3.0
    * Read sampler, and number of trials from parameters.json file.
      User inputs can now be processed from that file.

version 0.2.0
    * Read a new parameters.json format.
    * Refactor get_parameters().

version 0.1.0
    * Add additional cost if total product weight is high.
"""


__version__ = '0.7.0'


import json

import optuna


def get_parameters():
    """
    Read parameters.json file to get the parameters to optimize, etc.
    """
    fn = 'parameters.json'
    products, trucks = {}, {}

    with open(fn) as json_file:
        values = json.load(json_file)

        max_purchase = values.get('max_purchase', None)
        if max_purchase is None:
            raise Exception('Missing max_purchase, please specify max_purchase in json file, i.e "max_purchase": 1000')

        study_name = values.get('study_name', "example_study")
        sampler = values.get('sampler', "tpe")
        trials = values.get('trials', 100)
        min_weight_no_cost = values.get('min_weight_no_cost', None)
        high_weight_additional_cost = values.get('high_weight_additional_cost', None)
        products = values.get('products', None)
        trucks = values.get('trucks', None)

    return (products, trucks, sampler, trials, max_purchase, min_weight_no_cost, high_weight_additional_cost, study_name)


def objective(trial):
    """
    Maximize profit.
    """
    gp = get_parameters()
    (products, trucks, _, _, max_purchase,
        min_weight_no_cost, high_weight_additional_cost, _) = gp

    # Ask the optimizer which product quantities to try.
    new_param = {}    
    for k, v in products.items():
        suggested_value = trial.suggest_int(k, v['min'], v['max'])  # get suggested value from sampler
        new_param.update({k: {'suggested': suggested_value,
                               'massperunit': v['massperunit'],
                               'buyprice': v['buyprice'],
                               'sellprice': v['sellprice']}})

    # Ask the sampler which truck to use, small, medium ....
    truck_max_wt, truck_cost = None, None
    if trucks is not None:
        truck = trial.suggest_categorical("truck", list(trucks.keys()))

        # Define truck limits based on suggested truck size.
        truck_max_wt = trucks[truck]['maxmass']
        truck_cost = trucks[truck]['cost']

    # If total wt or total amount is exceeded, we return a 0 profit.
    total_wt, total_buy, profit = 0, 0, 0
    for k, v in new_param.items():
        total_wt += v['suggested'] * v['massperunit']
        total_buy += v['suggested'] * v['buyprice']
        profit += v['suggested'] * (v['sellprice'] - v['buyprice'])

    # (1) Truck mass limit
    if truck_max_wt is not None:
        if total_wt > truck_max_wt:
            return 0

    # (2) Purchase limit amount
    if max_purchase is not None:
        if total_buy > max_purchase:
            return 0

    # Cost for higher transport weight
    cost_high_weight = 0
    if min_weight_no_cost is not None and high_weight_additional_cost is not None:
        excess_weight = total_wt - min_weight_no_cost
        if excess_weight > 0:
            cost_high_weight += excess_weight * high_weight_additional_cost

    # Cost for using a truck, can be small, medium etc.
    cost_truck_usage = 0
    if truck_cost is not None:
        cost_truck_usage += truck_cost

    # Total cost
    other_costs = cost_high_weight + cost_truck_usage
    trial.set_user_attr("other_costs", other_costs)

    # Adjust profit
    profit = profit - other_costs

    # Send this profit to optimizer so that it will consider this value
    # in its optimization algo and would suggest a better value next time we ask again.
    return profit


def return_of_investment(study, products):
    """
    Returns ROI.

    ROI = Return Of Investment
    ROI = 100 * profit/costs
    """
    product_sales, product_costs = 0, 0
    for (k, v), (k1, v1) in zip(products.items(), study.best_params.items()):
        if k == 'truck':
            continue
        assert k == k1
        product_sales += v1 * v['sellprice']
        product_costs += v1 * v['buyprice']
        
    other_costs = study.best_trial.user_attrs['other_costs']
    total_costs = product_costs + other_costs

    calculated_profit = product_sales - total_costs
    study_profit = study.best_trial.values[0]
    assert calculated_profit == study_profit
    
    return_of_investment = 100 * calculated_profit/total_costs

    return return_of_investment, product_sales, product_costs, other_costs


def main():
    # Read parameters.json file for user data input.
    gp = get_parameters()
    (products, trucks, optsampler, num_trials,
        max_purchase, _, _, study_name) = gp

    # Location of sqlite db where optimization session data are saved.
    sqlite_dbname = f'sqlite:///mydb_{study_name}.db'

    # Available samplers to use:
    # https://optuna.readthedocs.io/en/stable/reference/samplers.html
    # https://optuna.readthedocs.io/en/stable/reference/generated/optuna.integration.SkoptSampler.html
    # https://optuna.readthedocs.io/en/stable/reference/generated/optuna.integration.BoTorchSampler.html
    if optsampler.lower() == 'cmaes':
        sampler = optuna.samplers.CmaEsSampler(n_startup_trials=1, seed=100)
    elif optsampler.lower() == 'tpe':
        sampler = optuna.samplers.TPESampler(n_startup_trials=10, multivariate=False, group=False, seed=100, n_ei_candidates=24)
    else:
        print(f'Warning, {optsampler} is not supported, we will be using tpe sampler instead.')
        optsampler = 'tpe'
        sampler = optuna.samplers.TPESampler(n_startup_trials=10, multivariate=False, group=False, seed=100, n_ei_candidates=24)

    # Store optimization in storage and supports interrupt/resume.
    study = optuna.create_study(storage=sqlite_dbname, sampler=sampler, study_name=study_name, load_if_exists=True, direction='maximize')
    study.optimize(objective, n_trials=num_trials)

    # Show summary and best parameter values to maximize profit.
    print()
    print(f'study_name: {study_name}')
    print(f'sqlite dbname: {sqlite_dbname}')
    print(f'sampler: {optsampler}')
    print(f'trials: {num_trials}')
    print()

    print(f'Max Purchase Amount: {max_purchase}')
    print()

    print('Products being optimized:')
    for k, v in products.items():
        print(f'{k}: {v}')
    print()

    if trucks is not None:
        print('Trucks being optimized:')
        for k, v in trucks.items():
            print(f'{k}: {v}')
        print()

    print('Study/Optimization results:')
    objective_name = 'profit'
    print(f'best parameter value : {study.best_params}')
    print(f'best value           : {study.best_trial.values[0]}')
    print(f'best trial           : {study.best_trial.number}')
    print(f'objective            : {objective_name}')
    print()

    # Show other info like roi, etc.
    roi, product_sales, product_costs, other_costs = return_of_investment(study, products)
    print('Other info.:')    
    print(f'Return Of Investment : {roi:0.2f}%, profit/costs')
    print(f'Product Sales        : {product_sales:0.2f}')
    print(f'Product Costs        : {product_costs:0.2f}')
    print(f'Other Costs          : {other_costs:0.2f}')
    print(f'Total Costs          : {product_costs + other_costs:0.2f}')
    print(f'Profit               : {product_sales - (product_costs + other_costs):0.2f}')
    print(f'Capital              : {max_purchase:0.2f}')
    print(f'Total Spent          : {product_costs + other_costs:0.2f} ({100*(product_costs + other_costs)/max_purchase:0.2f}% of Capital)')
    print(f'Capital Balance      : {max_purchase - product_costs - other_costs:0.2f}')
    print()


if __name__ == '__main__':
    main()

 

Output

 study_name: st5_tpe
sqlite dbname: sqlite:///mydb_st5_tpe.db
sampler: tpe
trials: 1000

Max Purchase Amount: 7000

Products being optimized:
product1_qty: {'min': 20, 'max': 100, 'massperunit': 2, 'buyprice': 5, 'sellprice': 8}
product2_qty: {'min': 20, 'max': 100, 'massperunit': 4, 'buyprice': 6, 'sellprice': 10}
product3_qty: {'min': 20, 'max': 100, 'massperunit': 1, 'buyprice': 4, 'sellprice': 6}
product4_qty: {'min': 20, 'max': 100, 'massperunit': 2, 'buyprice': 7, 'sellprice': 10}
product5_qty: {'min': 20, 'max': 100, 'massperunit': 2, 'buyprice': 5, 'sellprice': 8}
product6_qty: {'min': 20, 'max': 100, 'massperunit': 1, 'buyprice': 5, 'sellprice': 7}
product7_qty: {'min': 20, 'max': 100, 'massperunit': 1, 'buyprice': 8, 'sellprice': 12}

Trucks being optimized:
smalltruck: {'maxmass': 1000, 'cost': 75}
mediumtruck: {'maxmass': 2000, 'cost': 150}
bigtruck: {'maxmass': 5000, 'cost': 400}

Study/Optimization results:
best parameter value : {'product1_qty': 99, 'product2_qty': 96, 'product3_qty': 93, 'product4_qty': 96, 'product5_qty': 100, 'product6_qty': 100, 'product7_qty': 100, 'truck': 'mediumtruck'}
best value           : 1771.5
best trial           : 865
objective            : profit

Other info.:
Return Of Investment : 42.19%, profit/costs
Product Sales        : 5970.00
Product Costs        : 3915.00
Other Costs          : 283.50
Total Costs          : 4198.50
Profit               : 1771.50
Capital              : 7000.00
Total Spent          : 4198.50 (59.98% of Capital)
Capital Balance      : 2801.50
 

If you increase the number of trials, the program may be able to find more profitable parameter values.

Comments:

1. I tried this, but unfortunately it was incredibly slow. Still, thank you for the great code samples.

2. It can indeed be slow, especially if you have more products and a wide range (max - min). Can you give an example of the number of parameters and the quantity ranges? The truck selection also contributes to slower optimization. Have you tried another solution, with scipy for example?

3. I haven't tried scipy yet, but I tried MIP with OR-Tools (suggested in a comment on my original question), and it ran quite fast.

4. Right, I tested ortools and it is indeed very fast. scipy is also very fast.

Answer #2:

Another option is to use scipy. The sample below has 3 products, which of course can be scaled up. The constraints are the purchase limit and the truck's maximum load capacity.

Code

 """
shipping_trading_solver.py

Ref: https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html#scipy.optimize.minimize
"""


from scipy.optimize import minimize


# Constants
sellprice = [8, 7, 10]
buyprice = [6, 5, 6]
mass_per_unit = [1, 2, 3]

purchase_limit = 100
truck_mass_limit = 70


def objective(x):
    """
    Objective; return profit negated so that minimize() maximizes it.
    x: quantity
    """
    profit = 0
    for (v, s, b) in zip(x, sellprice, buyprice):
        profit += v * (s - b)

    return -profit


def purchase_cons(x):
    """
    Purchase-limit constraint.
    x: quantity
    """
    purchases = 0
    for (v, b) in zip(x, buyprice):
        purchases += v * b

    return purchase_limit - purchases  # must be non-negative


def mass_cons(x):
    """
    Truck mass-limit constraint.
    mass = qty * mass/qty
    x: quantity
    """
    mass = 0
    for (v, m) in zip(x, mass_per_unit):
        mass += v * m

    return truck_mass_limit - mass  # must be non-negative


def profit_cons(x):
    """
    Non-negative-profit constraint.
    x: quantity
    """
    profit = 0
    for (v, s, b) in zip(x, sellprice, buyprice):
        profit += v * (s - b)

    return profit  # must be non-negative


def main():
    # Define constraints. Note: ineq=non-negative, eq=zero
    cons = (
        {'type': 'ineq', 'fun': purchase_cons},
        {'type': 'ineq', 'fun': mass_cons},
        {'type': 'ineq', 'fun': profit_cons}
    )

    # Bounds of product quantity, (min,max)
    bound = ((0, 50), (0, 20), (0, 30))

    # Initial values
    init_values = (0, 0, 0)

    # Start minimizing
    # SLSQP = Sequential Least Squares Programming
    res = minimize(objective, init_values, method='SLSQP', bounds=bound, constraints=cons)

    # Show summary
    print('Results summary:')
    print(f'optimization message: {res.message}')
    print(f'success status: {res.success}')
    print(f'profit: {sum([(s-b) * int(x) for (x, s, b) in zip(res.x, sellprice, buyprice)]):0.1f}')
    print(f'best param values: {[int(v) for v in res.x]}')
    print()

    # Verify results
    print('Verify purchase and mass limits:')

    # (1) purchases
    total_purchases = 0
    for (qty, b) in zip(res.x, buyprice):
        total_purchases += int(qty) * b
    print(f'actual total_purchases: {total_purchases:0.1f}, purchase_limit: {purchase_limit}')

    # (2) mass
    total_mass = 0    
    for (qty, m) in zip(res.x, mass_per_unit):
        total_mass += int(qty) * m
    print(f'actual total_mass: {total_mass:0.1f}, truck_mass_limit: {truck_mass_limit}')


if __name__ == '__main__':
    main()

 

Output

 Results summary:
optimization message: Optimization terminated successfully
success status: True
profit: 64.0
best param values: [0, 0, 16]

Verify purchase and mass limits:
actual total_purchases: 96.0, purchase_limit: 100
actual total_mass: 48.0, truck_mass_limit: 70
 
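One caveat about the result above: SLSQP optimizes the quantities as continuous variables, and the answer truncates them with int() afterwards, which in general is not guaranteed to give the best integer plan. For these small bounds, an exhaustive integer search over the same data confirms that profit 64 with plan (0, 0, 16) happens to be the true integer optimum here:

```python
from itertools import product

# same data as the scipy answer
sellprice = [8, 7, 10]
buyprice = [6, 5, 6]
mass_per_unit = [1, 2, 3]
purchase_limit = 100
truck_mass_limit = 70
bounds = [(0, 50), (0, 20), (0, 30)]

def brute_force():
    """Enumerate every integer plan within the bounds; keep the best feasible one."""
    best_profit, best_plan = 0, (0, 0, 0)
    for plan in product(*(range(lo, hi + 1) for lo, hi in bounds)):
        cost = sum(q * b for q, b in zip(plan, buyprice))
        mass = sum(q * m for q, m in zip(plan, mass_per_unit))
        if cost > purchase_limit or mass > truck_mass_limit:
            continue  # violates the budget or the truck capacity
        profit = sum(q * (s - b) for q, s, b in zip(plan, sellprice, buyprice))
        if profit > best_profit:
            best_profit, best_plan = profit, plan
    return best_profit, best_plan

print(brute_force())  # -> (64, (0, 0, 16))
```

On problems with larger bounds, enumeration becomes infeasible and rounding a continuous solution can be noticeably suboptimal; that is where a proper MIP solver pays off.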

Answer #3:

I'm the mystic author. First off, mystic is not the best code to use on this problem… a good linear MIP solver like OR-Tools would be the better choice. Mystic will solve MIP/LP problems reliably, just not as fast as OR-Tools. Speed-wise, mystic is roughly as fast as scipy.optimize. Mystic can slow down as the constraints become more nonlinear, complex, and tightly constrained (note that in those cases other codes generally fail, while mystic does not). Below, I'll use the differential evolution solver (which is slower, but more robust, than SLSQP).

Note that as soon as you have one or more nonlinear constraints, you should definitely use mystic… as mystic is built for global optimization with nonlinear constraints. Or, if instead of a fixed-price model you have some market variability in the model, and hence uncertainty… and want to maximize expected profit, or better yet build a profit model that minimizes risk, then you should definitely use mystic. OR-tools and other LP/QP codes would, at best, have to approximate the problem as linear or quadratic, which may not be practical.
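As a rough illustration of the penalty idea that mystic's generate_penalty automates (this generic sketch is not mystic's API): an infeasible candidate gets a large penalty added to its objective, so a gradient-free global solver is steered back toward the feasible region:

```python
def penalized(objective, constraints, weight=1e6):
    """Wrap an objective so constraint violations are charged a quadratic penalty.

    constraints: list of functions g(x) that must satisfy g(x) <= 0.
    """
    def wrapped(x):
        violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return objective(x) + weight * violation
    return wrapped

# minimize -profit (i.e. maximize profit) for a single item:
# profit = 3*q, with mass 2*q <= 10 and spend 5*q <= 40
f = penalized(lambda x: -3 * x[0],
              [lambda x: 2 * x[0] - 10,   # mass limit
               lambda x: 5 * x[0] - 40])  # budget limit
print(f([5]), f([6]))  # feasible -> -15.0; infeasible -> huge penalized value
```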

Regardless, since you asked about using mystic on this problem, here is one of the many ways to find a solution with mystic:

 import mystic as my
import mystic.symbolic as ms
import mystic.constraints as mc

class item(object):
    def __init__(self, id, mass, buy, net, limit):
        self.id = id
        self.mass = mass
        self.buy = buy
        self.net = net
        self.limit = limit
    def __repr__(self):
        return 'item(%s, mass=%s, buy=%s, net=%s, limit=%s)' % (self.id, self.mass, self.buy, self.net, self.limit)

# data
masses = [10, 15, 20, 18, 34, 75, 11, 49, 68, 55]
buys = [123, 104, 149, 175, 199, 120, 164, 136, 194, 111]
nets = [13, 24, 10, 29, 29, 39, 28, 35, 33, 39]
limits = [300, 500, 200, 300, 200, 350, 100, 600, 1000, 50]
ids = range(len(limits))

# maxima
_load = 75000  # max limit on mass can carry
_spend = 350000  # max limit to spend at source

# items
items = [item(*i) for i in zip(ids, masses, buys, nets, limits)]

# profit
def fixnet(net):
    def profit(x):
        return sum(xi*pi for xi,pi in zip(x,net))
    return profit

profit = fixnet([i.net for i in items])

# item constraints
load = [i.mass for i in items]
invest = [i.buy for i in items]
constraints = ms.linear_symbolic(G=[load, invest], h=[_load, _spend])

# bounds (on x)
bounds = [(0, i.limit) for i in items]

# bounds constraints
lo = 'x%s >= %s'
lo = '\n'.join(lo % (i,str(float(j[0])).lstrip('0')) for (i,j) in enumerate(bounds))
hi = 'x%s <= %s'
hi = '\n'.join(hi % (i,str(float(j[1])).lstrip('0')) for (i,j) in enumerate(bounds))
constraints = '\n'.join([lo, hi]).strip() + '\n' + constraints
pf = ms.generate_penalty(ms.generate_conditions(ms.simplify(constraints)))

# integer constraints
cf = mc.integers(float)(lambda x:x)

# solve
mon = my.monitors.VerboseMonitor(1, 10)
results = my.solvers.diffev2(lambda x: -profit(x), bounds, npop=400, bounds=bounds, ftol=1e-4, gtol=50, itermon=mon, disp=True, full_output=True, constraints=cf, penalty=pf)

print ('\nmax profit: %s' % -results[1])
print("load: %s <= %s" % (sum(i*j for i,j in zip(results[0], load)), _load))
print("spend: %s <= %s" % (sum(i*j for i,j in zip(results[0], invest)), _spend))
print('')

for item,quantity in enumerate(results[0]):
  print("item %s: %s" % (item,quantity))
 

with results (abridged; runs of generations where ChiSquare is unchanged are omitted):

 dude@borel>$ python knapsack.py 
Generation 0 has ChiSquare: -58080.000000
Generation 0 has fit parameters:
 [139.0, 413.0, 100.0, 271.0, 136.0, 344.0, 86.0, 404.0, 103.0, 5.0]
Generation 11 has ChiSquare: -58603.000000
Generation 18 has ChiSquare: -58607.000000
Generation 20 has fit parameters:
 [198.0, 406.0, 28.0, 256.0, 139.0, 239.0, 43.0, 472.0, 185.0, 36.0]
Generation 21 has ChiSquare: -59118.000000
Generation 22 has ChiSquare: -59944.000000
Generation 29 has ChiSquare: -60765.000000
Generation 30 has fit parameters:
 [214.0, 430.0, 24.0, 295.0, 154.0, 123.0, 77.0, 541.0, 219.0, 33.0]
Generation 36 has ChiSquare: -61045.000000
Generation 40 has fit parameters:
 [296.0, 496.0, 112.0, 292.0, 144.0, 136.0, 94.0, 347.0, 315.0, 27.0]
Generation 49 has ChiSquare: -62106.000000
Generation 50 has fit parameters:
 [295.0, 470.0, 114.0, 216.0, 170.0, 73.0, 83.0, 598.0, 225.0, 29.0]
Generation 56 has ChiSquare: -62224.000000
Generation 60 has fit parameters:
 [247.0, 441.0, 38.0, 288.0, 200.0, 175.0, 85.0, 499.0, 206.0, 11.0]
Generation 71 has ChiSquare: -63795.000000
Generation 80 has fit parameters:
 [263.0, 494.0, 110.0, 293.0, 198.0, 246.0, 92.0, 529.0, 52.0, 20.0]
Generation 102 has ChiSquare: -64252.000000
Generation 110 has fit parameters:
 [279.0, 479.0, 96.0, 295.0, 192.0, 182.0, 83.0, 582.0, 84.0, 38.0]
Generation 124 has ChiSquare: -64368.000000
Generation 130 has fit parameters:
 [292.0, 493.0, 144.0, 297.0, 174.0, 195.0, 75.0, 586.0, 53.0, 43.0]
Generation 139 has ChiSquare: -64735.000000
Generation 140 has fit parameters:
 [289.0, 483.0, 54.0, 293.0, 184.0, 263.0, 95.0, 599.0, 0.0, 29.0]
Generation 162 has ChiSquare: -64897.000000
Generation 163 has ChiSquare: -65223.000000
Generation 170 has fit parameters:
 [295.0, 498.0, 67.0, 299.0, 190.0, 212.0, 95.0, 577.0, 47.0, 49.0]
Generation 197 has ChiSquare: -65223.000000
Generation 198 has ChiSquare: -65223.000000
Generation 199 has ChiSquare: -65223.000000
Generation 200 has ChiSquare: -65223.000000
Generation 200 has fit parameters:
 [295.0, 498.0, 67.0, 299.0, 190.0, 212.0, 95.0, 577.0, 47.0, 49.0]
Generation 201 has ChiSquare: -65340.000000
Generation 202 has ChiSquare: -65340.000000
Generation 203 has ChiSquare: -65340.000000
Generation 204 has ChiSquare: -65340.000000
Generation 205 has ChiSquare: -65340.000000
Generation 206 has ChiSquare: -65340.000000
Generation 207 has ChiSquare: -65340.000000
Generation 208 has ChiSquare: -65340.000000
Generation 209 has ChiSquare: -65340.000000
Generation 210 has ChiSquare: -65340.000000
Generation 210 has fit parameters:
 [298.0, 500.0, 36.0, 297.0, 199.0, 176.0, 94.0, 583.0, 87.0, 50.0]
Generation 211 has ChiSquare: -65340.000000
Generation 212 has ChiSquare: -65340.000000
Generation 213 has ChiSquare: -65340.000000
Generation 214 has ChiSquare: -65340.000000
Generation 215 has ChiSquare: -65340.000000
Generation 216 has ChiSquare: -65340.000000
Generation 217 has ChiSquare: -65340.000000
Generation 218 has ChiSquare: -65340.000000
Generation 219 has ChiSquare: -65340.000000
Generation 220 has ChiSquare: -65340.000000
Generation 220 has fit parameters:
 [298.0, 500.0, 36.0, 297.0, 199.0, 176.0, 94.0, 583.0, 87.0, 50.0]
Generation 221 has ChiSquare: -65340.000000
Generation 222 has ChiSquare: -65340.000000
Generation 223 has ChiSquare: -65340.000000
Generation 224 has ChiSquare: -65340.000000
Generation 225 has ChiSquare: -65340.000000
Generation 226 has ChiSquare: -65340.000000
Generation 227 has ChiSquare: -65340.000000
Generation 228 has ChiSquare: -65340.000000
Generation 229 has ChiSquare: -65449.000000
Generation 230 has ChiSquare: -65449.000000
Generation 230 has fit parameters:
 [285.0, 498.0, 18.0, 296.0, 199.0, 181.0, 96.0, 596.0, 83.0, 49.0]
Generation 231 has ChiSquare: -65449.000000
Generation 232 has ChiSquare: -65449.000000
Generation 233 has ChiSquare: -65449.000000
Generation 234 has ChiSquare: -65449.000000
Generation 235 has ChiSquare: -65449.000000
Generation 236 has ChiSquare: -65449.000000
Generation 237 has ChiSquare: -65449.000000
Generation 238 has ChiSquare: -65449.000000
Generation 239 has ChiSquare: -65449.000000
Generation 240 has ChiSquare: -65449.000000
Generation 240 has fit parameters:
 [285.0, 498.0, 18.0, 296.0, 199.0, 181.0, 96.0, 596.0, 83.0, 49.0]
Generation 241 has ChiSquare: -65449.000000
Generation 242 has ChiSquare: -65449.000000
Generation 243 has ChiSquare: -65449.000000
Generation 244 has ChiSquare: -65449.000000
Generation 245 has ChiSquare: -65449.000000
Generation 246 has ChiSquare: -65449.000000
Generation 247 has ChiSquare: -65456.000000
Generation 248 has ChiSquare: -65456.000000
Generation 249 has ChiSquare: -65456.000000
Generation 250 has ChiSquare: -65456.000000
Generation 250 has fit parameters:
 [297.0, 498.0, 76.0, 300.0, 192.0, 195.0, 97.0, 588.0, 57.0, 47.0]
Generation 251 has ChiSquare: -65456.000000
Generation 252 has ChiSquare: -65456.000000
Generation 253 has ChiSquare: -65456.000000
Generation 254 has ChiSquare: -65622.000000
Generation 255 has ChiSquare: -65622.000000
Generation 256 has ChiSquare: -65622.000000
Generation 257 has ChiSquare: -65622.000000
Generation 258 has ChiSquare: -65622.000000
Generation 259 has ChiSquare: -65622.000000
Generation 260 has ChiSquare: -65622.000000
Generation 260 has fit parameters:
 [294.0, 496.0, 138.0, 300.0, 199.0, 221.0, 99.0, 590.0, 4.0, 48.0]
Generation 261 has ChiSquare: -65622.000000
Generation 262 has ChiSquare: -65622.000000
Generation 263 has ChiSquare: -65622.000000
Generation 264 has ChiSquare: -65622.000000
Generation 265 has ChiSquare: -65622.000000
Generation 266 has ChiSquare: -65622.000000
Generation 267 has ChiSquare: -65622.000000
Generation 268 has ChiSquare: -65622.000000
Generation 269 has ChiSquare: -65622.000000
Generation 270 has ChiSquare: -65622.000000
Generation 270 has fit parameters:
 [294.0, 496.0, 138.0, 300.0, 199.0, 221.0, 99.0, 590.0, 4.0, 48.0]
Generation 271 has ChiSquare: -65622.000000
Generation 272 has ChiSquare: -65622.000000
Generation 273 has ChiSquare: -65622.000000
Generation 274 has ChiSquare: -65622.000000
Generation 275 has ChiSquare: -65622.000000
Generation 276 has ChiSquare: -65622.000000
Generation 277 has ChiSquare: -65622.000000
Generation 278 has ChiSquare: -65622.000000
Generation 279 has ChiSquare: -65622.000000
Generation 280 has ChiSquare: -65622.000000
Generation 280 has fit parameters:
 [294.0, 496.0, 138.0, 300.0, 199.0, 221.0, 99.0, 590.0, 4.0, 48.0]
Generation 281 has ChiSquare: -65622.000000
Generation 282 has ChiSquare: -65622.000000
Generation 283 has ChiSquare: -65622.000000
Generation 284 has ChiSquare: -65622.000000
Generation 285 has ChiSquare: -65622.000000
Generation 286 has ChiSquare: -65622.000000
Generation 287 has ChiSquare: -65622.000000
Generation 288 has ChiSquare: -65622.000000
Generation 289 has ChiSquare: -65622.000000
Generation 290 has ChiSquare: -65622.000000
Generation 290 has fit parameters:
 [294.0, 496.0, 138.0, 300.0, 199.0, 221.0, 99.0, 590.0, 4.0, 48.0]
Generation 291 has ChiSquare: -65644.000000
Generation 292 has ChiSquare: -65644.000000
Generation 293 has ChiSquare: -65691.000000
Generation 294 has ChiSquare: -65691.000000
Generation 295 has ChiSquare: -65691.000000
Generation 296 has ChiSquare: -65691.000000
Generation 297 has ChiSquare: -65691.000000
Generation 298 has ChiSquare: -65691.000000
Generation 299 has ChiSquare: -65691.000000
Generation 300 has ChiSquare: -65691.000000
Generation 300 has fit parameters:
 [298.0, 500.0, 148.0, 300.0, 194.0, 206.0, 96.0, 600.0, 15.0, 46.0]
Generation 301 has ChiSquare: -65691.000000
Generation 302 has ChiSquare: -65691.000000
Generation 303 has ChiSquare: -65703.000000
Generation 304 has ChiSquare: -65703.000000
Generation 305 has ChiSquare: -65703.000000
Generation 306 has ChiSquare: -65703.000000
Generation 307 has ChiSquare: -65703.000000
Generation 308 has ChiSquare: -65703.000000
Generation 309 has ChiSquare: -65703.000000
Generation 310 has ChiSquare: -65703.000000
Generation 310 has fit parameters:
 [296.0, 497.0, 38.0, 299.0, 198.0, 232.0, 96.0, 599.0, 17.0, 48.0]
Generation 311 has ChiSquare: -65703.000000
Generation 312 has ChiSquare: -65703.000000
Generation 313 has ChiSquare: -65703.000000
Generation 314 has ChiSquare: -65703.000000
Generation 315 has ChiSquare: -65703.000000
Generation 316 has ChiSquare: -65703.000000
Generation 317 has ChiSquare: -65773.000000
Generation 318 has ChiSquare: -65773.000000
Generation 319 has ChiSquare: -65773.000000
Generation 320 has ChiSquare: -65773.000000
Generation 320 has fit parameters:
 [294.0, 499.0, 35.0, 299.0, 200.0, 244.0, 96.0, 600.0, 0.0, 50.0]
Generation 321 has ChiSquare: -65773.000000
Generation 322 has ChiSquare: -65773.000000
Generation 323 has ChiSquare: -65773.000000
Generation 324 has ChiSquare: -65773.000000
Generation 325 has ChiSquare: -65773.000000
Generation 326 has ChiSquare: -65773.000000
Generation 327 has ChiSquare: -65773.000000
Generation 328 has ChiSquare: -65773.000000
Generation 329 has ChiSquare: -65773.000000
Generation 330 has ChiSquare: -65773.000000
Generation 330 has fit parameters:
 [294.0, 499.0, 35.0, 299.0, 200.0, 244.0, 96.0, 600.0, 0.0, 50.0]
Generation 331 has ChiSquare: -65773.000000
Generation 332 has ChiSquare: -65773.000000
Generation 333 has ChiSquare: -65773.000000
Generation 334 has ChiSquare: -65773.000000
Generation 335 has ChiSquare: -65773.000000
Generation 336 has ChiSquare: -65773.000000
Generation 337 has ChiSquare: -65773.000000
Generation 338 has ChiSquare: -65774.000000
Generation 339 has ChiSquare: -65774.000000
Generation 340 has ChiSquare: -65774.000000
Generation 340 has fit parameters:
 [290.0, 500.0, 49.0, 298.0, 199.0, 243.0, 99.0, 596.0, 6.0, 46.0]
Generation 341 has ChiSquare: -65774.000000
Generation 342 has ChiSquare: -65774.000000
Generation 343 has ChiSquare: -65774.000000
Generation 344 has ChiSquare: -65774.000000
Generation 345 has ChiSquare: -65774.000000
Generation 346 has ChiSquare: -65774.000000
Generation 347 has ChiSquare: -65774.000000
Generation 348 has ChiSquare: -65774.000000
Generation 349 has ChiSquare: -65774.000000
Generation 350 has ChiSquare: -65774.000000
Generation 350 has fit parameters:
 [290.0, 500.0, 49.0, 298.0, 199.0, 243.0, 99.0, 596.0, 6.0, 46.0]
Generation 351 has ChiSquare: -65774.000000
Generation 352 has ChiSquare: -65774.000000
Generation 353 has ChiSquare: -65774.000000
Generation 354 has ChiSquare: -65779.000000
Generation 355 has ChiSquare: -65779.000000
Generation 356 has ChiSquare: -65779.000000
Generation 357 has ChiSquare: -65779.000000
Generation 358 has ChiSquare: -65779.000000
Generation 359 has ChiSquare: -65779.000000
Generation 360 has ChiSquare: -65779.000000
Generation 360 has fit parameters:
 [299.0, 500.0, 87.0, 300.0, 198.0, 229.0, 97.0, 596.0, 12.0, 43.0]
Generation 361 has ChiSquare: -65888.000000
Generation 362 has ChiSquare: -65888.000000
Generation 363 has ChiSquare: -65888.000000
Generation 364 has ChiSquare: -65888.000000
Generation 365 has ChiSquare: -65888.000000
Generation 366 has ChiSquare: -65888.000000
Generation 367 has ChiSquare: -65895.000000
Generation 368 has ChiSquare: -65895.000000
Generation 369 has ChiSquare: -65895.000000
Generation 370 has ChiSquare: -65895.000000
Generation 370 has fit parameters:
 [300.0, 500.0, 50.0, 300.0, 198.0, 231.0, 99.0, 599.0, 12.0, 49.0]
Generation 371 has ChiSquare: -65895.000000
Generation 372 has ChiSquare: -65895.000000
Generation 373 has ChiSquare: -65895.000000
Generation 374 has ChiSquare: -65895.000000
Generation 375 has ChiSquare: -65895.000000
Generation 376 has ChiSquare: -65895.000000
Generation 377 has ChiSquare: -65895.000000
Generation 378 has ChiSquare: -65895.000000
Generation 379 has ChiSquare: -65895.000000
Generation 380 has ChiSquare: -65895.000000
Generation 380 has fit parameters:
 [300.0, 500.0, 50.0, 300.0, 198.0, 231.0, 99.0, 599.0, 12.0, 49.0]
Generation 381 has ChiSquare: -65895.000000
Generation 382 has ChiSquare: -65895.000000
Generation 383 has ChiSquare: -65895.000000
Generation 384 has ChiSquare: -65895.000000
Generation 385 has ChiSquare: -65895.000000
Generation 386 has ChiSquare: -65895.000000
Generation 387 has ChiSquare: -65895.000000
Generation 388 has ChiSquare: -65895.000000
Generation 389 has ChiSquare: -65895.000000
Generation 390 has ChiSquare: -65895.000000
Generation 390 has fit parameters:
 [300.0, 500.0, 50.0, 300.0, 198.0, 231.0, 99.0, 599.0, 12.0, 49.0]
Generation 391 has ChiSquare: -65895.000000
Generation 392 has ChiSquare: -65895.000000
Generation 393 has ChiSquare: -65895.000000
Generation 394 has ChiSquare: -65895.000000
Generation 395 has ChiSquare: -65895.000000
Generation 396 has ChiSquare: -65966.000000
Generation 397 has ChiSquare: -65966.000000
Generation 398 has ChiSquare: -65966.000000
Generation 399 has ChiSquare: -65966.000000
Generation 400 has ChiSquare: -65966.000000
Generation 400 has fit parameters:
 [299.0, 499.0, 21.0, 299.0, 200.0, 249.0, 100.0, 597.0, 2.0, 50.0]
Generation 401 has ChiSquare: -65966.000000
Generation 402 has ChiSquare: -65966.000000
Generation 403 has ChiSquare: -65966.000000
Generation 404 has ChiSquare: -65966.000000
Generation 405 has ChiSquare: -65966.000000
Generation 406 has ChiSquare: -65966.000000
Generation 407 has ChiSquare: -65966.000000
Generation 408 has ChiSquare: -65966.000000
Generation 409 has ChiSquare: -65966.000000
Generation 410 has ChiSquare: -65966.000000
Generation 410 has fit parameters:
 [299.0, 499.0, 21.0, 299.0, 200.0, 249.0, 100.0, 597.0, 2.0, 50.0]
Generation 411 has ChiSquare: -65966.000000
Generation 412 has ChiSquare: -65966.000000
Generation 413 has ChiSquare: -65966.000000
Generation 414 has ChiSquare: -65966.000000
Generation 415 has ChiSquare: -65966.000000
Generation 416 has ChiSquare: -65966.000000
Generation 417 has ChiSquare: -65966.000000
Generation 418 has ChiSquare: -65966.000000
Generation 419 has ChiSquare: -65966.000000
Generation 420 has ChiSquare: -65966.000000
Generation 420 has fit parameters:
 [299.0, 499.0, 21.0, 299.0, 200.0, 249.0, 100.0, 597.0, 2.0, 50.0]
Generation 421 has ChiSquare: -65966.000000
Generation 422 has ChiSquare: -65966.000000
Generation 423 has ChiSquare: -65966.000000
Generation 424 has ChiSquare: -65966.000000
Generation 425 has ChiSquare: -65966.000000
Generation 426 has ChiSquare: -65966.000000
Generation 427 has ChiSquare: -65966.000000
Generation 428 has ChiSquare: -65966.000000
Generation 429 has ChiSquare: -65966.000000
Generation 430 has ChiSquare: -65966.000000
Generation 430 has fit parameters:
 [299.0, 499.0, 21.0, 299.0, 200.0, 249.0, 100.0, 597.0, 2.0, 50.0]
Generation 431 has ChiSquare: -65966.000000
Generation 432 has ChiSquare: -65966.000000
Generation 433 has ChiSquare: -65966.000000
Generation 434 has ChiSquare: -65966.000000
Generation 435 has ChiSquare: -65966.000000
Generation 436 has ChiSquare: -65966.000000
Generation 437 has ChiSquare: -65966.000000
Generation 438 has ChiSquare: -65966.000000
Generation 439 has ChiSquare: -65966.000000
Generation 440 has ChiSquare: -65966.000000
Generation 440 has fit parameters:
 [299.0, 499.0, 21.0, 299.0, 200.0, 249.0, 100.0, 597.0, 2.0, 50.0]
Generation 441 has ChiSquare: -65966.000000
Generation 442 has ChiSquare: -65966.000000
Generation 443 has ChiSquare: -65966.000000
Generation 444 has ChiSquare: -65966.000000
Generation 445 has ChiSquare: -65966.000000
STOP("ChangeOverGeneration with {'tolerance': 0.0001, 'generations': 50}")
Optimization terminated successfully.
         Current function value: -65966.000000
         Iterations: 445
         Function evaluations: 178400

max profit: 65966.0
load: 74991.0 <= 75000
spend: 317337.0 <= 350000

item 0: 299.0
item 1: 499.0
item 2: 21.0
item 3: 299.0
item 4: 200.0
item 5: 249.0
item 6: 100.0
item 7: 597.0
item 8: 2.0
item 9: 50.0
 

This is my first attempt at a solution, and the solver isn't tuned yet. You can see there's probably still some room for improvement, since the convergence at the end is abrupt rather than smooth; however, I'd guess the solution is close to optimal (based on checking the constraints). I'd experiment a bit with the solver settings and with how the constraints/penalties are imposed to see whether the solution can be improved slightly.
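Since the solver is stochastic, it's worth sanity-checking any candidate quantity vector against both limits (and the per-item purchase caps) independently of the solver. A minimal stdlib-only sketch of such a check — the item data below is made up purely for illustration; substitute your actual buy/sell prices, unit masses, and purchase limits:

```python
# Sanity-check a candidate quantity vector against both constraints.
# NOTE: the item data and limits here are hypothetical examples.
items = [
    {"buy": 10, "sell": 15, "mass": 2, "limit": 300},
    {"buy": 20, "sell": 28, "mass": 5, "limit": 500},
]
MAX_MASS = 75000    # truck capacity
MAX_SPEND = 350000  # investment cap

def evaluate(quantities):
    """Return (profit, load, spend, feasible) for a quantity vector."""
    profit = sum(q * (it["sell"] - it["buy"]) for q, it in zip(quantities, items))
    load = sum(q * it["mass"] for q, it in zip(quantities, items))
    spend = sum(q * it["buy"] for q, it in zip(quantities, items))
    feasible = (
        load <= MAX_MASS
        and spend <= MAX_SPEND
        and all(0 <= q <= it["limit"] for q, it in zip(quantities, items))
    )
    return profit, load, spend, feasible

profit, load, spend, ok = evaluate([300, 500])
print(f"profit={profit} load={load} spend={spend} feasible={ok}")
```

This mirrors the `max profit` / `load` / `spend` summary printed above, so the same function can double as the (negated) objective and as a feasibility check on the solver's output.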