```python
import polars as pl
import xgboost as xgb
from sklearn.model_selection import GroupShuffleSplit

xgb_params = {
    "objective": "reg:absoluteerror",
    "eval_metric": "mae",
    "seed": 420,
    "booster": "gbtree",
    "device": "cuda",
    "eta": 0.01,
    "gamma": 5,
    "max_depth": 128,
    "lambda": 10,
    "alpha": 2,
    "max_leaves": 256,
}
xgb.set_config(verbosity=0)

cv = GroupShuffleSplit(n_splits=5, test_size=0.1)
xgb_models_list = []
for train_ind, test_ind in cv.split(
    temp_train_df.drop('sales').to_numpy(),
    temp_train_df.get_column('sales').to_numpy(),
    groups=temp_train_df.get_column('warehouse').to_numpy(),
):
    X_train = temp_train_df.drop('sales').with_row_index().filter(pl.col('index').is_in(train_ind)).to_numpy()
    X_test = temp_train_df.drop('sales').with_row_index().filter(pl.col('index').is_in(test_ind)).to_numpy()
    y_train = temp_train_df.select('sales').with_row_index().filter(pl.col('index').is_in(train_ind)).get_column('sales').to_numpy()
    y_test = temp_train_df.select('sales').with_row_index().filter(pl.col('index').is_in(test_ind)).get_column('sales').to_numpy()

    dtrain = xgb.DMatrix(X_train, label=y_train)
    dval = xgb.DMatrix(X_test, label=y_test)

    print("Training XGB model")
    xgb_model = xgb.train(
        xgb_params,
        dtrain,
        num_boost_round=4000,
        evals=[(dtrain, "Train"), (dval, "Valid")],
        callbacks=[
            xgb.callback.EarlyStopping(rounds=200),
            xgb.callback.EvaluationMonitor(period=500, show_stdv=False, rank=0),
        ],
    )
```
```
Training XGB model
[0] Train-mae:90.99591 Valid-mae:79.83470
[0] Train-mae:90.99591 Valid-mae:79.83470
[1] Train-mae:90.81372 Valid-mae:79.78599
[2] Train-mae:90.62106 Valid-mae:79.74804
[3] Train-mae:90.44182 Valid-mae:79.70253
[4] Train-mae:90.26843 Valid-mae:79.66690
[5] Train-mae:90.10026 Valid-mae:79.62726
[6] Train-mae:89.93129 Valid-mae:79.60968
[7] Train-mae:89.76939 Valid-mae:79.57884
[8] Train-mae:89.60516 Valid-mae:79.54222
[9] Train-mae:89.44614 Valid-mae:79.50914
[10] Train-mae:89.28421 Valid-mae:79.48071
[11] Train-mae:89.11796 Valid-mae:79.45545
[12] Train-mae:88.95689 Valid-mae:79.42221
[13] Train-mae:88.79400 Valid-mae:79.39508
[14] Train-mae:88.64471 Valid-mae:79.36411
[15] Train-mae:88.48538 Valid-mae:79.33596
[16] Train-mae:88.32891 Valid-mae:79.30990
[17] Train-mae:88.19119 Valid-mae:79.28522
```
I also tried the `XGBRegressor` class, but it doesn't accept a `callbacks` argument, and `xgb.cv` gives a "could not convert string to float" error.

Please help!
Hi, if you are passing the `EvaluationMonitor` yourself, then you should set `verbose_eval=False` for the `train` function.
> I tried using xgbRegressor class it doesn't accept callback argument
It does. In the constructor:
```python
reg = xgb.XGBRegressor(callbacks=[MyCallback()])
# or
reg.set_params(callbacks=[MyCallback()])
```