Why do xgboost.cv and sklearn.model_selection.validation_curve differ so much in running time?

NOTE: I have seen this similar question, but it does not solve my problem.

XGBoost has its own native cross-validation function, xgboost.cv. However, I want to use the sklearn interface, which is easier to combine with the rest of the sklearn ecosystem, so I use sklearn.model_selection.validation_curve instead. Both functions return the same train_mae_mean and test_mae_mean, but their running times differ enormously: the former (xgboost.cv) takes a few seconds, while the latter (validation_curve) takes about one minute!

I cannot figure out whether xgboost and sklearn implement cross-validation differently, or whether I have made a mistake somewhere.

The code is written in a Jupyter notebook.

import numpy as np
import xgboost as xgb

from sklearn.datasets import make_regression
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.model_selection import cross_validate
from sklearn.model_selection import KFold
from sklearn.model_selection import validation_curve

X, y = make_regression(n_samples=10000, n_features=10)

n_estimators = 50
params = {'n_estimators':n_estimators, 'booster':'gbtree', 'max_depth':5, 'learning_rate':0.05,
          'objective':'reg:squarederror', 'subsample':1, 'colsample_bytree':1}
clf = xgb.XGBRegressor(**params)

cv = KFold(n_splits=5, shuffle=True, random_state=100)

%%time

estimator_range = range(1, n_estimators+1)
train_score, test_score = validation_curve(
    clf, X, y, param_name='n_estimators', param_range=estimator_range,
    cv=cv, scoring='neg_mean_absolute_error'
)
# scoring='neg_mean_absolute_error' returns negated MAE, so take the absolute value
print('train_mae_mean:\n', np.abs(train_score).mean(axis=1))
print('test_mae_mean:\n', np.abs(test_score).mean(axis=1))

validation_curve takes about 57 seconds.
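
As far as I can tell, validation_curve clones clf and refits it from scratch for every (param value, fold) pair, so this one cell trains a lot of boosters. A rough count under that assumption (which may be wrong):

# Rough count of what the validation_curve cell trains, assuming one
# fresh fit per (param value, fold) pair:
n_models = len(estimator_range) * cv.get_n_splits()   # 50 * 5 = 250 boosters
n_trees = sum(estimator_range) * cv.get_n_splits()    # (1 + 2 + ... + 50) * 5 = 6375 trees
print(n_models, n_trees)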

%%time

params_xgb = params.copy() # adapt the parameter dict for the native API
num_round = params_xgb['n_estimators']
params_xgb['eta'] = params['learning_rate']
del params_xgb['n_estimators']
del params_xgb['learning_rate']

# cross-validation with the native xgboost interface
res = xgb.cv(params_xgb, xgb.DMatrix(X, y), num_round, folds=cv, metrics='mae')
print(res)

xgboost.cv takes about 2.25 seconds.
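
My guess is that xgb.cv only trains 5 boosters of 50 rounds each (250 trees in total) and simply records the MAE after every round, which would roughly match the ~25x gap in the timings. Is that the real reason? And if so, is the following sketch the right way to get the same per-round test MAE through the sklearn interface with a single fit per fold? I am not sure it is equivalent, and the place to pass eval_metric seems to differ between xgboost versions (constructor in recent releases, fit() in older ones):

# Sketch: one fit per fold, recording test MAE after every boosting round.
# Assumes a recent xgboost where eval_metric is a constructor argument.
fold_maes = []
for train_idx, test_idx in cv.split(X):
    model = xgb.XGBRegressor(eval_metric='mae', **params)
    model.fit(X[train_idx], y[train_idx],
              eval_set=[(X[test_idx], y[test_idx])],
              verbose=False)
    # MAE on the held-out fold, one value per boosting round
    fold_maes.append(model.evals_result()['validation_0']['mae'])

print('test_mae_mean:\n', np.array(fold_maes).mean(axis=0))

Or am I missing something about how validation_curve is meant to be used here?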