Question 9: 10 points

In this question, we will create a prediction from boosted trees using:

    f_boost(x0) = sign( sum_{t=1}^{T} alpha_t * f_t(x0) )

Define a function called `predict` that accepts two inputs:

- a 2-d numpy array of x-observations
- a dictionary that contains classifiers and alphas

Your function should combine the models in the manner described in the equation above to create predictions for the observations.

Your function should return a 1-D numpy array of predictions (all 0s and 1s).

def predict(X, est_dict):
    """
    Create a np.array of predictions for all of the observations in X,
    according to the above equation.

    Positional Arguments:
        X -- a 2-d numpy array of X observations. Features in columns,
             observations in rows.
        est_dict -- a dictionary with keys 0 through n and tuples as values.
             The tuples will be (<estimator>, alpha), where alpha is a float
             and <estimator> is a fitted sklearn DecisionTreeClassifier.

    Example:
        ### Our example dataset, inspired by lecture
        pts = [[.5, 3, 1], [1, 2, 1], [3, .5, 0], [2, 3, 0], [3, 4, 1],
               [3.5, 2.5, 0], [3.6, 4.7, 1], [4, 4.2, 1], [4.5, 2, 0], [4.7, 4.5, 0]]
        df = pd.DataFrame(pts, columns=['x', 'y', 'classification'])

        ### Split out X and labels
        X = df[['x', 'y']]
        y = df['classification']

        ### Split data in half
        X1 = X.iloc[:len(X.index)//2, :]
        X2 = X.iloc[len(X.index)//2:, :]
        y1 = y[:len(y)//2]
        y2 = y[len(y)//2:]

        ### Fit classifiers to both sets of data, save to dictionary:

        ### Tree-creator helper function
        def simple_tree():
            return DecisionTreeClassifier(criterion='entropy', max_depth=1)

        tree_dict = {}

        tree1 = simple_tree()
        tree1.fit(X1, y1)
        print("threshold:", tree1.tree_.threshold[0],
              "feature:", tree1.tree_.feature[0])
        ### Made-up alpha, for example
        alpha1 = .6
        tree_dict[1] = (tree1, alpha1)

        tree2 = simple_tree()
        tree2.fit(X2, y2)
        print("threshold:", tree2.tree_.threshold[0],
              "feature:", tree2.tree_.feature[0])
        ### Made-up alpha, again
        alpha2 = .35
        tree_dict[2] = (tree2, alpha2)

        print(predict(X, tree_dict))
        #--> np.array([1., 1., 0., 0., 0., 0., 0., 0., 0., 0.])

        ###############################
        ### For further checking of your function:
        ### The sum of weighted predictions from the two models should be:
        # If tree2 splits on feature 1:
        # np.array([ 0.25  0.25 -0.95 -0.95 -0.25 -0.95 -0.25 -0.25 -0.95 -0.25])
        # If tree2 splits on feature 0:
        # np.array([ 0.95  0.95 -0.25 -0.25 -0.25 -0.25 -0.25 -0.25 -0.95 -0.95])
        ###############################

    Assumptions:
        The models in the `est_dict` tuples will return 0s and 1s.
        HOWEVER, the prediction equation depends upon predictions of -1s and 1s.
        FINALLY, the returned predictions should be 0s and 1s.
    """
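A minimal sketch of one way to implement `predict` under the stated assumptions: each weak learner's 0/1 output is remapped to -1/+1 before weighting, and the sign of the weighted sum is mapped back to 0/1. (How to break a tie when the weighted sum is exactly 0 is not specified by the assignment; this sketch maps it to class 0.)

```python
import numpy as np

def predict(X, est_dict):
    """Combine weighted weak learners into 0/1 boosted predictions."""
    X = np.asarray(X)
    total = np.zeros(X.shape[0])
    for est, alpha in est_dict.values():
        # Each estimator returns 0s and 1s; remap to -1/+1 for the sum.
        signed = est.predict(X) * 2 - 1
        total += alpha * signed
    # sign of the weighted sum, then map -1/+1 back to 0/1
    # (a sum of exactly 0 falls to class 0 here -- an assumption).
    return (total > 0).astype(float)
```

The dictionary keys are never used, only the `(estimator, alpha)` values, so the same function works whether the keys start at 0 or 1 as in the example.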