I need to minimize a loss based on two loss variables, L1 and L2, so my objective function is

z = a*L1 + (1 - a)*L2

subject to the constraint

L2 - L1 <= G1
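To make the setup concrete, here is a minimal row-wise sketch of the objective and constraint with made-up values for L1, L2, G1, and a (the real data can't be shared):

```python
import numpy as np

# Hypothetical values for three rows (placeholders, not the real data)
L1 = np.array([0.2, 0.5, 0.1])
L2 = np.array([0.3, 0.4, 0.6])
G1 = np.array([0.5, 0.5, 0.5])
a  = np.array([0.7, 0.2, 0.9])

z = a * L1 + (1 - a) * L2      # row-wise objective values
feasible = (L2 - L1) <= G1     # row-wise constraint check
print(z.sum())
print(feasible)
```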

I have around 1 million rows of records in CSV format. Each row contains one value each of L1, L2, G1, and a. Given the data size, I decided to use trust-region interior-point optimization (`method='trust-constr'`). To start, I took a small subset of 12 rows from my data set and wrote the following code.

import numpy as np
import pandas as pd

from scipy.optimize import LinearConstraint, minimize, SR1

df=pd.read_csv('subset.csv')

linear_constraint = LinearConstraint(
    [[-1, 1]] * 12,
    [-np.inf] * 12,
    [0.0313, 0.7153, 0.1341, 0.338, 0, 0.0182,
     0.2188, 0.3234, 0.5558, 0, 0.1382, 0.7191])

L1=df.iloc[:,19].values
L2=df.iloc[:,18].values
x=np.concatenate([L1,L2])
a=df.iloc[:,17].values

def loss(x):
    return sum(a*x[0:12]+(1-a)*x[12:24])

def jacobian_const(x):
    # One row per constraint: -1 in column l (for L1_l) and
    # +1 in column l + N/2 (for L2_l), i.e. the gradient of L2_l - L1_l.
    N = len(x)
    No2 = N // 2

    def row_i(l, N):
        row = [0.0] * N
        row[l] = -1
        row[l + N // 2] = 1
        return row

    return [row_i(l, N) for l in range(No2)]

res = minimize(loss, x, method='trust-constr', jac=jacobian_const, hess=SR1(),
               constraints=[linear_constraint],
               options={'verbose': 1})

After running this code I get the following error message:

ValueError: shapes (12,2) and (24,) not aligned: 2 (dim 1) != 24 (dim 0)

Can you suggest what changes I need to make to eliminate this error? Sorry, I can't share the data; it's client data.
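For reference, the shapes in the error trace back to the constraint matrix: `LinearConstraint` expects one column per optimization variable, i.e. shape `(12, 24)` for a 24-element x, while the code above builds a `(12, 2)` matrix. A minimal sketch of a correctly shaped constraint, using placeholder G1 values since the real data can't be shared:

```python
import numpy as np
from scipy.optimize import LinearConstraint

n = 12                                  # rows in the subset
# x = [L1_1..L1_n, L2_1..L2_n] has 2*n = 24 entries, so each constraint
# row needs 24 columns: -1 at column i (L1_i) and +1 at column i + n
# (L2_i), encoding L2_i - L1_i <= G1_i.
A = np.hstack([-np.eye(n), np.eye(n)])  # shape (12, 24), not (12, 2)

G1 = np.full(n, 0.5)                    # placeholder upper bounds
linear_constraint = LinearConstraint(A, -np.inf, G1)

x0 = np.zeros(2 * n)
print(A.shape)          # (12, 24)
print((A @ x0).shape)   # (12,) -- the dot product is now aligned
```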