ValueError: Resulting object does not have monotonic global indexes along dimension lat when combining netcdf files

I posted this previous question and found a way to make (lat, lon, time) my dimensions. However, once I started processing the result, I began getting the ValueError: Resulting object does not have monotonic global indexes along dimension lat. I double-checked the longitude and latitude values with .sel() and they indexed the right values. I have multiple daily files of this dataset and plan to loop over them, running this code for each file (a rough loop sketch follows the code below). Here's the code I used.

import xarray as xr

all_alt = xr.open_dataset('dataset.nc')
ssha = all_alt.ssha
ssha = ssha.set_index(time=('lon','lat')) # build a (lon, lat) MultiIndex along time
rechunked = ssha.chunk(chunks={'time':300}) # divided into chunks for faster processing
unstack = rechunked.unstack('time') # unstack the MultiIndex into separate lon and lat dimensions
unstack = unstack.to_dataset()

time1 = all_alt.resample(time='1D').mean() #Get the daily value of time and assign that to the whole dataset
timeval = time1['time'].values

unstack = unstack.expand_dims(time=timeval) #to add back the time dimension
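
For context, this is roughly how I plan to loop over the daily files and save each restructured dataset; the glob pattern and output names are just placeholders, not my actual paths.

import glob
import xarray as xr

# Placeholder pattern: each file is one daily altimetry dataset
for path in sorted(glob.glob('daily_files/*.nc')):
    all_alt = xr.open_dataset(path)
    ssha = all_alt.ssha.set_index(time=('lon', 'lat'))  # (lon, lat) MultiIndex along time
    unstack = ssha.chunk(chunks={'time': 300}).unstack('time').to_dataset()

    timeval = all_alt.resample(time='1D').mean()['time'].values
    unstack = unstack.expand_dims(time=timeval)  # add the daily time dimension back

    # Save each restructured daily file so they can be combined afterwards
    unstack.to_netcdf(path.replace('.nc', '_gridded.nc'))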

This results in:

<xarray.Dataset>
Dimensions:  (lon: 56, lat: 56, time: 1)
Coordinates:
  * time     (time) datetime64[ns] 2002-01-23
  * lon      (lon) float64 121.7 121.7 121.7 121.7 ... 122.8 122.8 122.8 122.8
  * lat      (lat) float64 13.03 13.08 13.13 13.18 ... 15.81 15.86 15.91 15.96
Data variables:
    ssha     (time, lon, lat) float32 dask.array<chunksize=(1, 56, 56), meta=np.ndarray>

And this is exactly the dataset I want. However, when I try to combine these files with xr.open_mfdataset(combine='by_coords'), I get the ValueError: Resulting object does not have monotonic global indexes along dimension lat. Is there something wrong with my code? I've searched around Stack Overflow but can't find a solution.
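
For reference, here's a rough sketch of how I check each restructured file and then combine them; the file pattern is a placeholder carried over from the loop sketch above, not my actual paths.

import glob
import xarray as xr

files = sorted(glob.glob('daily_files/*_gridded.nc'))  # placeholder pattern

# Check that lat and lon are monotonically increasing in every file
for path in files:
    ds = xr.open_dataset(path)
    print(path,
          ds.indexes['lat'].is_monotonic_increasing,
          ds.indexes['lon'].is_monotonic_increasing)

combined = xr.open_mfdataset(files, combine='by_coords')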

Here's the link to one of my data files.
