I would like to run Python code in Google Colab that loops through each tile and extracts the reflectance values from the points within each tile to build an SVM model. Once the model is trained using all the points within each tile, the code should classify each tile one by one to create a classified map of my area of interest.
Code I have so far:
# Import the necessary libraries.
!pip install geemap
import ee
import geemap
from google.colab import drive

# Authenticate and initialize Earth Engine, then mount Drive.
ee.Authenticate()
ee.Initialize()
drive.mount('/content/drive')
# Load the shapefile representing the AOI
shapefile_path = '/content/drive/My Drive/CF/_Boarder.shp'
geometry = geemap.shp_to_ee(shapefile_path)
# Define the start and end dates for your image collection.
start_date = '2023-11-14'
end_date = '2023-11-27'
# Define the filter for cloud cover.
cloud_filter = ee.Filter.lt('CLOUD_COVERAGE_ASSESSMENT', 50)
# Load the Sentinel 2 imagery collection.
s2_collection = (ee.ImageCollection('COPERNICUS/S2_SR')
                 .filterDate(start_date, end_date)
                 .filterBounds(geometry)
                 .filter(cloud_filter)
                 .sort('system:time_start', True))
# Define the path to the shapefile
Sample_points_shapefile_path = '/content/drive/My Drive/CF/Raw_Data_23022024.shp'
# Load the shapefile as a GeoDataFrame
gdf = geemap.shp_to_gdf(Sample_points_shapefile_path)
# Convert the GeoDataFrame to an ee.FeatureCollection
sample_FC = geemap.gdf_to_ee(gdf)
# Visualize the first few rows of the GeoDataFrame
print(gdf.head())
Random_ID IAP_Yes_or Fynbos_typ Alien_taxa \
0 774 Yes Not fynbos Gum
1 554 Yes Scrub Fynbos Pine
2 135 No Scrub Fynbos None
3 729 Yes Scrub Fynbos Pine
4 771 Yes Not fynbos Pine
LULC__Fynb Latitude Longitude geometry
0 Gum -34.595878 19.568680 POINT (19.56868 -34.59588)
1 Pine -34.587340 19.552233 POINT (19.55223 -34.58734)
2 High Density Fynbos -34.587086 19.548819 POINT (19.54882 -34.58709)
3 Pine -34.585369 19.528607 POINT (19.52861 -34.58537)
4 Pine -34.593895 19.519623 POINT (19.51962 -34.59389)
**I want to use the field `LULC__Fynb` as the classes for the classification.**
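For the class labels, one straightforward option is to build an integer encoding from the unique values of `LULC__Fynb`. A minimal sketch (using a toy DataFrame in place of the real shapefile data):

```python
import pandas as pd

# Toy stand-in for the real GeoDataFrame loaded from the shapefile.
gdf = pd.DataFrame({'LULC__Fynb': ['Gum', 'Pine', 'High Density Fynbos', 'Pine', 'Pine']})

# Map each unique class name to a stable integer code (sorted for reproducibility).
label_mapping = {label: i for i, label in enumerate(sorted(gdf['LULC__Fynb'].unique()))}
labels_numeric = gdf['LULC__Fynb'].map(label_mapping)

print(label_mapping)            # {'Gum': 0, 'High Density Fynbos': 1, 'Pine': 2}
print(labels_numeric.tolist())  # [0, 2, 1, 2, 2]
```

Sorting the unique values before enumerating means the codes stay the same across runs, which matters when the model is trained and applied in separate sessions.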
# Get a list of unique Sentinel-2 tiles intersecting with your AOI
tile_collection = s2_collection.aggregate_array('MGRS_TILE').distinct().getInfo()
# Print the list of tiles
print("Sentinel-2 Tiles intersecting with the AOI:")
print(tile_collection)
Sentinel-2 Tiles intersecting with the AOI:
['34JCL', '34JBL', '33JYF', '34HCK', '34HBK', '33HYE', '34HBJ', '33HYD', '34HBH', '33HYC', '35HMD', '35HLD', '35HKD', '34HGJ', '35HMC', '34HFJ', '35HLC', '35HKC', '34HGH', '34HFH', '34HEJ', '34HDJ', '34HEH', '34HCG', '35HND', '35HNC', '34HDH', '34HEG', '34HDG', '34HCJ', '34HCH', '34HBG']
**Something along the lines of:**
# Define a function to extract reflectance values from the points within a tile
def extract_values(tile, points, collection):
    # Filter the collection to images from this tile and take the first one
    tile_images = collection.filter(ee.Filter.eq('MGRS_TILE', tile))
    image = ee.Image(tile_images.first())

    # Set each band value as a property on the point feature
    def extract_point_values(feature):
        return feature.set(image.reduceRegion(
            reducer=ee.Reducer.first(),
            geometry=feature.geometry(),
            scale=10))

    return points.map(extract_point_values)
# Map class labels (LULC__Fynb) to integer codes once, up front
label_mapping = {label: i for i, label in enumerate(sorted(gdf['LULC__Fynb'].unique()))}

# Loop through each tile
for tile in tile_collection:
    print("Processing tile:", tile)
    # Extract reflectance values for points within the tile
    points_with_reflectance = extract_values(tile, sample_FC, s2_collection)
    # Convert the feature collection to a pandas DataFrame
    points_df = geemap.ee_to_df(points_with_reflectance)
    # Convert labels to numerical values
    points_df['label'] = points_df['LULC__Fynb'].map(label_mapping)
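To go from the extracted values to a trained classifier, one option is scikit-learn's `SVC` (pre-installed in Colab). The sketch below is hedged: it uses synthetic reflectance values in place of the real extracted DataFrames, with four hypothetical band columns; with the real data you would stack the per-tile DataFrames, fit once, then call `predict` on each tile's pixels.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic stand-in for the stacked per-tile training data:
# four reflectance bands and an integer class label (0, 1, 2).
n = 300
X = rng.normal(loc=[[0.1, 0.2, 0.15, 0.4]], scale=0.05, size=(n, 4))
y = rng.integers(0, 3, size=n)
X += y[:, None] * 0.1  # shift each class so the toy model has something to learn

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Train the SVM on all training points, then evaluate on held-out samples.
svm = SVC(kernel='rbf', C=10, gamma='scale')
svm.fit(X_train, y_train)
print("Held-out accuracy:", svm.score(X_test, y_test))
```

For the classified map itself, each tile's pixel array (e.g. exported with `geemap.ee_to_numpy` and reshaped to `(n_pixels, n_bands)`) would be passed to `svm.predict` and reshaped back to the tile grid. Alternatively, the whole workflow can stay server-side with `ee.Classifier.libsvm` trained on `image.sampleRegions(...)`, which avoids downloading pixels at all.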