I have a linked service named 'LS_test'. Its name is the same in development and production. It connects a Synapse notebook to the data lake storage account. Because the storage account name differs between development and production, I tried to add a parameter to the linked service:

How can I fill in the StorageAccountName with a different value depending on the environment (development vs. production) the notebook is running in, say 'Storage-dev' vs. 'Storage-prd'?
Maybe with a configuration file of the spark pool?
According to this, parameterized linked services are not supported from PySpark notebooks. If you try to connect to ADLS through such a linked service, you get the following error:
The error message states that "Linked Services using parameters are not supported yet." This could be raised as a feature request for Synapse notebooks. As a workaround, instead of parameterizing one linked service, create a separate storage account linked service per environment, entering the storage account name directly and using managed identity authentication.
Then use the linked service that matches the environment the notebook is running in.
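Alternatively, you can avoid the parameterized linked service entirely and resolve the storage account inside the notebook itself. Below is a minimal sketch of that idea: it maps an environment to a storage account name and builds the ABFSS path from it. The workspace names, account names, and the `-prd` suffix convention are hypothetical assumptions; in a real Synapse notebook the workspace name could come from `mssparkutils.env.getWorkspaceName()`.

```python
# Sketch: pick the storage account per environment inside the notebook,
# since parameterized linked services are not supported from PySpark.
# All names below (accounts, workspaces, suffix convention) are hypothetical.

ACCOUNT_BY_ENV = {
    "dev": "storagedev",   # hypothetical dev storage account
    "prd": "storageprd",   # hypothetical prod storage account
}

def resolve_account(workspace_name: str) -> str:
    """Infer the environment from the Synapse workspace name
    (assumed here to end in '-prd' for production) and return
    the matching storage account name."""
    env = "prd" if workspace_name.endswith("-prd") else "dev"
    return ACCOUNT_BY_ENV[env]

def abfss_path(workspace_name: str, container: str, relative_path: str) -> str:
    """Build an ABFSS URI for the environment's storage account."""
    account = resolve_account(workspace_name)
    return f"abfss://{container}@{account}.dfs.core.windows.net/{relative_path}"

# In a Synapse notebook, replace the literal workspace name with
# mssparkutils.env.getWorkspaceName().
print(abfss_path("syn-ws-prd", "raw", "sales/2024"))
```

With this approach the notebook reads and writes through the resolved ABFSS path directly (with managed identity or workspace credentials handling authentication), so the same notebook code runs unchanged in both environments.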