RuntimeError while transfer learning a model using learn.fit_one_cycle(32)

I am getting a RuntimeError while transfer learning a model in PyCharm, on the line learn.fit_one_cycle(32). The script also runs twice on its own: first it executes up to the last line, then everything executes again (the output is shown twice), and only then does this runtime error occur. Afterwards the script keeps running in the PyCharm IDE, i.e. I have to stop it explicitly; no further output is shown, but the IDE reports it is still running.

This is my first time with Python programming and model training. I have run the same code on Google Colab and it shows no error, but since transfer learning takes a lot of time and Colab disconnects, I switched to PyCharm, which is also new to me. I pasted the same code into PyCharm and I am executing it as a plain script, without creating any class. All of the code runs fine until the last line, learn.fit_one_cycle(32). I am using fastai and PyTorch and transfer learning EfficientNet-B5, basically trying to learn by running code I found on GitHub for the same task.

I have not created any class or __init__/main, if that helps. I also added print statements in between to check the flow; since I am new to PyCharm, Python and deep learning, I did not go into the debugger (I am aware of it, just too short on time). I did try adding if __name__ == '__main__', but then the code does not run after learn.fit_one_cycle, and if I remove it, I get the runtime error pasted below.
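
For reference, this is roughly what the idiom from the error message looks like when applied to a script like mine (a minimal sketch, not my actual file; the setup from the CODE section below would go inside main()):

from multiprocessing import freeze_support

def main():
    # ...data, transform and learner setup from the CODE section goes here...
    learn.fit_one_cycle(32)

if __name__ == '__main__':
    freeze_support()  # can be omitted if the script is never frozen into an executable
    main()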

CODE

model_name = 'efficientnet-b5'
approach_name = 'preproc_thres'
use_external_data = True
use_weighted_loss = False
filename = "{}_{}".format(model_name, approach_name)
from fastai.vision import *
from fastai.vision.learner import model_meta
from fastai.metrics import error_rate

import pandas as pd
import seaborn as sns

batch_size = 4

isic_path = '/ISIC_2019_Training_Input'
def get_isic_df():
    df = pd.read_csv('D:/ISIC_2019_Groundtruth_copy2.csv')
    path_img = 'ISIC_2019_Training_Input'

    for label in df.columns[1:]:
        df.loc[df[label] == 1.0, 'label'] = label

    df.rename(columns={'image': 'name'}, inplace=True)
    df['name'] = df['name'].apply(lambda x: "{}/{}.jpg".format(isic_path,x))
    df = df[['name', 'label']]
    return df

data_df = get_isic_df()

print(data_df)

xtra_tfms = (cutout(n_holes=(1,1), length=(16,16), p=.5))
tfms = get_transforms(max_rotate=45,
                      p_affine=0.5,
                      p_lighting=0.5,
                      do_flip=True,
                      flip_vert=True,
                      max_zoom=1.05,
                      max_warp=None,
                      max_lighting=0.2,
                      )

data3 = ImageDataBunch.from_df(path='D:/ISIC_2019_Training_Input', df=get_isic_df(), ds_tfms=tfms, size=456,
                              resize_method=ResizeMethod.PAD, bs=batch_size,
                              valid_pct=0.1)

data3.show_batch()
print(len(data3.classes))
from efficientnet_pytorch import EfficientNet
def efficient_net_b5(pretrained=True):
    model = EfficientNet.from_pretrained('efficientnet-b5')
    return nn.Sequential(model)

from fastai.vision.learner import model_meta
model_meta[efficient_net_b5] =  { 'cut': noop,
                               'split': lambda m: (list(m[0][0].children())[2][19], m[1]) }

output_size = 1000
custom_head = nn.Linear(output_size, data3.c)
learn = cnn_learner(data3, efficient_net_b5, custom_head = custom_head)

data_df['label'].value_counts()

total_images = data_df.shape[0]
print(total_images)

print(data3.classes)

weights_loss = []


for c in data3.classes:
    samples = data_df['label'].value_counts()[c]
    weights_loss.append(1 / (samples / total_images))

normalized_weights = [x / sum(weights_loss) for x in weights_loss]
print(normalized_weights)
if use_weighted_loss:
    from torch import nn

    class_weights = torch.FloatTensor(normalized_weights).cuda()
    learn.crit = nn.CrossEntropyLoss(weight=class_weights)

print('ok')

learn.fit_one_cycle(32)
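
In case it matters, the DataBunch above is created with fastai's default number of DataLoader workers. Below is a minimal sketch of the same call forced to single-process data loading (assuming ImageDataBunch.from_df forwards num_workers to the underlying PyTorch DataLoader; I have not confirmed whether this sidesteps the error):

data3 = ImageDataBunch.from_df(path='D:/ISIC_2019_Training_Input', df=get_isic_df(),
                               ds_tfms=tfms, size=456,
                               resize_method=ResizeMethod.PAD, bs=batch_size,
                               valid_pct=0.1,
                               num_workers=0)  # no DataLoader worker processes are spawned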

ERROR

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.10_3.10.3056.0_x64__qbz5n2kfra8p0\lib\multiprocessing\spawn.py", line 116, in spawn_main
    exitcode = _main(fd, parent_sentinel)
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.10_3.10.3056.0_x64__qbz5n2kfra8p0\lib\multiprocessing\spawn.py", line 125, in _main
    prepare(preparation_data)
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.10_3.10.3056.0_x64__qbz5n2kfra8p0\lib\multiprocessing\spawn.py", line 236, in prepare
    _fixup_main_from_path(data['init_main_from_path'])
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.10_3.10.3056.0_x64__qbz5n2kfra8p0\lib\multiprocessing\spawn.py", line 287, in _fixup_main_from_path
    main_content = runpy.run_path(main_path,
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.10_3.10.3056.0_x64__qbz5n2kfra8p0\lib\runpy.py", line 289, in run_path
    return _run_module_code(code, init_globals, run_name,
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.10_3.10.3056.0_x64__qbz5n2kfra8p0\lib\runpy.py", line 96, in _run_module_code
    _run_code(code, mod_globals, init_globals,
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.10_3.10.3056.0_x64__qbz5n2kfra8p0\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "C:\Users\PycharmProjects\pythonProject\Sample1.py", line 102, in <module>
    learn.fit_one_cycle(32)
  File "C:\Users\anuja\PycharmProjects\pythonProject\venv\lib\site-packages\fastai\train.py", line 23, in fit_one_cycle
    learn.fit(cyc_len, max_lr, wd=wd, callbacks=callbacks)
  File "C:\Users\PycharmProjects\pythonProject\venv\lib\site-packages\fastai\basic_train.py", line 200, in fit
    fit(epochs, self, metrics=self.metrics, callbacks=self.callbacks+callbacks)
  File "C:\Users\PycharmProjects\pythonProject\venv\lib\site-packages\fastai\basic_train.py", line 99, in fit
    for xb,yb in progress_bar(learn.data.train_dl, parent=pbar):
  File "C:\Users\PycharmProjects\pythonProject\venv\lib\site-packages\fastprogress\fastprogress.py", line 50, in __iter__
    raise e
  File "C:\Users\PycharmProjects\pythonProject\venv\lib\site-packages\fastprogress\fastprogress.py", line 41, in __iter__
    for i,o in enumerate(self.gen):
  File "C:\Users\PycharmProjects\pythonProject\venv\lib\site-packages\fastai\basic_data.py", line 75, in __iter__
    for b in self.dl: yield self.proc_batch(b)
  File "C:\Users\PycharmProjects\pythonProject\venv\lib\site-packages\torch\utils\data\dataloader.py", line 441, in __iter__
    return self._get_iterator()
  File "C:\Users\PycharmProjects\pythonProject\venv\lib\site-packages\torch\utils\data\dataloader.py", line 388, in _get_iterator
    return _MultiProcessingDataLoaderIter(self)
  File "C:\Users\PycharmProjects\pythonProject\venv\lib\site-packages\torch\utils\data\dataloader.py", line 1042, in __init__
    w.start()
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.10_3.10.3056.0_x64__qbz5n2kfra8p0\lib\multiprocessing\process.py", line 121, in start
    self._popen = self._Popen(self)
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.10_3.10.3056.0_x64__qbz5n2kfra8p0\lib\multiprocessing\context.py", line 224, in _Popen
    return _default_context.get_context().Process._Popen(process_obj)
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.10_3.10.3056.0_x64__qbz5n2kfra8p0\lib\multiprocessing\context.py", line 336, in _Popen
    return Popen(process_obj)
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.10_3.10.3056.0_x64__qbz5n2kfra8p0\lib\multiprocessing\popen_spawn_win32.py", line 45, in __init__
    prep_data = spawn.get_preparation_data(process_obj._name)
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.10_3.10.3056.0_x64__qbz5n2kfra8p0\lib\multiprocessing\spawn.py", line 154, in get_preparation_data
    _check_not_importing_main()
  File "C:\Program Files\WindowsApps\PythonSoftwareFoundation.Python.3.10_3.10.3056.0_x64__qbz5n2kfra8p0\lib\multiprocessing\spawn.py", line 134, in _check_not_importing_main
    raise RuntimeError('''
RuntimeError: 
        An attempt has been made to start a new process before the
        current process has finished its bootstrapping phase.

        This probably means that you are not using fork to start your
        child processes and you have forgotten to use the proper idiom
        in the main module:

            if __name__ == '__main__':
                freeze_support()
                ...

        The "freeze_support()" line can be omitted if the program
        is not going to be frozen to produce an executable.