Plotting a multivariate grouped bar chart using a loop


I am trying to create a grouped bar chart with multiple subplots using matplotlib and pandas. I can create it by manually defining each plot according to the values of the dataframe, but I want to automate it with a loop. I have tried many ways of writing the loop, but I run into one error or another every time. Being a beginner in both programming and Python, I am getting lost. Here's my data: sales3 (linked image of the dataframe).

The code I have written to get the expected output:

import matplotlib.pyplot as plt

# `sales` is the raw DataFrame loaded earlier; aggregate sales per Region/Tier
sales3 = sales.groupby(["Region", "Tier"])[["Sales2015", "Sales2016"]].sum().round().astype("int64")
sales3.reset_index(inplace=True)

fig, (ax1, ax2, ax3) = plt.subplots(nrows=1, ncols=3, sharex=True, sharey=True, figsize=(10, 6))

# one subplot per region, each defined manually
sales3[sales3["Region"] == "Central"].plot(kind="bar", x="Tier", y=["Sales2015", "Sales2016"], ax=ax1)
ax1.set_title("Central")
sales3[sales3["Region"] == "East"].plot(kind="bar", x="Tier", y=["Sales2015", "Sales2016"], ax=ax2)
ax2.set_title("East")
sales3[sales3["Region"] == "West"].plot(kind="bar", x="Tier", y=["Sales2015", "Sales2016"], ax=ax3)
ax3.set_title("West")
plt.tight_layout()

Output: expected output (linked image showing the three subplots: Central, East, West).

Please guide me on how to write this with a loop or some other automated approach. Say another region like "North" or "South" is added in the future, or a new Tier is introduced: what would be the best way to write the code so it accommodates such additions?

1 Answer (accepted):

You can iterate through the axes and regions:

sales3 = sales.groupby(["Region","Tier"])[["Sales2015","Sales2016"]].sum().round().astype("int64")
sales3.reset_index(inplace=True)
fig,axes = plt.subplots(nrows=1,ncols=3,sharex=True,sharey=True,figsize=(10,6))

# define regions to plot
regions = ["Central", "East", "West"]

# iterate over regions and axes using zip()
for region, ax in zip(regions,axes):
    sales3[sales3["Region"]==region].plot(kind="bar",x="Tier",y=["Sales2015","Sales2016"],ax=ax)
    ax.set_title(region)
plt.tight_layout()

I think the key is using Python's built-in zip function, which is documented in the standard library docs.
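
To handle regions (or tiers) that may be added later, one option is to derive the region list and the subplot count from the data instead of hard-coding them. This is a minimal sketch, assuming the same sales3 dataframe built above:

# derive regions from the data so new ones are picked up automatically
regions = sales3["Region"].unique()

# squeeze=False keeps `axes` as an array even if there is only one region
fig, axes = plt.subplots(nrows=1, ncols=len(regions),
                         sharex=True, sharey=True,
                         figsize=(10, 6), squeeze=False)

for region, ax in zip(regions, axes.ravel()):
    sales3[sales3["Region"] == region].plot(
        kind="bar", x="Tier", y=["Sales2015", "Sales2016"], ax=ax)
    ax.set_title(region)

plt.tight_layout()

Because the Tier values come straight from the filtered dataframe in each call to plot, new tiers show up on the x-axis without any further changes.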