Python programmer here. I know just the basics of C, but I was given the task of reading n entries into an array, then writing a function that takes a pointer to the array as its argument and accesses the array's elements through that pointer.
A simplified version of my initial code was as follows:
#include <stdio.h>

void func(int *pArr, int size){
    for (int j = 0; j < size; j++){
        printf("%d ", *(pArr + j));
    }
}

int main(){
    int n;
    printf("Enter the number of elements you want in your list: ");
    scanf("%d", &n);
    int a[n];
    for (int i = 1; i <= n; i++){
        printf("Enter element no.%d: ", i);
        scanf("%d", &a[i]);
    }
    func(&a[0], n);
    return 0;
}
Here my thought process was: since pArr is pointing at the 0th index of a, *(pArr+j) when j = 0 should just reduce to *pArr, which should be the element at index 0 of a.
But to my surprise, the following is the output I got for a test run where I entered the elements {1, 3, 9, 8, 2}:
863014944 1 3 8 9
The program prints out a garbage value for *(pArr+j) when j = 0 but works fine after that. So I modified the code by initializing j with a value of 1 instead of 0 and changing the "<" to "<=", and surprisingly the program ran perfectly fine after those changes.
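That is, the loop in func became:

    for (int j = 1; j <= size; j++){
        printf("%d ", *(pArr + j));
    }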
This is what puzzles me.
If I'm declaring a pointer pArr pointing at the 0th index of an array a, i.e., pArr = &a[0], then shouldn't printing *(pArr+1) give me the element at index 1? Why does my program print my first entered value there instead? And why is *(pArr+0) not equal to a[0] when pArr is pointing at a[0]?
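For reference, here is a minimal sketch of what I expected to happen, using a hard-coded (and therefore fully initialized) array:

    #include <stdio.h>

    int main(){
        int a[5] = {1, 3, 9, 8, 2};  /* fully initialized, unlike my real code */
        int *pArr = &a[0];
        /* pointer arithmetic: *(pArr + j) should be the same as a[j] */
        printf("%d %d\n", *(pArr + 0), a[0]);  /* expected: 1 1 */
        printf("%d %d\n", *(pArr + 1), a[1]);  /* expected: 3 3 */
        return 0;
    }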
func is correct, but you didn't populate a correctly.

Think of array indexes as offsets, which is to say how far they are from the start of the array. The first element is at offset 0, the second at offset 1, and so on; the last valid offset in an array of n elements is n-1.

So this is what you are building for n = 5 and the inputs 1, 3, 9, 8, 2:

    a[0] = ?   (never written: this is your garbage value)
    a[1] = 1
    a[2] = 3
    a[3] = 9
    a[4] = 8
    a[5] = 2   (out of bounds: a only has elements a[0] through a[4])

Because your input loop starts at index 1, a[0] is never initialized, and the final iteration writes one element past the end of the array, which is undefined behavior. Replace

    for (int i = 1; i <= n; i++){

with

    for (int i = 0; i < n; i++){

and print i + 1 in the prompt if you want the prompts to stay 1-based.
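For completeness, a corrected version of the whole program might look like this (the only changes from your code are the loop bounds and the i + 1 in the prompt):

    #include <stdio.h>

    void func(int *pArr, int size){
        for (int j = 0; j < size; j++){
            printf("%d ", *(pArr + j));
        }
    }

    int main(){
        int n;
        printf("Enter the number of elements you want in your list: ");
        scanf("%d", &n);
        int a[n];
        /* offsets run 0 .. n-1; the prompt stays 1-based for the user */
        for (int i = 0; i < n; i++){
            printf("Enter element no.%d: ", i + 1);
            scanf("%d", &a[i]);
        }
        func(&a[0], n);
        return 0;
    }

With n = 5 and the inputs 1 3 9 8 2, this prints "1 3 9 8 2" as you expected.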