An external application inserts more than 40K records into a SQL Azure table.
I have a trigger that is supposed to process every inserted row, matching on a unique column value for each distinct record, but:
- Whenever 40K+ records are inserted, the trigger does not process all of the rows; sometimes it picks up only one or two records.
- Inside the trigger, how can I get distinct column values with an ORDER BY?
- Inserting into a temp table captures only some of the rows, seemingly at random.

How can I do batch processing from the trigger for the bulk insert?
    /****** Object:  Trigger [dbo].[PriceStagingInsertTrigger]    Script Date: 29/09/2020 13:46:24 ******/
    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    ALTER TRIGGER [dbo].[PriceStagingInsertTrigger] ON [dbo].[SalesPriceStaging]
    AFTER INSERT
    AS
    DECLARE @ItemNumber NVARCHAR(20),
            @applicablefromdate DATETIME,
            @partycodetype INT
    SELECT @ItemNumber = ins.ITEMNUMBER FROM INSERTED ins;
    SELECT @applicablefromdate = ins.PRICEAPPLICABLEFROMDATE FROM INSERTED ins;
    SELECT @partycodetype = ins.PARTYCODETYPE FROM INSERTED ins;
    SELECT * INTO #temptables FROM INSERTED
    EXEC dbo.spSalesPriceStaging @ItemNumber, @applicablefromdate, @partycodetype
    PRINT 'Stored procedure spSalesPriceStaging executed and completed'
                        
Why this is the (BAD) approach:
Triggers in SQL Server use set-based logic and fire only once per statement, even if that statement inserts millions of rows. Assigning columns from the `INSERTED` pseudo-table into scalar variables therefore captures just one arbitrary row out of the whole batch, which is why only one or two records appear to be processed. The presence of scalar variables in trigger code is generally a sign of a bad design...
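A set-based rewrite would process the whole batch instead of a single arbitrary row. This is only a sketch: `dbo.spSalesPriceStagingBatch` is a hypothetical set-based replacement for the existing procedure, and the cursor fallback is shown only for the case where the row-by-row procedure cannot be changed.

```sql
ALTER TRIGGER [dbo].[PriceStagingInsertTrigger]
ON [dbo].[SalesPriceStaging]
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- INSERTED is a set: capture ALL rows, deduplicated on the key columns.
    SELECT DISTINCT
           ITEMNUMBER,
           PRICEAPPLICABLEFROMDATE,
           PARTYCODETYPE
    INTO   #PriceStagingBatch
    FROM   INSERTED;

    -- Option A (preferred): hand the whole batch to a set-based procedure.
    -- dbo.spSalesPriceStagingBatch is hypothetical; it would read #PriceStagingBatch.
    EXEC dbo.spSalesPriceStagingBatch;

    -- Option B: if the existing one-row procedure cannot change, loop over the
    -- captured keys. Slower, but every inserted row is processed, in order.
    /*
    DECLARE @ItemNumber NVARCHAR(20),
            @applicablefromdate DATETIME,
            @partycodetype INT;
    DECLARE batch_cur CURSOR LOCAL FAST_FORWARD FOR
        SELECT ITEMNUMBER, PRICEAPPLICABLEFROMDATE, PARTYCODETYPE
        FROM #PriceStagingBatch
        ORDER BY ITEMNUMBER;   -- DISTINCT + ORDER BY, as asked in the question
    OPEN batch_cur;
    FETCH NEXT FROM batch_cur INTO @ItemNumber, @applicablefromdate, @partycodetype;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC dbo.spSalesPriceStaging @ItemNumber, @applicablefromdate, @partycodetype;
        FETCH NEXT FROM batch_cur INTO @ItemNumber, @applicablefromdate, @partycodetype;
    END
    CLOSE batch_cur;
    DEALLOCATE batch_cur;
    */
END
```

A cleaner long-term design for 40K-row loads is to skip the trigger entirely: have the loader insert into the staging table and then call one set-based procedure that processes the staging table in a single pass.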