Luegisdorf
2/10/2006 7:31:00 AM
That means every job has its own transaction, but memory usage still grows
after a couple of jobs? Sounds very mysterious; did you use the standard
'batch' function in Axapta to execute your import job, or have you built your
own 'batch holder' (I guess you used the standard batch functionality)?
Another question I'm interested in: does only the client that is executing the
batch jobs crash, or does the AOS service itself really abort? I ask because
you wrote about a 'batch server'.
All in all it sounds very strange. I suggest writing a log file while
executing your jobs, recording the current memory usage (maybe before and
after every job), to find out whether the memory is really kept high over
several jobs (as you suggest) or whether just one job causes the trouble
(maybe a very large file with more than 10,000 orders).
If only one job is the problem, there are some nice ideas to solve it
(I'll post them if you can confirm that this is the problem ...).
Let me know if you can get some analysis of the memory usage over several jobs.
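As a sketch of what I mean (X++; note that currentMemUsageKb() is a
HYPOTHETICAL helper, since I don't know a standard kernel function for this
off-hand - the Windows Performance Monitor works for watching the process too):

```xpp
// Sketch only: log the memory usage before and after every job.
// currentMemUsageKb() is a HYPOTHETICAL helper (e.g. a wrapper around
// the Win32 GetProcessMemoryInfo API) - use whatever counter you have.
static void logMemoryPerJob(Args _args)
{
    AsciiIo logFile;
    int     jobNo;
    ;
    logFile = new AsciiIo('c:\\temp\\batchmem.log', 'A');   // append mode

    for (jobNo = 1; jobNo <= 10; jobNo++)                   // your job loop
    {
        logFile.write(strfmt('Job %1 before: %2 KB', jobNo, currentMemUsageKb()));

        // ... run the import job here ...

        logFile.write(strfmt('Job %1 after:  %2 KB', jobNo, currentMemUsageKb()));
    }
}
```

If the "after" value keeps climbing from job to job, the memory really is kept
high over several jobs; if it jumps only once, one particular job is the culprit.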
See you later ..
Patrick
"Kenny" wrote:
> Hi luegisdorf,
>
> I don't use an .orig() statement in my code.
> Every time a batch job is run, there is a ttsBegin and a ttsCommit. So there
> are a lot of transactions, but it is not the DB but the AOS which crashes :|
>
> greets,
>
>
>
> "Luegisdorf" wrote:
>
> > Hi Kenny
> >
> > The only Axapta memory leak I know of is when you use a command like "table =
> > otherTable.orig()" and 'otherTable' wasn't selected before (see the
> > discussion with the subject "This.orig gives out of memory" for more
> > information).
> >
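> > To make the pattern concrete, a little sketch (CustTable is just an
> > example table, not from your code):
> >
> > ```xpp
> > CustTable custTable;    // note: never selected
> > CustTable copy;
> > ;
> > // Risky: orig() on an unselected buffer is the reported leak trigger:
> > //     copy.data(custTable.orig());
> >
> > // Safe: select first, then take the original record image.
> > select firstonly custTable;
> > if (custTable.RecId)
> > {
> >     copy.data(custTable.orig());
> > }
> > ```
> >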
> > The other possible problem (though this concerns the SQL Server and not
> > Axapta itself): did you execute all jobs bundled in one transaction (tts),
> > or is every job a single transaction? The more update, delete, and insert
> > commands a transaction contains, the more resources the DB server needs.
> >
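> > As a sketch, the two ways to scope the transactions look like this:
> >
> > ```xpp
> > static void ttsScopeSketch(Args _args)
> > {
> >     int i;
> >     ;
> >     // Variant A: all jobs bundled in ONE transaction.
> >     // The DB server must hold locks and log space for everything
> >     // until the final ttsCommit.
> >     ttsBegin;
> >     for (i = 1; i <= 1000; i++)
> >     {
> >         // ... insert/update the orders of job i ...
> >     }
> >     ttsCommit;
> >
> >     // Variant B: one small transaction per job.
> >     // Resources on the DB server are released after every commit.
> >     for (i = 1; i <= 1000; i++)
> >     {
> >         ttsBegin;
> >         // ... insert/update the orders of job i ...
> >         ttsCommit;
> >     }
> > }
> > ```
> >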
> > Maybe that helps a little bit.
> >
> > Best regards
> > Patrick
> >
> > "Kenny" wrote:
> >
> > > Hi all,
> > >
> > > I have a sales importing process and it basically does the following:
> > >
> > > 1 - Read a file that contains sales orders as CSV values.
> > > 2 - Per line, it creates a sales order.
> > > 3 - Per sales order, a 'finish' process is run (some calculations), but this
> > > is queued as a batch job in the day batch (processed by the day batch server).
> > >
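> > > For illustration (the file name, column layout, and field mapping are
> > > invented, not my real code), steps 1 and 2 look roughly like this in X++:
> > >
> > > ```xpp
> > > static void importSalesCsv(Args _args)
> > > {
> > >     CommaIo    file;
> > >     container  line;
> > >     SalesTable salesTable;
> > >     ;
> > >     file = new CommaIo('c:\\import\\orders.csv', 'R');  // step 1: open CSV
> > >
> > >     while (file.status() == IO_Status::Ok)
> > >     {
> > >         line = file.read();                             // one CSV line
> > >         if (conLen(line) == 0)
> > >             break;
> > >
> > >         ttsBegin;                                       // step 2: one order per line
> > >         salesTable.clear();
> > >         salesTable.initValue();
> > >         salesTable.CustAccount = conPeek(line, 1);      // invented column layout
> > >         salesTable.insert();
> > >         ttsCommit;
> > >
> > >         // step 3 queues the 'finish' calculation as a batch job here
> > >         // (e.g. via a RunBaseBatch subclass) instead of running it inline.
> > >     }
> > > }
> > > ```
> > >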
> > > So they start reading files with, for example, 1000 orders in them, and
> > > everything runs fine while creating the orders and the batch jobs. From
> > > that point, the batch server processes all these jobs.
> > >
> > > The problem now is that the batch server runs out of memory after a couple
> > > of hundred jobs. I don't understand this, because after each job the
> > > resources are freed, right??
> > >
> > > if anyone has an idea, I would very much like to hear about it :)
> > >
> > > thx in advance,
> > > greets,
> > > Kenny
> > >
> > >