Let me preface with: I am in no way an IT professional, so kid gloves and condescension are very much appreciated. I use Azure Backup to back up the file system of my server, a document database, and a Paradox database. All in all, we are
talking a little over a terabyte of data, yet according to the backup agent I have 3.5 terabytes of backup data (again, all of my data in total, including items that aren't backed up, comes to 1.1 TB). My current daily backup has been running since yesterday, is almost a gigabyte in, and
is still crunching. No one has even touched the server, and yet my file system supposedly has 1 GB of new data since yesterday? No updates have been installed, and no one has even logged onto it. I lack the understanding to see how this is happening.
I have a small business server that I would just like to be able to restore after a cataclysmic crash, but that luxury is costing me $230 a month. Am I doing something wrong in my retention policy that is creating multiple copies of the same
data? Any thoughts from the community on how this is happening? If it is a retention issue, is there a way to trim my backup data down to just the last 10-30 days or so? The cost of this service and its rather poor design are really pushing
me towards just switching to a consumer-grade backup solution, as this feels like a very poor alternative to any of those I have used in the past.
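
For what it's worth, the only knob I've found so far is the PowerShell module that installs with the MARS agent (MSOnlineBackup). Going by the docs, something like the below looks like it would cap retention at a flat 30 days, but I haven't dared run it, so please correct me if I'm reading the cmdlets wrong:

    # Load the module that ships with the Azure Backup (MARS) agent
    Import-Module MSOnlineBackup

    # Grab the policy the agent is currently using
    $policy = Get-OBPolicy

    # Build a simple 30-day retention policy and swap it into the existing policy
    $retention = New-OBRetentionPolicy -RetentionDays 30
    Set-OBRetentionPolicy -Policy $policy -RetentionPolicy $retention

    # Save the modified policy back to the agent
    Set-OBPolicy -Policy $policy

My understanding is that already-expired recovery points only age out on that schedule going forward, so even if this is right, would it actually shrink the 3.5 TB that's already sitting in the vault?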