Increase SQL Server Database Size


Hi Team

We need to increase the SQL Server database size. The process seems simple, but I just want to make sure I am following the correct procedure.

Can I do this on a live server?

Do I need to follow any other prerequisites besides taking a backup of the database?

Thanks

Sazzad


21 Replies

Alec6638 (Microsoft SQL Server expert)
May 28, 2020 at 20:04 UTC

Yes, you can do this on a live SQL server with no downtime. Depending on the existing size and the new size, there may be a period during which SQL updates are blocked.

However, you should look at your existing settings to see why you need to increase the database size.
Also check the free space on the drive where you have the database files.  Taking a backup is always a good idea before making database changes.

Can you give us a screencap of your database properties showing the current 'Size' and 'Autogrowth/Maxsize' values?
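
(If it is easier to paste text than a screenshot, roughly the same 'Size' and 'Autogrowth/Maxsize' values can be pulled with a short T-SQL query; this is only a generic sketch, run in the context of the database in question, and is not specific to Epicor.)

    -- File sizes and growth settings for the current database.
    -- size, growth and max_size are stored in 8 KB pages, so divide by 128 for MB.
    SELECT name,
           type_desc,
           size / 128.0 AS size_mb,
           CASE WHEN is_percent_growth = 1
                THEN CAST(growth AS varchar(10)) + ' %'
                ELSE CAST(growth / 128 AS varchar(10)) + ' MB'
           END AS autogrowth,
           CASE WHEN max_size = -1 THEN 'Unlimited'
                ELSE CAST(max_size / 128 AS varchar(20)) + ' MB'
           END AS max_size
    FROM sys.database_files;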

Alec6638 (Microsoft SQL Server expert)
May 28, 2020 at 20:18 UTC

Can you show the properties using SSMS (SQL Server Management Studio)? Something like the following:
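
(For anyone following along without SSMS open, a rough T-SQL equivalent of the numbers on that General properties page, total size and unallocated/free space, is sketched below; the database name is a placeholder.)

    -- Total size and unallocated (free) space, roughly what the SSMS
    -- General properties page reports. Database name is a placeholder.
    USE YourDatabase;
    EXEC sp_spaceused;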


Sazzad_Jca

Here it is:


Alec6638 (Microsoft SQL Server expert)
May 28, 2020 at 20:27 UTC

Your databases are set to autogrow by 10% when needed.
Why do you think you need to increase the size of your databases?

Sazzad_Jca

Recently we faced an OutOfMemory issue, but we later discovered it was not caused by memory or disk space. It was caused by the GL transaction logs, and it was resolved after we disabled the logs. Now we are wondering whether we should increase the database size to avoid any issues in the future. Is there a standard guideline for how much free space a database should have to avoid performance issues?

Alec6638 (Microsoft SQL Server expert)
May 28, 2020 at 20:45 UTC

MS SQL Server databases always have some 'free space' in them.
The first page you see when you open a database's properties (the General page) shows you that free space.

When this free space is consumed (by data, indexes, or other objects), SQL will automatically grow the database size by the value specified in the properties page we already looked at. You do not do this; MS SQL does this in the background.

If you think you can estimate your total database size requirements, sure, you can put that in as the 'Initial Size'.
This will then reduce the number of times SQL has to automatically increase the database file size.
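
(If you do decide to pre-size a file from an estimate, the change itself is a single statement; below is a sketch with placeholder database and logical file names, best run during a quiet period. The logical file name can be read from sys.database_files.)

    -- Grow the data file to the estimated size up front so SQL Server
    -- does not have to autogrow it repeatedly. Names are placeholders.
    ALTER DATABASE YourDatabase
        MODIFY FILE (NAME = N'YourDatabase_Data', SIZE = 50GB);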

You may want to look at this link: estimating-disk-space-requirements-for-databases

There are many ways to try to estimate your database size requirements, no one method works for everyone.

As to log file size - this depends on the type of recovery method used (Full, Simple, or other), database activity, and transaction log backup frequency.
This is another discussion subject where an answer in this forum will never be sufficient for everyone.
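
(As a starting point for that discussion, the recovery model and current log usage are quick to check; a generic sketch, not specific to this setup:)

    -- Recovery model of each database on the instance.
    SELECT name, recovery_model_desc FROM sys.databases;

    -- Size and percentage used of each database's transaction log.
    DBCC SQLPERF(LOGSPACE);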

Have you contacted your Epicor representative for their recommendations?

gfraenk

Your GL transactions are saturating your EpicoreLive_Log, which has a maximum size limit of 2,097,152 MB (2 TB).

Solution: increase the limit on EpicoreLive_Log to e.g. 3,145,728 MB, and if your database is in FULL recovery, enable transaction log backups every x minutes so you can recover from a disaster, depending on your business recovery needs.
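
(If you went that route, raising the log file's size cap and taking regular log backups would look roughly like the following; database, file, and path names are placeholders, and the BACKUP LOG would normally be scheduled via a SQL Agent job or maintenance plan rather than run by hand.)

    -- Raise the maximum size of the log file (placeholder names).
    ALTER DATABASE YourDatabase
        MODIFY FILE (NAME = N'YourDatabase_Log', MAXSIZE = 3145728MB);

    -- Under the FULL recovery model, back up the log regularly so the
    -- space inside it can be reused (placeholder path).
    BACKUP LOG YourDatabase
        TO DISK = N'D:\Backups\YourDatabase_log.trn';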

Sazzad_Jca

Alec6638 thanks a lot for sharing all this valuable information. I have a few more questions. What is the best practice in that case: increase the initial size based on an estimate, or leave it to grow automatically? If we leave it to auto-grow, is there any impact on performance? Is 10% free space good enough to avoid performance issues?

Sazzad_Jca

gfraenk the log file has not yet exceeded the maximum limit. The current size is 1.4 GB.

leonardpothier
May 29, 2020 at 12:20 UTC

gfraenk wrote:

Your GL transactions are saturating your EpicoreLive_Log, which has a maximum size limit of 2,097,152 MB (2 TB).

Solution: increase the limit on EpicoreLive_Log to e.g. 3,145,728 MB, and if your database is in FULL recovery, enable transaction log backups every x minutes so you can recover from a disaster, depending on your business recovery needs.

While I agree that transaction log backups should be done to manage the size of the log file, I'm not sure they are really hitting the limit listed there; a SQL log file hitting that 2 TB limit would mean a log that hasn't been backed up in a very long time. I have the same limit on mine (I suspect it is just a default), run the transaction log backup every 10 minutes, and my log is 6.8 GB, most of which is likely empty and left over from expansion during previous Epicor version upgrades.

If they are hitting that limit, the main DB file must either be absolutely massive (my 6.81 GB log is paired with a 180 GB main DB), OR they aren't doing any transaction log backups, in which case I almost guarantee they will be unable to now, as it will time out and fail when trying to merge into the main file. I mistakenly let the log file reach 300 or so GB in a test environment and ended up just wiping it out because it would not merge in during the backup.
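
(A quick way to confirm whether a large log file is mostly empty, as suspected above, is the log-space DMV; a generic sketch, assuming SQL Server 2012 or later:)

    -- How much of the current database's log file is actually in use.
    SELECT total_log_size_in_bytes / 1048576.0 AS log_size_mb,
           used_log_space_in_bytes / 1048576.0 AS used_mb,
           used_log_space_in_percent
    FROM sys.dm_db_log_space_usage;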

Other than the good practice of doing the transaction log backups, make sure, if you are running E10+, that 32-bit is turned off in the IIS App Pool settings on your application server. Epicor will tell you 10+ can't even run with that on, but it can, and it will run fine until you hit about 3.8 GB used on the app pool, at which point it will crash and complain about memory.

leonardpothier
May 29, 2020 at 12:25 UTC

If 32-bit is already turned off, then I would run with the log turned back on and watch the server resource monitor; if the app pool goes past 4 GB and still crashes, you likely don't have enough memory on the server itself. If that is the case, I would contact Epicor and/or a third-party consultant with experience in Epicor performance optimization, as you may be able to tweak settings, or you may just have to increase the amount of RAM.

Sazzad_Jca

leonardpothier on our server 32-bit is already turned off. The log file size is 1.4 GB. We have 85% available memory on the apps server and 55% available memory on the database server. Still, we had the OutOfMemory error, which was resolved after disabling the GL transaction logs.

leonardpothier
May 29, 2020 at 12:37 UTC

But what is your RAM situation? Typically 'memory' refers to RAM, not disk space.

gfraenk

From your description of the problem (that it was caused by the GL transaction logs and resolved after disabling them), I assumed that you have a lot of transactions hitting the database in a short period of time. I also assumed that you are running in simple recovery mode. Increasing the transaction log size will handle a high frequency of transactions without disabling it.

Sazzad_Jca

gfraenk this might be the issue: lots of transactions in a short period of time. I am not familiar with this, but Epicor replied to us that it is not recommended to keep the GL transaction logs enabled.

Sazzad_Jca

leonardpothier I actually meant RAM in my response above. Both the apps and database servers are VMs, and we have 49 GB of RAM allocated to each server, with 85% and 55% RAM available on the apps and database servers respectively.

Fessor (Microsoft SQL Server expert)

This is a very confusing thread.

You wanted to increase the database size because you believe that might get rid of an "out of memory" issue. However, you resolved the issue after the vendor said to disable the GL transaction logs. I don't see how these two things have anything in common.

What is it that I don't understand?

Larry Shanahan (Microsoft SQL Server expert)
May 29, 2020 at 13:09 UTC

The size of database and/or t-log files has absolutely nothing to do with out of memory errors.  Most likely what you are seeing is that a client application (SSMS for example) cannot allocate a sufficient amount of memory as a buffer to hold a query result, even if there is additional free memory on the system.  Many client applications grab a certain amount of memory for their needs and generate the OutOfMemory exception when that is filled.  One classic example of this is 32-bit Excel which has hard memory limits (2 to 4 GB or something like that) and generates out of memory exceptions when those limits are hit even if more memory is available on the host system.

In other words, the problem has more to do with Epicor.

Sazzad_Jca

Fessor sorry for the confusion. I actually created this thread to learn about increasing the size of a SQL database. The GL transaction issue came up along the way; because of that issue we are now planning to increase the SQL database size. You can ignore the GL transaction topic here, which is already fixed. I just wanted to know some best practices about SQL database sizing. Alec6638 already answered most of my questions, and I have a few more:

What is the best practice in that case: increase the initial size based on an estimate, or leave it to grow automatically?

If we leave it to auto-grow, is there any impact on performance?

Is 10% free space in the database good enough to avoid performance issues?

Larry Shanahan (Microsoft SQL Server expert)
May 29, 2020 at 13:23 UTC

Best practice is to set a fixed size, monitor growth, and proactively grow manually as needed during a maintenance interval. In other words, it isn't a set-and-forget operation. You create enough space to do you for, say, a quarter, and set a small fixed-size autogrowth for unexpected heavy activity.

A growth event is a blocking operation - everything gets held up while it is in progress.  So you'll want to make these as few and far between as possible.

For a fuller explanation, see here.

HTH

PS - NEVER do autogrow by percentage.  ALWAYS a fixed size.
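
(Switching from the 10% growth noted earlier in the thread to a fixed increment is one statement per file; a sketch with placeholder database and logical file names, with increments to be sized for your own activity:)

    -- Replace percentage autogrowth with fixed-size increments
    -- (placeholder database and logical file names).
    ALTER DATABASE YourDatabase
        MODIFY FILE (NAME = N'YourDatabase_Data', FILEGROWTH = 256MB);
    ALTER DATABASE YourDatabase
        MODIFY FILE (NAME = N'YourDatabase_Log', FILEGROWTH = 128MB);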

This topic has been locked by an administrator and is no longer open for commenting.



Source: https://community.spiceworks.com/topic/2274706-increase-sql-server-database-size
