I am unclear about the Cloud and file-structure-based storage. We are developing a site that will store user files in nested file storage paths (think Server.MapPath()) in conjunction with data in SQL, i.e. the PDFs will not be in SQL, but as files in file folders. This style works fine with DiscountASP, but at prices of $10/GB vs. $10/TB at Google, this is a problem. How about Everleap? I am not sure that one can access file folders like this in an Azure cloud. Perhaps storing the files on Google Drive would be the solution, but that means more complex code to start with (and then perhaps the app just stays on DiscountASP again). There also seems to be a current 30GB cap on Everleap. Thanks, Greg
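P.S. For reference, the pattern we use on DiscountASP today looks roughly like the sketch below. The folder layout, method, and table names are just placeholder examples: the PDF bytes go into a nested folder on disk, and only the relative path goes into SQL.

    // ASP.NET WebForms code-behind sketch; names are placeholders
    using System;
    using System.IO;
    using System.Web;

    protected void SaveUserPdf(HttpPostedFile uploadedFile, int userId)
    {
        // nested, user-specific folder under the site root
        string relativePath = "~/UserFiles/" + userId + "/" + Guid.NewGuid() + ".pdf";
        string physicalPath = Server.MapPath(relativePath); // physical path on this server

        Directory.CreateDirectory(Path.GetDirectoryName(physicalPath));
        uploadedFile.SaveAs(physicalPath); // PDF bytes live on disk, not in SQL

        // SQL stores only the metadata, e.g.:
        // INSERT INTO UserFiles (UserId, RelativePath) VALUES (@userId, @relativePath)
    }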
I agree, things are more complicated these days. The issue has to do with the specific needs for the storage. In some cases, space is the key and speed is not a factor. For example, I have terabytes of multimedia that I don't necessarily need constant fast access to. As long as they are stored somewhere, I'm happy. They can be zipped and offloaded to another slower, cheaper network. On the other hand, the files that make up my website I want access to, and I want it to be fast. It's the same reason I use an SSD locally as my C drive for Windows, Visual Studio, etc., and have another larger, slower drive on my machine for things like music.

The type of storage we offer is intended for instant access. What works best will depend on your application. If your users are uploading large documents to share with their vendors (one-time access to a large PDF of real estate documents, for example), you may be able to get away with a slower solution. If they are uploading photos and want to see their gallery, I would stick with a faster storage solution.

The good news is that there are a lot of choices, from super fast to super slow. Amazon even has a service called Glacier where the data is removed from disk altogether and placed on magnetic media. It's super cheap for a lot of space; the downside is that you have to request the data and the wait is measured in hours.
So it turns out this is a fairly serious concern for me also. I am getting ready to try out Everleap with my site that accepts uploaded avatar-type image files from users and saves them to a directory structure that is maintained on the server. It appears that with Azure Pack there is a directory structure something like c:\inetpub\temp\DWASFiles\sites\xxxx-xxxx\VirtualDirectory0\site\wwwroot. If I were to try and write to folders beneath wwwroot, would I be OK? I wouldn't lose files that get saved by my application, would I?
The directory structure for an account looks like this:

    \
     \DatabaseBackup
     \Site
      \wwwroot

The path the web server is pointing to is wwwroot. Your application would have permission to write anywhere within the root.
So in order to write somewhere beneath the root I need a full path, something typically retrieved by calling Server.MapPath() in ASP.NET applications, which usually yields something like C:\Inetpub\wwwroot\websiteFolder\ and which, in non-Azure environments, never changes. Could I be certain that if I were to store a path (say, as a web.config setting) that references the full path to wwwroot (something like c:\inetpub\temp\DWASFiles\sites\xxxx-xxxx\VirtualDirectory0\site\wwwroot), it wouldn't ever change at some point in the future?
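(For what it's worth, my plan B is to avoid hardcoding the physical path entirely and resolve it at runtime, something like the sketch below. The folder name and the fileName/avatarBytes variables are just examples standing in for my upload code.)

    // Resolve the app root at runtime instead of storing the physical path in
    // web.config, so nothing breaks if the DWASFiles location ever changes.
    using System.IO;
    using System.Web.Hosting;

    string relative = "~/uploads/avatars";                  // example folder name
    string physical = HostingEnvironment.MapPath(relative); // same mapping Server.MapPath does
    Directory.CreateDirectory(physical);
    File.WriteAllBytes(Path.Combine(physical, fileName), avatarBytes);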
If I am using a load-balanced server, will changes under /wwwroot be mirrored to both sites? That is, if we save a PDF, will that file be available regardless of how much I have scaled horizontally, or which specific VM serves a request?
That is correct. When the site spins up on the server, Azure Pack will create a virtual directory under c:\inetpub\temp\DWASFiles\sites\xxxx-xxxx\VirtualDirectory0 that points to the shared storage.