Publishing a Book is the Final Frontier
Many authors begin their careers intending to publish a book, but book publishing is a difficult thing to accomplish. It takes many months of work and extensive preparation. A book is built from intricately woven ideas; it is a project that contains many smaller projects, and most people are not prepared for the intensive process involved in creating a full, coherent book. If book publishing is something you are interested in trying, there are a few things to keep in mind. First, writing a book is like nothing you have ever done before. It will take extensive, intensive work and development, and it will probably draw on much of everything you know, and more. Read on for more clarity.
Uncharted Territory
Book publishing is like a land that has never been explored. Of course, there are plenty of book authors out there, and they have been around for centuries. Unlike other areas of expertise, though, writing a book is not the same process for every writer. As you set out to write a book, you can follow some basic guidelines, but getting your ideas from your head onto the page will be an invention of your very own. Not only will you have to get the information onto the page, but you will have to write in a way that thousands or even millions of readers can relate to and understand. Again, that is a process that takes experimentation and trials. As you begin writing your first book, and any subsequent books, expect to work and rework.
One Idea Is Not Enough
Part of the reworking process is changing direction within the writing. Many beginning writers aspire to book publishing: they have an idea and a vague plan to turn that idea into a book. Picture your first grader telling you that she wants to write a book about horses. There is certainly enough that people want to know about horses to fill several books, but a vague idea is not enough for an adult writer to create publishable work. To write a book, you will need to start with a topic. You may or may not be an expert on the subject. After you have the first vague ideas, you will need to start asking yourself questions. Answering those questions will hopefully lead you to more questions, and so on. Even if your original idea is completely unique and will lead you to write new information that the world does not yet have access to, you will need to build on that original idea to produce an intriguing finished product. If you are not an expert, or if you do not already know anything new, it will take even more time and effort to produce a unique piece of writing. The same goes for fiction as for non-fiction: many stories have been told before, and if you want to publish, you will need to come up with an engaging, new journey for your readers to take.
Using Previously Published Work
Now that we have covered the need for intricate, new ideas, there is also room in a book for old ideas. Your readers will need a starting place within your writing that is familiar and known. As you put together your ideas for a complete book, you will probably publish smaller pieces of work in magazines and newspapers. It is fine, as long as you cite yourself, to reuse some of that work. That way, you can keep publishing as you go while still making progress toward your end goal of book publishing. After several months or even years of pouring out your effort and knowledge, you will have a complete, finished book.
Web Hosting - Bandwidth and Server Load, What's That?
Two key performance metrics will impact every web site owner sooner or later: bandwidth and server load.
Bandwidth is the amount of network capacity available, and the term actually covers two different aspects. 'Bandwidth' can mean the measure of network capacity for web traffic back and forth at a given time. Or it is sometimes used to mean the amount allowed over some interval, such as one month. Both are important. As files are transferred, emails are sent and received, and web pages are accessed, network bandwidth is being used. Think of it as water flowing through a pipe: pipes vary in size, and the amount of water moving through them at any moment varies as well.
Total monthly bandwidth is a cap that hosting companies place on sites in order to fairly share a limited resource. Companies monitor sites to keep one site from accidentally or deliberately consuming all the network capacity. Similar considerations apply to instantaneous bandwidth, though companies usually have such large network 'pipes' that it is much less common for heavy use by one user to be a problem.
Server load is a more generic concept. In more technical discussions it often refers solely to CPU utilization. The CPU (central processing unit) is the component in a computer that processes instructions from programs, directing how memory is used, moving files from one place to the next, and more. Every function you perform consumes some CPU, and its role is so central (hence the name) that it has come to be used as a synonym for the computer itself. People point to their case and say, 'That is the CPU.' But the computer also has memory, disk drive(s) and several other components required to do its job.
In more general usage, server load refers to the total amount of use of all of those components. Disk drives can be busy fetching files, which they do in pieces that are then assembled in memory and presented on the monitor, all controlled by instructions managed by the CPU. Memory capacity is limited, and it is often the case that not all programs can use as much as they need at the same time. Special operating system routines control who gets how much, when and for how long, sharing the total 'pool' among competing processes. So how 'loaded' the server is at any given time, or over time, is a matter of how heavily any one, or all, of these components are used.
Why should you care? Because every web site owner will want to understand why a server becomes slow or unresponsive, and be able to optimize their use of it. When you share a server with other sites, which is extremely common, the traffic those other sites receive creates load on the server that can affect your site. There is a limited amount you can do to influence that situation. But if you are aware of it, you can ask the company to move you to a less heavily loaded server. Or, if the other site (which you generally have no visibility into) is misbehaving, it is possible to get it moved or banned.
When you have a dedicated server, though, you have much more control over load issues. You can optimize your own site's HTML pages and programs, tune a database and carry out other activities that maximize throughput. Your users will see that as quicker page accesses and a more enjoyable user experience.
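If you want a rough, do-it-yourself view of both metrics, the sketch below shows one way to check them on a typical Unix host. It is only a minimal illustration, assuming Python 3, shell access to the server, and an Apache/Nginx access log in the common 'combined' layout; the log path is hypothetical, and your hosting company's control panel will report these numbers more accurately.
```python
# Minimal sketch: estimate bandwidth from an access log and check server load.
# Assumes a Unix-like host and Python 3; the log path below is only an example.
import os


def bandwidth_used_gb(access_log_path):
    """Sum the response-size field of a common/combined-format access log.

    Assumes the 10th whitespace-separated field is the bytes-sent count,
    as in the standard Apache/Nginx log layouts ('-' means no body).
    """
    total_bytes = 0
    with open(access_log_path) as log:
        for line in log:
            fields = line.split()
            if len(fields) > 9 and fields[9].isdigit():
                total_bytes += int(fields[9])
    return total_bytes / (1024 ** 3)


def load_per_cpu():
    """Return the 1-minute load average divided by the CPU count.

    Values consistently above 1.0 suggest the server is saturated.
    """
    one_minute, _, _ = os.getloadavg()
    return one_minute / (os.cpu_count() or 1)


if __name__ == "__main__":
    print(f"Load per CPU: {load_per_cpu():.2f}")
    # Hypothetical log location; substitute your own host's path.
    # print(f"Bandwidth used: {bandwidth_used_gb('/var/log/nginx/access.log'):.2f} GB")
```
Run against a month's worth of log data, the first function gives a rough figure to compare against your plan's monthly bandwidth cap; the second gives a quick read on how busy the CPU is right now.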
Web Hosting - Managing Disk Space
Few things are less exciting than managing the disk space that always seems to be in too short a supply. But few things are more important to the health and well-being of your site.
The most obvious aspect of managing disk space is the need to have enough. If you have only a few dozen web pages, that is not an issue. But as the amount of information (web pages, database content and more) grows, the quantity of free space goes down. That matters for two reasons.
All permanent information on a computer is stored on hard drives; temporary information is often stored in memory only. The two components are completely separate, though they are sometimes confused with one another. As the amount of free space on the hard drive decreases, several effects occur. Here is one way to picture them.
Imagine you had a table of a certain size and you lay playing cards out on it. At first, you lay them out in order: the 2 beside the 3, then the 4, and so on. But then you pick up one or two cards from the middle and discard them, and you add some more. Pretty soon things look fairly random. Now cover the cards with a big opaque sheet of paper. You want the cards to appear in order when displayed to someone. A special robot could be designed to always pick up the cards from underneath the sheet in order. Or it could slide a hole in the sheet over the cards to display them in the correct order (2, 3, 4, ...), no matter what order they are really in. That is similar to how the operating system always shows you information in a sensible way, even though it is actually stored in scattered pieces.
Why should you care? Real files are stored in pieces scattered around the drive wherever there is space for them. The more free space there is, the more quickly the operating system can find a place to store a new piece. That means that if you delete the junk you no longer need (and free up more space), the system actually runs quicker. It creates space you might need, and it allows the operating system to store files for you faster.
But there is a second effect. As you delete old files or change them, the pieces get more and more scattered, and it takes the 'robot' longer and longer to fetch or display the 'cards' in order. Existing files are fetched and put together 'on the fly' (say, when you request a graphical page or a list of names), and it takes longer to assemble a web page when its pieces are more scattered. So the other aspect of managing disk space is keeping the pieces of your files more or less in order. A utility that does that is called a 'de-fragger', or de-fragmentation program. You can ask a system administrator to run it, or, if you have the authority, you can run it yourself. That keeps the 'cards' in order and allows for quicker access to them.
So managing disk space involves chiefly three things: (1) keeping enough space to store what you need to store, (2) keeping enough free space so that storing new files stays quick, and (3) keeping things orderly so that retrieving old files stays fast. When only a few files are involved, the benefit is not worth the effort. But as the number and size of the files grow, to thousands of files or several gigabytes of data, the effect becomes more noticeable, and keeping things organized makes a significant difference in performance.
Much of this can be automated using utilities. Some will delete files in a certain folder that are older than a certain date, as in the sketch below. A de-fragger can be set to run automatically during times of light usage, or quietly in the background at all times.
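A housekeeping script along those lines might look like the following sketch. It assumes Python 3 on a Unix-style host; the folder path is purely an example, and it only lists stale files rather than deleting them, so you can review the output before adding an actual os.remove() call or scheduling the script with cron.
```python
# Minimal sketch: report free disk space and list files older than a cutoff.
# Assumes Python 3; the folder path used below is hypothetical.
import os
import shutil
import time


def report_free_space(path="/"):
    """Print how much space is free on the filesystem containing 'path'."""
    usage = shutil.disk_usage(path)
    free_pct = usage.free / usage.total * 100
    print(f"{path}: {usage.free / 1024 ** 3:.1f} GB free ({free_pct:.0f}%)")


def stale_files(folder, max_age_days=90):
    """Yield files under 'folder' not modified in the last max_age_days."""
    cutoff = time.time() - max_age_days * 86400
    for root, _dirs, names in os.walk(folder):
        for name in names:
            full_path = os.path.join(root, name)
            if os.path.getmtime(full_path) < cutoff:
                yield full_path


if __name__ == "__main__":
    report_free_space("/")
    # Example folder only; point this at your own temporary or log directory.
    for path in stale_files("/var/www/example/tmp", max_age_days=90):
        print("stale:", path)  # review first, then os.remove(path) to delete
```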
Discuss the options with your system administrator and help him or her do the job better by keeping your house in order. You'll benefit by having a better-performing web site.