Saturday, November 30, 2019

Milton's Satan In Paradise Lost Essays - Fallen Angels, Satan

Critics have long argued about who the hero of John Milton's Paradise Lost is: Satan, Adam, or Christ, the Son. Since Milton's overall theme, stated in the opening lines of Book I, is to relate "Man's first disobedience" and to "justify the ways of God to men," Adam must be regarded as the main hero. John M. Steadman supports this view in an essay on Paradise Lost: "It is Adam's action which constitutes the argument of the epic." Steadman continues: the Son and Satan embody heroic archetypes, and, through the interplay of the infernal and celestial strategies, Milton represents Satan's plot against man and Christ's resolution to save him as heroic enterprises. Christ and Satan are therefore epic machines (268-272). Although Satan may be an epic machine, he is best portrayed as the tragic anti-hero of Paradise Lost or, at the very least, a main character who possesses the stature and attributes that enable him to achieve tragic status. In the Greek tradition, the essential components of tragedy are admiration, fear, and pity for the hero, who must display a tragic weakness or flaw in his character that will lead to his downfall. It might be argued that the flaws in Satan's character are such that we should feel no admiration, fear, or pity for him, yet he can be seen to inspire these emotions. Satan's tragic flaws are pointed out in Book I: envy, pride, and ambition towards self-glorification. Satan's pride, in particular, is stressed throughout Paradise Lost. In accordance with epic convention, Satan is frequently qualified by Milton's use of the word "proud." Virgil used the same device in his epic the Aeneid, in which the name of Aeneas rarely appears without being preceded by "pious." The most striking visual example of Satan's main weaknesses appears in Book VI (89-90), during Raphael's narrative to Adam regarding the battles in Heaven, when Raphael refers to Satan as "the proud / Aspirer."
"Proud" at the end of one line and "Aspirer" at the beginning of the next gives equal emphasis and impact to Satan's pride and ambition, and it is implied that, in Satan, the two characteristics are inseparable and of equal importance. Milton, in fact, defended his use of blank verse as a suitable vehicle for epic poetry, as opposed to the frequently favored heroic couplet. How, then, does Satan inspire the feelings of admiration, fear, and pity necessary to a tragic figure? Milton was undoubtedly conscious that he was in danger of portraying Satan as too heroic a figure, and he made efforts to belittle him through the use of unflattering imagery and by highlighting his less complimentary characteristics. Nonetheless, our emotions are still fired. Our first encounter with Satan and his rebel hosts occurs in Book I, when they are recovering from the shock of having been expelled from Heaven by the Son after three days of fighting the angels of God. Despite the defeat he has suffered, Satan gains our admiration by displaying resilience in quickly coming to terms with the change in his circumstances, in remustering his forces, and in organizing the building of his palace, Pandemonium. At the same time he demonstrates his determination not to be defeated and shows true qualities of leadership, persuasively arguing that there is still hope for battle and victory. Satan is convincing in his first speech to Beelzebub, his chief partner in crime, as he declares: "What though the field be lost? / All is not lost; the unconquerable will, / And study of revenge, immortal hate, / And courage never to submit or yield: / And what is else not to be overcome? / That glory never shall his wrath or might / Extort from me" (I. 105-111). The language here is particularly powerful and the lines are heavily weighted, underlining Satan's resolution.
He similarly instills in his followers renewed resolve to challenge God and hope of regaining their former state, claiming that they are now better placed to contend because there is no fear of division in their own ranks (II. 11-42). He then gives his supporters the opportunity to speak their minds as to whether to engage in open warfare or to use guile to achieve their end; although ultimately

Tuesday, November 26, 2019

Should Public Transport be Free of Charge

Every one of us has probably used public transport at least once in his lifetime. Whether it was a train, a bus, or a subway, we had to pay for it, but is that really necessary? It is a topic that is often discussed, and I am convinced that if public transport were free, we would have a better world. It should be free of charge to give everyone the opportunity to get to their destination. It is true that many of us would prefer not to take public transportation if given a choice, but the more people we have taking buses, trains, and subways, the fewer people we have on the road. If more people used public transport, we would see decreased traffic, noise pollution, and greenhouse gas emissions. The crucial fact is that we live in a society where cars are really required, but it would be so much better to use the bus or the train. For example, when I need to go to the university, I always take the train and the bus. A lot more people would use public transport and just leave their cars at home. All of us want a cleaner planet for our future and the future of our children. Furthermore, there are a lot of people who do not have much money to pay for public transport. I know a family back home who always take the bike to get to the supermarket, which is 5 kilometers away. They have a hard life, and having no money to pay for the bus makes it much harder. In addition, people could easily get to work with the help of public transport, but traveling by train in particular is really expensive. Take, for example, a man who works 7 hours a day in a poorly paid job whose workplace is, into the bargain, far away. He has to pay so much money for the train that it is more trouble than it's worth. What also must not be forgotten is that traveling by train or bus helps social connection and could also support a better atmosphere if it were free.
I have been to Australia, and in Melbourne some buses and trains are free. This makes life there so much easier; you feel better and you are happier when you ride a train that is free. To sum up, you can see that there are a lot of considerable advantages to making public transport free. Of course, we need a lot of money to realize this idea, but all in all it would be a better way of life, especially if we look to the future. We can all help to provide a better environment, and that is important for the future. I am sure that if public transport were free, a lot of people would use it and want to help build a better world for all of us.

Friday, November 22, 2019

The National Popular Vote Plan to Bypass the Electoral College

The Electoral College system - the way we really elect our president - has always had its detractors and lost even more public support after the 2016 election, when it became apparent that President-Elect Donald Trump had lost the nationwide popular vote to Hillary Clinton but won the electoral vote to become the 45th President of the United States. Now, the states are considering the National Popular Vote plan, a system that, while not doing away with the Electoral College system, would modify it to ensure that the candidate winning the national popular vote is ultimately elected president.

What is the National Popular Vote Plan?

The National Popular Vote plan is a bill passed by participating state legislatures agreeing that they will cast all of their electoral votes for the presidential candidate winning the nationwide popular vote. If enacted by enough states, the National Popular Vote bill would guarantee the presidency to the candidate who receives the most popular votes in all 50 states and the District of Columbia.

How the National Popular Vote Plan Would Work

To take effect, the National Popular Vote bill must be enacted by the legislatures of states controlling a total of 270 electoral votes - a majority of the overall 538 electoral votes and the number currently required to elect a president. Once enacted, the participating states would cast all of their electoral votes for the presidential candidate winning the nationwide popular vote, thus ensuring that candidate the required 270 electoral votes. (See: Electoral Votes by State) The National Popular Vote plan would eliminate what critics of the Electoral College system point to as the "winner-take-all" rule - the awarding of all of a state's electoral votes to the candidate who receives the most popular votes in that state. Currently, 48 of the 50 states follow the winner-take-all rule; only Nebraska and Maine do not.
Because of the winner-take-all rule, a candidate can be elected president without winning the most popular votes nationwide. This has occurred in five of the nation's 58 presidential elections, most recently in 2016. The National Popular Vote plan does not do away with the Electoral College system, an action that would require a constitutional amendment. Instead, it modifies the winner-take-all rule in a way its supporters say would ensure that every vote matters in every state in every presidential election.

Is the National Popular Vote Plan Constitutional?

As with most political issues, the U.S. Constitution is largely silent on the details of presidential elections. This was the intent of the Founding Fathers. The Constitution specifically leaves details such as how the electoral votes are cast up to the states. According to Article II, Section 1, "Each State shall appoint, in such Manner as the Legislature thereof may direct, a Number of Electors, equal to the whole Number of Senators and Representatives to which the State may be entitled in the Congress." As a result, an agreement between a group of states to cast all of their electoral votes in a similar manner, as proposed by the National Popular Vote plan, passes constitutional muster. The winner-take-all rule is not required by the Constitution and was actually used by only three states in the nation's first presidential election in 1789. Today, the fact that Nebraska and Maine do not use the winner-take-all system serves as proof that modifying the Electoral College system, as proposed by the National Popular Vote plan, does not require a constitutional amendment.

Where the National Popular Vote Plan Stands

Currently, the National Popular Vote bill has been passed in a total of 35 state legislative chambers in 23 states. It has been fully enacted into law in 10 states and the District of Columbia, together controlling 165 electoral votes: CA, DC, HI, IL, MA, MD, NJ, NY, RI, VT, and WA.
The National Popular Vote bill will take effect when enacted into law by states possessing 270 electoral votes - a majority of the current 538 electoral votes. As a result, the bill will take effect when enacted by states possessing an additional 105 electoral votes. To date, the bill has passed at least one legislative chamber in 10 states possessing 82 electoral votes: AR, AZ, CT, DE, ME, MI, NC, NV, OK, and OR. The bill has been passed by both legislative chambers - but not in the same year - in Colorado and New Mexico, controlling a combined 14 electoral votes. In addition, the bill has been unanimously approved at the committee level in Georgia and Missouri, controlling a combined 27 electoral votes. Over the years, the National Popular Vote bill has been introduced in the legislatures of all 50 states.

Prospects for Enactment

After the 2016 presidential election, political analyst Nate Silver wrote that, since the swing states are not likely to support any plan that might reduce their influence over control of the White House, the National Popular Vote bill will not succeed unless the predominantly Republican "red states" adopt it. As of September 2017, the bill had been fully adopted only by predominantly Democratic "blue states," which delivered the 14 largest vote shares for Barack Obama in the 2012 presidential election.

Wednesday, November 20, 2019

Data Collection Paper Essay Example | Topics and Well Written Essays - 750 words

This is where good academic performance usually starts. However, as these students continue to blend with others, or as they continue to dwell within the grounds of the academic institution, they encounter certain things that influence their thinking, their outlook, and their attitude. In the case of academic performance, there are several factors that can affect students' attitudes towards school. Some of these factors include peer pressure, family background or problems, school location or environment, the student's lifestyle and teachers, and other psychosocial reasons. There are also studies showing that ethnic differences can affect a student's attitude towards school. Professor Laurence Steinberg, in one of his publications, stated that ethnic differences cause students to have different beliefs or reactions regarding failing in school. Specifically, he cited Asians as believers that poor performance in school would have negative or unfavorable consequences. Furthermore, he explains the effect of peer pressure on a child. Results of his studies also show that a lot of American teens believe that people make fun of those who do well academically. This implies that they would rather not have high grades, to avoid being laughed at. The majority of the students also expressed that they never talk about academics or school-related issues with their friends (Edsource Online, 1999). In relation to the g... This teacher factor means how the teacher deals with the students, the teacher's manner of teaching, the teacher's professional and casual relationship with the students, and other relevant and significant elements that might have an impact on the students' performance. This study aims to answer the following questions: 1. What are the factors that can affect a student's academic performance? 2.
What are the examples of the "teacher factor" that have significant effects on a student's performance? In addition, at the end of this study, solutions on how to minimize the negative effect of the "teacher factor" on students' performance should be provided. Since the research will tackle the different factors, particularly the "teacher factor," that affect students' performance in school, the results will be beneficial to students of different educational levels. Through this research, students will learn how to avoid being affected by such factors and therefore begin to develop a more positive outlook about school, helping them excel in class. Aside from the students, this research will also help educational institutions by providing them with relevant information on how to improve and develop better relationships between teachers and students. This research can also serve as a reference for future researchers. This is going to be a descriptive research study involving male and female high school students as respondents. The sampling technique to be used will be random sampling. This research will make use of a questionnaire as the data collection method. This will allow the researcher to have a larger sample size and, therefore, obtain more reliable and accurate results. This method is also less

Tuesday, November 19, 2019

The Feminist Critique and the Postmodern Challenge to Anthropology Essay

Feminism, as an ideal, is the collection of movements, associations, groupings, and establishments that aim at defending, defining, and establishing equality for women in the spheres of social rights, politics, and the economy. In addition, the ideal promotes the creation and provision of equal opportunities for women in both education and employment. Thus, a feminist is a person whose behavioral and belief systems are based on the ideal of feminism (Fruzzetti 39). From the afore-mentioned feminist movements, associations, and groupings emerged feminist theory, which aimed at understanding the causes of and reasons for the presence of gender inequality. This understanding was based on the examination of women's lived experiences and social roles throughout history and into the contemporary 21st century. From it emerged different theories touching on a variety of disciplines, so as to respond to and subsequently address issues such as the social construct of gender and sex. Some earlier forms of the theory received criticism for taking into consideration only educated, white, middle-class perspectives. As a result of this criticism came the creation of multi-culturalist and/or ethnically-specific forms of the theory (Cott 73). Feminists campaign on the platform of 'Women's Rights' - bodily integrity, reproductive rights (including access to abortion and contraceptives), women's suffrage, equal pay, the right to property, entry into contracts (contract law), and voting. They seek to protect girls and women from domestic violence, sexual assault, and harassment, among other violations. Due to its radical nature, this ideal has attracted its share of both criticism and blessings, in the form of pro-feminism and anti-feminism ideologies.
Feminism and Anthropology

As a result of the feminist critique of anthropology, the approach known as feminist anthropology emerged. It sought to study cultural anthropology and correct the perceived andro-centric bias within the field. Its origin can be traced to early anthropologists such as E.E. Evans-Pritchard and James Frazer, who both displayed much interest in the notions of marriage and kinship; women would thus always appear in their ethnographies. Henrietta Moore, a prominent theorist in the school of thought of feminist anthropology, though of the opinion that women had been included in anthropological research and theory, was of the view that the problem was not the presence of women in anthropology but their representation, interpretation, and understanding (Bratton 10). According to her, it is how women are included in anthropology that matters. Thus, the challenge was to provide new critical analysis of the existing anthropological literature, including the creation of new research that placed the 'Woman' at the centre of it. This led to the emergence of self-conscious feminist anthropology in the 1970s, as a series of challenges to a male-dominated and biased anthropology. Rayna Rapp, in her work Toward an Anthropology of Women (1975), was one of the earliest contributors to this emerging school. She argued that women and men experience gender differently, in reference to the myriad of social markers. The experiences of women were in themselves a legitimate subject for

Saturday, November 16, 2019

Globalisation Is A Trend Which Tends To Benefit The Rich And Hurt The Poor Essay Example for Free

Globalisation was initiated in earlier centuries as a way of integrating the world's economic, business, and political activities, in response to declining international economic integration. This gave birth to several international institutions that were supposed to oversee international trade by removing barriers to trade. It is thus a process that aimed to be beneficial to all people within a country and in the whole world. However, globalisation nowadays involves many other activities that are multivariate in approach: it has economic, social, and political dimensions. Globalisation therefore has various aspects that affect the world in different ways. Such aspects include looking for markets for products and access to the range of foreign products required for production. Since the inauguration of globalisation, the industrial system has recorded a remarkable number of achievements. The industrial revolution has brought a new standard of prosperity and comfort to people the world over. According to many economists, these accomplishments have been possible due to a novel institutional framework that supports competitive markets, political freedoms, and universal education, encourages objective scientific interaction, allows social and political criticism, and provides safety nets to reduce risk and deprivation. Yet while globalisation has reduced scarcity, it has also created a crisis of sustainability, as the pressure on the poor to consume exceeds their capacity, and the ability to conserve diversity and control waste is no longer there. Removing national barriers has exposed poor and ill-equipped peoples to the threats as well as the benefits of free trade and competitive markets.
Globalisation has affected the poor in communications by reducing cultural diversity and exposing everyone to the temptations of an often selfish and shallow international industry. In addition, the demands of competition in the capitalist setting and the transformation of work have implications for stress-related illnesses, family breakdown, and the loss of long-established values of family solidarity, all of which add costs for the poor (Pistor, 1997). The internationalisation of the market has a direct impact on the most important sectors, which are mostly dominated by the poor. Mostly, the poor are kept in those sectors associated with production but not with distribution. Poor producers do not benefit from globalisation, as middlemen always take advantage of the uninformed poor in the globalised world. Thus globalised trade in agricultural and livestock commodities from the poor is poorly paid for, which leaves such sectors lagging behind in terms of industrialisation. By commercialising their natural products, either raw or semi-processed, poor countries could achieve a balance-of-trade surplus. However, it is in agricultural markets that rich countries have been most stubborn about adopting policies favourable to free trading conditions (Pistor, 1997). While it is correct that globalisation supports free trade among countries at an international level, there are also negative results, because some countries try to protect their national markets. The main export of poorer countries is usually agricultural goods. It is difficult for these countries to compete with stronger countries that subsidise their own farmers. Because the farmers in the poorer countries cannot compete, they are forced to sell their crops at much lower prices than what the market is paying. Thus, mistreatment of impoverished foreign workers results.
The weakening of protections for weaker nations by stronger industrialised powers has resulted in the exploitation of the people in those nations as cheap labour. Due to the lack of safeguards, companies from powerful developed nations are able to offer workers just enough salary to entice them to endure tremendously long hours and unsafe working conditions. The abundance of cheap labour gives the countries in power a motivation not to rectify the inequality between nations: if these nations developed into industrialised nations, the army of cheap labour would slowly disappear alongside development. With the world in its current state, it is impossible for the exploited workers to escape poverty. It is true that the workers are free to leave their jobs, but in many poorer countries this would mean starvation for the worker, and possibly even for family members (Sachs, 2005). Globalisation has also led to the shift from manufacturing to service work. The low cost of offshore workers has attracted corporations to move production to foreign countries. The laid-off unskilled workers are forced into the service sector, where wages and benefits are low but turnover is high. This has contributed to the widening economic gap between skilled and unskilled workers. The loss of these jobs has also contributed greatly to the slow decline of the middle class, which is a major factor in the increasing economic inequality in the whole world. Families that were once part of the middle class are forced into lower positions by massive layoffs and outsourcing to other countries, as the technologies bringing about globalisation change day by day. This also means that people in the lower class have a much harder time climbing out of poverty because of the absence of the middle class as a stepping stone (Sachs, 2005). Globalisation is also leading to the rise of contingent work.
As globalisation causes more and more jobs to be moved overseas and the middle class declines, there is less need for corporations to hire full-time employees. Corporations are less inclined to offer part-time workers benefits such as health insurance, bonuses, vacation time, shares in the company, and pensions, or they reduce those benefits; most companies don't offer any benefits at all. Even though most middle-class workers still have their jobs, the reality is that their buying power has decreased due to decreased benefits. Job security is also a major issue with contingent work. Moreover, globalisation is weakening labour unions. The excess of cheap labour, coupled with an ever-growing number of companies in transition, has caused a weakening of labour unions in the world. Unions lose their effectiveness when their membership begins to decline. As a result, unions hold less power over corporations, which are able to easily replace workers, often for lower wages, and have the option of no longer offering unionised jobs (Humphreys, 2000). On the other hand, globalisation has in many ways helped poor countries come out of poverty. Through globalisation, poorer countries are given opportunities to trade freely without facing many challenging rules and regulations in trade transactions. Trade allows individuals to exchange goods and services; hence globalisation is helping poor people access these goods and services in their local markets at reduced prices. The reduction in the cost of goods and services as a result of globalisation enables the poor to make savings that would perhaps not have been possible before. Research indicates that trade for many developing countries increased with the introduction of globalisation, though the increase varied from one country to another.
This means that the benefits of globalisation can be received unequally, depending on one country's trading power relative to another's (Sachs, 2005). Globalisation has enabled capital movement, so that financial assets are moved across international borders. The poor benefit from such investments in many different ways. When there is capital investment in poor areas, trade is opened up, which allows people to be employed in the investing enterprises. This creates job opportunities in places that were once afflicted by a lack of them. The earnings received by the employees are used to raise their living standards, closing the gap between the poor and the rich. Through the capital movement that globalisation enables, businesses are opened that bring in foreign exchange through the taxation of some of the goods entering the country. This helps the country become self-sufficient by replacing foreign aid, and it supports the transition to a market economy. Multinational corporations that invest in poor countries are often found to be of more benefit than local investors in those states. By comparison, such companies have been building factories and hiring workers from local communities, and the employees hired have been found to live more comfortably than their counterparts in local companies. This is because they are well paid by the multinational companies, compared with home-country companies, which lack adequate capital and have low turnover (Warwick, 2001). Due to globalisation, workers move to where jobs are located. Unskilled poor workers are given the chance to move to where unskilled labour is required, and skilled workers likewise move to where skilled labour is required. It has been shown beyond reasonable doubt that wages communicate the demand for labour in a given country.
Hence people move from places with low wages to places with higher remuneration. Previous results have shown that large numbers of people move either from developing to developed countries or from underdeveloped states to developing ones. This suggests that globalisation is not simply benefiting the rich and harming the poor. In fact, participation in globalisation is enabling the poor to earn higher wages than those who are poor and do not take part in it. Globalisation brings about competition, from which the poor benefit far more than the rich. The rich, being the investors, are forced to improve the quality of the goods and services that the poor enjoy. Companies in competitive environments provide better opportunities for qualified persons, who in turn produce better services and quality goods in return for higher salaries. In addition, poor countries benefit when their workers travel abroad, earn higher wages, and send the money back to their home countries. This money is pumped into the economy of the poor nation, where many more people can earn their daily bread and raise their living standards (Sachs, 2002). Globalisation is spreading with technology. The poor are being challenged to explore new technologies, which are of vital importance in business investment. The installation of new equipment and technologies in poor countries is a key factor in the alleviation of poverty, not a way of further impoverishing the poor. Such technologies, together with globalisation, help the poor improve their production methods, management techniques, and working practices generally. In conclusion, when the rich get richer and the poor get poorer, it is not a result of globalisation; rather, the poor are afraid of embracing the new methods of globalisation for the alleviation of poverty.
The perception of globalisation as a technique for exploiting the poor has left many countries and people poorer than they should be (Humphreys, 2000). In fact, globalisation is helping people to realise that the free movement of goods and services is far more beneficial in terms of uplifting people's living standards than its minor disadvantages would suggest. Wealthy countries are much concerned with helping the poor through job provision as well as the disbursement of aid. Providing the poor with aid has a major impact on their health, and healthy people are able to perform better than unhealthy ones. Due to globalisation, the number of people living in poverty in the world has decreased considerably. Those living on less than two dollars a day are largely the people who have rejected the practices of globalisation, and they face many problems; the percentage of such people in globalised countries is much lower than in anti-globalisation ones. So globalisation has benefited both the rich and the poor. The rich have been able to find markets for their products and services, and have been in a position to invest for the future and enhance world production, which benefits the whole world. The poor have the further advantage that the rich provide means through which they can lift their living standards and escape poverty by learning new technologies and exploiting them (Sachs, 2005).

References:
Humphreys, M. (2000): Escaping the Resource Curse. Columbia University Press.
Pistor, K. (1997): The Rule of Law and Economic Reform in Russia. Westview Press.
Sachs, J. (2002): Resolving the Debt Crisis of Low-Income Countries.
Sachs, J. (2005): The End of Poverty: Economic Possibilities for Our Time. Penguin Press.
Warwick, M. (2001): The Strategic Significance of Global Inequality. University of Chicago Press.

Thursday, November 14, 2019

Prions : The Infectious Protein Agent :: Biology Mad Cow Creutzfeldt Jakob Disease

What causes Mad Cow Disease? Prions. Prions are also behind other neurodegenerative diseases such as Creutzfeldt-Jakob disease, Kuru, Gerstmann-Straussler-Scheinker disease and some forms of fatal insomnia. These are all prion diseases that have been found to exist in humans. The prion disease of cattle is what we know as Mad Cow Disease. Prions also exist in other animals such as sheep, mink, mule deer, elk, cats, and some others. So what's so special about prions? Unlike other neurodegenerative diseases that are caused by the misfolding of proteins, altered proteins, abnormal gene splicing, improper expression, or ineffective clearing of proteins, which slowly lead to disease by accumulation, prions cause disease by acting as an infectious agent. One abnormal prion protein is enough to turn all the normal prion proteins present into itself. How do prions do that? Scientists are still unsure of exactly how one protein is capable of turning another protein into itself. Many experiments have been conducted to help shed light on this mysterious capability. On this website, we hope to explain one of these experiments, which involved the effect of pH on the structure of prion proteins. Wait... prions? Prion protein? Which is which? Prion is the name assigned to the infectious protein agent. Prion protein (PrPC) is the normal cellular protein that can become an infectious agent. The prion is a newly discovered pathogen that is vastly different from the known pathogens of today, namely viruses and bacteria. Unlike bacteria, prions cannot be cured by antibiotics. They are typical of neither a prokaryotic nor a eukaryotic organism; all that is present in this pathogen is the protein PrPSc. This is the mutated form of the protein PrPC, which is encoded by a chromosomal gene. These two proteins differ in their spatial structures and their susceptibility to enzyme digestion.
PrPC is completely destroyed by enzyme digestion, whereas PrPSc is resistant to any form of digestion. Viruses usually have nucleic acid, protein, and other constituents that aid in the creation of more progeny viruses. Prions, by contrast, multiply by infecting the PrPC protein and turning it into a complex like themselves, the PrPSc protein. Prions exist in multiple molecular forms, whereas viruses exist in a single form with distinct ultrastructural morphology. Another difference between the virus and the prion is that viruses almost always provoke an immune response in the host they infect.

Monday, November 11, 2019

Evaluation of Cardiovascular and Related Health Interventions in the UK

Abstract

This research proposal focuses on healthcare initiatives in the UK pertaining to cardiovascular and related diseases. Using journal articles and government reports as secondary sources, and comprehensive questionnaire-based interviews and direct telephone surveys as the primary input, the study aims to evaluate preventive health programs to ascertain their effectiveness. These results would help in adapting policies so that the ideology of preventive care could be translated into a pragmatically feasible approach.

Introduction

The National Health Service has transformed phenomenally over the last two decades, and the UK government has been at the forefront of health policies driven with a preventive focus. Cardiovascular diseases are still the number one killer in the UK (BHF, 2010). The rapidly ageing national population is set to impose a severe burden in terms of allocation of funds and quality of treatment delivery in the coming years. Beyond the burden of caring for an aged population, the general health of the working-age population is also in decline. Around 2.6 million people are already on government incapacity benefits, and every year almost 600,000 people claim these benefits. The cost of health-related absence from work in the UK is estimated at a staggering £100 billion, which is equal to the entire annual cost of the NHS (DH, 2008). There is a pressing need and a sense of urgency to plan for the management of this growing burden. This study aims to examine some of these healthcare initiatives and policies aimed at preventing cardiovascular and related diseases and to evaluate their effectiveness. The government has created a number of independent bodies that check the progress of these health initiatives and assess their health impacts. This study will include a literature review of both local and nationwide preventive interventions.
Academics and research workers would be consulted to get their perspectives on these programs and their evaluation. The objective of the study is to assess the effectiveness of these interventions and the evaluation procedures in order to provide a strategic focus for the future.

Literature Review

Why the need for a preventive focus in health programs? There is an overwhelming burden on the NHS to deliver quality healthcare under severe financial constraints. There is an urgent need to shift from ideology to pragmatism, and new health perspectives are called for. Prevention is better than cure, and the financial benefits of a preventive approach are most welcome for the cash-constrained NHS. One of the significant achievements of a nationwide strategic preventive health program is the National Service Framework program for cardiovascular disease prevention and treatment, initiated in 2002. Within two years of the implementation of this nationwide initiative, a significant decline (40%) in cardiovascular mortality rates among the population aged below 75 years was reported. It is estimated that the preventive drive from the NSF saved around 22,000 lives in 2007 (DH, 2007). Lifestyle disorders and physical inactivity have contributed to a significant rise in obesity, diabetes and other risk factors for cardiac diseases (Allender et al., 2007). In a recent health survey of England, acceptable physical activity levels were reported at 39% and 29% for men and women respectively. Accelerometry data, however, revealed that a very low percentage of men and women (6% and 4%) met the recommended activity levels (BHF, 2010). There is an urgent focus on increasing physical activity (the Walk to Work program) as an effective method to counter the obesity epidemic and its cardiac complications. The anti-smoking campaigns and the 'NHS Diabetes' programs are aimed at mitigating the risk factors for cardiac diseases.
The most recent initiative is the Destination 2020 project, which is aimed at improving public awareness of the commonalities between cardiovascular diseases and related disorders. This project calls for a cardiovascular coalition (CVC) to put in place a practical, effective and comprehensive preventive approach to heart disease (BHF, 2009).

Evaluation of Preventive Health Programs

As mentioned before, evaluation of the effectiveness of health interventions is vital. The growth achieved so far by the NHS has to be sustained and made more effective by adapting programs to local communities in order to improve prevention, diagnosis and treatment delivery. Quite a few government organizations are entrusted with overseeing the implementation and auditing of preventive health intervention programs, and this study will focus on some of these evaluations. The National Institute for Health and Clinical Excellence (NICE) offers evidence-based practical guidelines for the implementation and assessment of such programs. The recently initiated NIHR Public Health Research program is designed to evaluate current programs and offer an evidence-based approach for the future. In particular, the NIHR research program focuses on practicalities and addresses issues such as social inequalities in the implementation of a health initiative (NHS, 2011). Similarly, NHS Health Scotland evaluates preventive interventions in that country and provides extensive reports on local programs. Furthermore, five independent research bodies, namely Fuse, DECIPHer, the Northern Ireland Centre of Excellence for Public Health, CEDAR and the UK Centre for Tobacco Control Studies, provide research-based evidence for the implementation and evaluation of public health programs (NOO, 2011). While some programs have been touted as vastly successful, others are found to be lacking in terms of practical results.
This study aims to take a closer look at these disparities and arrive at some improvements.

Research Question

How effective are health organizations in designing and implementing preventive health initiatives to control heart disease? Is the strategic drive by the UK government's health department to control cardiac diseases through preventive programs, such as anti-smoking campaigns, physical activity programs, and obesity and diabetes control programs, effective and bearing results? There are certainly differences of opinion regarding the answers to these questions. This research seeks to examine the relevant issues more closely and answer them.

Methodology

Research Strategy: This research is based on an objective, scientific methodology and proposes to use both primary and secondary sources. Articles from databases of published research material and government publications constitute the secondary sources (Wrenn et al., 2006). These would be obtained using appropriate keyword-based searches; the combination of keywords would constitute an effective exclusion criterion so that unrelated material could be avoided. For the primary sources of information, government agencies and people responsible for overseeing and evaluating health intervention programs would be contacted by phone and email. The questionnaires for these respondents would be designed on the basis of findings from the secondary research. In this way it is possible to ascertain whether the primary sources of information agree with the conclusions of previous research. Rating scales would be used to assess the effectiveness of health interventions, and semi-structured questionnaires would be employed to gather information directly from the persons who supervised these programs.
(Babbie, 2010) The advantage of using semi-structured questions is that they help extract more information about the success or failure of these programs.

Access/Ethical Issues

Telephone interviews and email-based questionnaires are the main access methods in this study. Since the people currently representing an organization would be contacted, and their opinions about current and past health intervention programs gathered, an ethical predicament arises: the opinions of people in charge of these programs might jeopardize their careers with the organization if their views were unintentionally leaked. A confidentiality agreement would therefore be necessary to encourage people to talk openly and disclose information related to the health projects.

Limitations of the Research

It takes a long time for large-scale projects to take effect, and hence the evaluation of health projects may have to be undertaken over an extended period. In particular, since some of the new cardiovascular health programs such as 'Destination 2020' have only just commenced, it would not be possible to assess the effects of these and other such new initiatives in this study.

Conclusion

This proposal outlines a research question concerned with the evaluation of health projects aimed at preventing cardiovascular and related diseases in the UK. The literature review draws on reports from government organizations and clearly highlights the immediate need for intervention and a strategic focus for controlling cardiac health problems across the country. The research methodology is suited to evaluating these programs and analyzing their impact. The results would help in adapting policies so that the ideology of preventive care could be translated into a pragmatically feasible approach.
Time Chart

Activity / Time Scale: Research Design Planning; Review of Literature; Research Objectives; Preparation of Questionnaires; Contact Primary Sources; Survey; Analysis of Data; Draft of Dissertation; Final Dissertation.

References

Babbie, E. R. (2010): The Practice of Social Research (12th edition). Cengage Learning, Belmont, CA.
British Heart Foundation (2009): Destination 2020: A Plan for Cardiac and Vascular Health. British Heart Foundation, London.
British Heart Foundation (2010): Coronary Heart Disease Statistics: Behavioural Risk Factors. University of Oxford.
Department of Health (2007): The Coronary Heart Disease National Service Framework: Building for the Future – Progress Report for 2007.
Department of Health (2008): Improving Health and Work: Changing Lives. Crown Publications.
UK National Obesity Observatory (2011): 'Evaluation Websites'. Viewed 16 January 2012.
NHS (2011): 'Research to improve the health of the public and reduce inequalities in health'. Viewed 16 January 2011, http://www.phr.ac.uk/
Allender, S., Foster, C., Scarborough, P. and Rayner, M. (2007): The Burden of Physical Activity Related Ill Health in the UK. Journal of Epidemiology and Community Health, 61: 344-348.
Wrenn, B., Stevens, R. E. and Loudon, L. (2006): Marketing Research: Text and Cases (2nd edition). Routledge, UK.

Saturday, November 9, 2019

Recovery System in DBMS

Recovery System in DBMS – Presentation Transcript

1. Chapter 17: Recovery System
* Failure Classification
* Storage Structure
* Recovery and Atomicity
* Log-Based Recovery
* Shadow Paging
* Recovery With Concurrent Transactions
* Buffer Management
* Failure with Loss of Nonvolatile Storage
* Advanced Recovery Techniques
* ARIES Recovery Algorithm
* Remote Backup Systems

2. Failure Classification
* Transaction failure:
  * Logical errors: the transaction cannot complete due to some internal error condition
  * System errors: the database system must terminate an active transaction due to an error condition (e.g., deadlock)
* System crash: a power failure or other hardware or software failure causes the system to crash.
  * Fail-stop assumption: non-volatile storage contents are assumed not to be corrupted by a system crash
  * Database systems have numerous integrity checks to prevent corruption of disk data
* Disk failure: a head crash or similar disk failure destroys all or part of disk storage
  * Destruction is assumed to be detectable: disk drives use checksums to detect failures

3. Recovery Algorithms
* Recovery algorithms are techniques to ensure database consistency and transaction atomicity and durability despite failures
  * Focus of this chapter
* Recovery algorithms have two parts:
  * Actions taken during normal transaction processing to ensure enough information exists to recover from failures
  * Actions taken after a failure to recover the database contents to a state that ensures atomicity, consistency and durability

4. Storage Structure
* Volatile storage:
  * does not survive system crashes
  * examples: main memory, cache memory
* Nonvolatile storage: survives system crashes
  * examples: disk, tape, flash memory, non-volatile (battery-backed) RAM
* Stable storage:
  * a mythical form of storage that survives all failures
  * approximated by maintaining multiple copies on distinct nonvolatile media

5.
Stable-Storage Implementation
* Maintain multiple copies of each block on separate disks
  * copies can be at remote sites to protect against disasters such as fire or flooding
* Failure during data transfer can still result in inconsistent copies: a block transfer can result in
  * Successful completion
  * Partial failure: destination block has incorrect information
  * Total failure: destination block was never updated
* Protecting storage media from failure during data transfer (one solution):
  * Execute the output operation as follows (assuming two copies of each block):
    * Write the information onto the first physical block.
    * When the first write successfully completes, write the same information onto the second physical block.
    * The output is completed only after the second write successfully completes.

6. Stable-Storage Implementation (Cont.)
* Protecting storage media from failure during data transfer (cont.):
* Copies of a block may differ due to failure during an output operation. To recover from failure:
  * First find inconsistent blocks:
    * Expensive solution: compare the two copies of every disk block.
    * Better solution: record in-progress disk writes on non-volatile storage (non-volatile RAM or a special area of disk); use this information during recovery to find blocks that may be inconsistent, and compare only copies of these. Used in hardware RAID systems.
  * If either copy of an inconsistent block is detected to have an error (bad checksum), overwrite it with the other copy. If both have no error but are different, overwrite the second block with the first block.

7. Data Access
* Physical blocks are those blocks residing on the disk.
* Buffer blocks are the blocks residing temporarily in main memory.
* Block movements between disk and main memory are initiated through the following two operations:
  * input(B) transfers the physical block B to main memory.
  * output(B) transfers the buffer block B to the disk, and replaces the appropriate physical block there.
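As an illustration, the two-copy output protocol described above can be sketched in a few lines of Python. This is a toy model, not a real storage driver; the class and attribute names are purely illustrative:

```python
class StableBlock:
    """Toy model of one logical block kept as two physical copies."""

    def __init__(self, data: bytes):
        self.copy1 = data
        self.copy2 = data

    def output(self, data: bytes):
        # 1. Write the information onto the first physical copy.
        self.copy1 = data
        # (A crash here leaves copy2 holding the old, consistent value.)
        # 2. Only after the first write completes, write the second copy.
        self.copy2 = data
        # The output is complete only after the second write finishes.

    def recover(self):
        # After a crash: if the copies differ (and copy1 passes its
        # checksum), overwrite the second copy with the first.
        if self.copy1 != self.copy2:
            self.copy2 = self.copy1


blk = StableBlock(b"old")
blk.copy1 = b"new"   # simulate a crash between the two writes
blk.recover()
assert blk.copy1 == blk.copy2 == b"new"
```

The ordering is the whole point: because the second write starts only after the first completes, at most one copy can ever be inconsistent, so recovery always has a good copy to repair from.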
* Each transaction Ti has its private work area in which local copies of all data items accessed and updated by it are kept.
  * Ti's local copy of a data item X is called xi.
* We assume, for simplicity, that each data item fits in, and is stored inside, a single block.

8. Data Access (Cont.)
* A transaction transfers data items between system buffer blocks and its private work area using the following operations:
  * read(X) assigns the value of data item X to the local variable xi.
  * write(X) assigns the value of local variable xi to data item X in the buffer block.
  * Both these commands may necessitate the issue of an input(BX) instruction before the assignment, if the block BX in which X resides is not already in memory.
* Transactions
  * perform read(X) while accessing X for the first time;
  * all subsequent accesses are to the local copy;
  * after the last access, the transaction executes write(X).
* output(BX) need not immediately follow write(X). The system can perform the output operation when it deems fit.

9. Example of Data Access
(Figure: buffer blocks A and B in memory, with input(A) and output(B) between disk and buffer, and read(X)/write(Y) between the buffer and the work areas of T1 and T2, holding local copies x1, x2, y1.)

10. Recovery and Atomicity
* Modifying the database without ensuring that the transaction will commit may leave the database in an inconsistent state.
* Consider a transaction Ti that transfers $50 from account A to account B; the goal is either to perform all database modifications made by Ti or none at all.
* Several output operations may be required for Ti (to output A and B). A failure may occur after one of these modifications has been made but before all of them are made.

11. Recovery and Atomicity (Cont.)
* To ensure atomicity despite failures, we first output information describing the modifications to stable storage without modifying the database itself.
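A minimal sketch of these data-access operations, with plain dictionaries standing in for disk blocks, buffer blocks, and the transaction's private work area (all names here are assumptions for illustration):

```python
disk = {"BX": {"X": 100, "Y": 7}}   # physical blocks on disk
buffer = {}                          # buffer blocks in main memory
work = {}                            # Ti's private work area (local copies xi)

def input_block(b):
    """input(B): transfer physical block B into main memory."""
    buffer[b] = dict(disk[b])

def output_block(b):
    """output(B): transfer buffer block B to disk, replacing the physical block."""
    disk[b] = dict(buffer[b])

def read(x, b):
    """read(X): assign the value of data item X to the local variable xi."""
    if b not in buffer:
        input_block(b)   # implicit input(BX) if the block is not resident
    work[x] = buffer[b][x]

def write(x, b):
    """write(X): assign local variable xi to data item X in the buffer block."""
    if b not in buffer:
        input_block(b)
    buffer[b][x] = work[x]

read("X", "BX")      # first access: read into the work area
work["X"] -= 50      # all subsequent accesses use the local copy
write("X", "BX")     # after the last access, write back to the buffer
output_block("BX")   # output(BX) may happen later, whenever the system deems fit
assert disk["BX"]["X"] == 50
```

Note that the update becomes durable only at `output_block`; everything before that lives in volatile memory, which is exactly why the recovery machinery in the following slides is needed.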
* We study two approaches:
  * log-based recovery, and
  * shadow paging
* We assume (initially) that transactions run serially, that is, one after the other.

12. Log-Based Recovery
* A log is kept on stable storage.
  * The log is a sequence of log records, and maintains a record of update activities on the database.
* When transaction Ti starts, it registers itself by writing a <Ti start> log record.
* Before Ti executes write(X), a log record <Ti, X, V1, V2> is written, where V1 is the value of X before the write, and V2 is the value to be written to X.
  * The log record notes that Ti has performed a write on data item X; X had value V1 before the write, and will have value V2 after the write.
* When Ti finishes its last statement, the log record <Ti commit> is written.
* We assume for now that log records are written directly to stable storage (that is, they are not buffered).
* Two approaches using logs:
  * Deferred database modification
  * Immediate database modification

13. Deferred Database Modification
* The deferred database modification scheme records all modifications to the log, but defers all the writes to after partial commit.
* Assume that transactions execute serially.
* A transaction starts by writing a <Ti start> record to the log.
* A write(X) operation results in a log record <Ti, X, V> being written, where V is the new value for X.
  * Note: the old value is not needed for this scheme.
* The write is not performed on X at this time, but is deferred.
* When Ti partially commits, <Ti commit> is written to the log.
* Finally, the log records are read and used to actually execute the previously deferred writes.

14. Deferred Database Modification (Cont.)
* During recovery after a crash, a transaction needs to be redone if and only if both <Ti start> and <Ti commit> are in the log.
* Redoing a transaction Ti (redo Ti) sets the value of all data items updated by the transaction to the new values.
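Under the deferred scheme, recovery reduces to a single rule: redo Ti iff both <Ti start> and <Ti commit> appear in the log. A sketch of that, with log records modeled as tuples (an illustrative format, not any real DBMS's):

```python
def recover_deferred(log, db):
    """Redo every write of a committed transaction; ignore the rest."""
    committed = {rec[1] for rec in log if rec[0] == "commit"}
    for rec in log:
        if rec[0] == "write" and rec[1] in committed:
            _, _, x, new_value = rec
            db[x] = new_value   # redo: set the item to its new value
    return db

log = [
    ("start", "T0"), ("write", "T0", "A", 950),
    ("write", "T0", "B", 2050), ("commit", "T0"),
    ("start", "T1"), ("write", "T1", "C", 600),   # T1 never committed
]
db = {"A": 1000, "B": 2000, "C": 700}
recover_deferred(log, db)
assert db == {"A": 950, "B": 2050, "C": 700}   # T1's deferred write is dropped
```

No undo pass is needed because deferred writes never touched the database before commit; uncommitted work simply never happened.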
* Crashes can occur while
  * the transaction is executing the original updates, or
  * recovery action is being taken
* Example transactions T0 and T1 (T0 executes before T1):
  * T0: read(A); A := A - 50; write(A); read(B); B := B + 50; write(B)
  * T1: read(C); C := C - 100; write(C)

15. Deferred Database Modification (Cont.)
* Below we show the log as it appears at three instants of time.
* If the log on stable storage at the time of the crash is as in case:
  * (a) no redo actions need to be taken
  * (b) redo(T0) must be performed, since <T0 commit> is present
  * (c) redo(T0) must be performed followed by redo(T1), since <T0 commit> and <T1 commit> are present

16. Immediate Database Modification
* The immediate database modification scheme allows database updates of an uncommitted transaction to be made as the writes are issued.
  * Since undoing may be needed, update log records must have both the old value and the new value.
* The update log record must be written before the database item is written.
  * We assume that the log record is output directly to stable storage.
  * This can be extended to postpone log record output, so long as, prior to execution of an output(B) operation for a data block B, all log records corresponding to items in B are flushed to stable storage.
* Output of updated blocks can take place at any time before or after transaction commit.
* The order in which blocks are output can be different from the order in which they are written.

17. Immediate Database Modification Example
  Log                    Write       Output
  <T0 start>
  <T0, A, 1000, 950>
  <T0, B, 2000, 2050>    A = 950
                         B = 2050
  <T0 commit>
  <T1 start>
  <T1, C, 700, 600>      C = 600
                                     BB, BC
  <T1 commit>
                                     BA
* Note: BX denotes the block containing X.

18. Immediate Database Modification (Cont.)
* The recovery procedure has two operations instead of one:
  * undo(Ti) restores the value of all data items updated by Ti to their old values, going backwards from the last log record for Ti
  * redo(Ti) sets the value of all data items updated by Ti to the new values, going forward from the first log record for Ti
* Both operations must be idempotent
  * That is, even if the operation is executed multiple times, the effect is the same as if it were executed once
  * Needed since operations may get re-executed during recovery
* When recovering after failure:
  * Transaction Ti needs to be undone if the log contains the record <Ti start>, but does not contain the record <Ti commit>.
  * Transaction Ti needs to be redone if the log contains both the record <Ti start> and the record <Ti commit>.
* Undo operations are performed first, then redo operations.

19. Immediate DB Modification Recovery Example
* Below we show the log as it appears at three instants of time.
* Recovery actions in each case are:
  * (a) undo(T0): B is restored to 2000 and A to 1000.
  * (b) undo(T1) and redo(T0): C is restored to 700, and then A and B are set to 950 and 2050 respectively.
  * (c) redo(T0) and redo(T1): A and B are set to 950 and 2050 respectively; then C is set to 600.

20. Checkpoints
* Problems in the recovery procedure as discussed earlier:
  * searching the entire log is time-consuming
  * we might unnecessarily redo transactions which have already output their updates to the database
* Streamline the recovery procedure by periodically performing checkpointing:
  * Output all log records currently residing in main memory onto stable storage.
  * Output all modified buffer blocks to the disk.
  * Write a log record <checkpoint> onto stable storage.

21. Checkpoints (Cont.)
* During recovery we need to consider only the most recent transaction Ti that started before the checkpoint, and transactions that started after Ti.
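The undo-then-redo procedure of the immediate-modification scheme can be sketched as follows, again with illustrative tuple log records carrying both old and new values:

```python
def recover_immediate(log, db):
    committed = {rec[1] for rec in log if rec[0] == "commit"}
    # Undo first: scan backwards, restoring old values of uncommitted txns.
    for rec in reversed(log):
        if rec[0] == "update" and rec[1] not in committed:
            _, _, x, old, new = rec
            db[x] = old
    # Then redo: scan forwards, applying new values of committed txns.
    for rec in log:
        if rec[0] == "update" and rec[1] in committed:
            _, _, x, old, new = rec
            db[x] = new
    return db

# Case (b) from the example: crash after <T1, C, 700, 600>, before <T1 commit>.
log = [
    ("start", "T0"), ("update", "T0", "A", 1000, 950),
    ("update", "T0", "B", 2000, 2050), ("commit", "T0"),
    ("start", "T1"), ("update", "T1", "C", 700, 600),
]
db = {"A": 950, "B": 2050, "C": 600}   # updates were applied immediately
recover_immediate(log, db)
assert db == {"A": 950, "B": 2050, "C": 700}   # undo(T1), then redo(T0)
```

Both passes are idempotent, as the slide requires: restoring an old value or re-applying a new value a second time leaves the database unchanged, so the procedure survives a crash during recovery itself.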
* Scan backwards from the end of the log to find the most recent <checkpoint> record.
* Continue scanning backwards till a record <Ti start> is found.
* Need only consider the part of the log following the above start record. The earlier part of the log can be ignored during recovery, and can be erased whenever desired.
* For all transactions (starting from Ti or later) with no <Ti commit>, execute undo(Ti). (Done only in case of immediate modification.)
* Scanning forward in the log, for all transactions starting from Ti or later with a <Ti commit>, execute redo(Ti).

22. Example of Checkpoints
* T1 can be ignored (updates already output to disk due to the checkpoint)
* T2 and T3 redone
* T4 undone
(Figure: transactions T1–T4 relative to the checkpoint time Tc and the system failure time Tf.)

23. Shadow Paging
* Shadow paging is an alternative to log-based recovery; this scheme is useful if transactions execute serially.
* Idea: maintain two page tables during the lifetime of a transaction – the current page table and the shadow page table.
* Store the shadow page table in nonvolatile storage, such that the state of the database prior to transaction execution may be recovered.
  * The shadow page table is never modified during execution.
* To start with, both page tables are identical. Only the current page table is used for data item accesses during execution of the transaction.
* Whenever any page is about to be written for the first time:
  * A copy of this page is made onto an unused page.
  * The current page table is then made to point to the copy.
  * The update is performed on the copy.

24. Sample Page Table

25. Example of Shadow Paging
(Figure: shadow and current page tables after a write to page 4.)

26. Shadow Paging (Cont.)
* To commit a transaction:
  * 1. Flush all modified pages in main memory to disk.
  * 2. Output the current page table to disk.
  * 3. Make the current page table the new shadow page table, as follows:
    * keep a pointer to the shadow page table at a fixed (known) location on disk.
    * to make the current page table the new shadow page table, simply update the pointer to point to the current page table on disk.
* Once the pointer to the shadow page table has been written, the transaction is committed.
* No recovery is needed after a crash — new transactions can start right away, using the shadow page table.
* Pages not pointed to from the current/shadow page table should be freed (garbage collected).

27. Shadow Paging (Cont.)
* Advantages of shadow paging over log-based schemes:
  * no overhead of writing log records
  * recovery is trivial
* Disadvantages:
  * Copying the entire page table is very expensive.
    * Can be reduced by using a page table structured like a B+-tree: no need to copy the entire tree, only the paths in the tree that lead to updated leaf nodes.
  * Commit overhead is high even with the above extension: need to flush every updated page, and the page table.
  * Data gets fragmented (related pages get separated on disk).
  * After every transaction completion, the database pages containing old versions of modified data need to be garbage collected.
  * Hard to extend the algorithm to allow transactions to run concurrently; easier to extend log-based schemes.

28. Recovery With Concurrent Transactions
* We modify the log-based recovery schemes to allow multiple transactions to execute concurrently.
  * All transactions share a single disk buffer and a single log.
  * A buffer block can have data items updated by one or more transactions.
* We assume concurrency control using strict two-phase locking;
  * i.e., the updates of uncommitted transactions should not be visible to other transactions.
  * Otherwise, how would we perform undo if T1 updates A, then T2 updates A and commits, and finally T1 has to abort?
* Logging is done as described earlier. Log records of different transactions may be interspersed in the log.
* The checkpointing technique and the actions taken on recovery have to be changed, since several transactions may be active when a checkpoint is performed.
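A toy copy-on-write model of the shadow-paging scheme above. In-memory dictionaries stand in for disk pages and the two page tables, and `commit` models the atomic update of the on-disk shadow-table pointer; everything here is a sketch, not a real pager:

```python
class ShadowPagedDB:
    def __init__(self, pages):
        self.disk = dict(pages)              # page number -> contents
        self.shadow = {n: n for n in pages}  # shadow table (never modified)
        self.current = dict(self.shadow)     # current page table
        self.next_free = max(pages) + 1      # next unused page number

    def write(self, page, data):
        if self.current[page] == self.shadow[page]:
            # First write to this page: copy it onto an unused page and
            # point the current page table at the copy.
            new = self.next_free
            self.next_free += 1
            self.disk[new] = self.disk[self.current[page]]
            self.current[page] = new
        self.disk[self.current[page]] = data   # update the copy

    def read(self, page):
        return self.disk[self.current[page]]

    def commit(self):
        # Models step 3: atomically install current as the new shadow table.
        self.shadow = dict(self.current)

    def abort(self):
        # Recovery is trivial: fall back to the untouched shadow table.
        self.current = dict(self.shadow)


db = ShadowPagedDB({1: "a", 2: "b"})
db.write(1, "x")
assert db.read(1) == "x"
db.abort()                     # crash before commit: shadow table survives
assert db.read(1) == "a"
```

The shadow table is never touched during execution, which is exactly why recovery costs nothing; the price, as the slide notes, is paid at commit (flushing pages and the page table) and in fragmentation.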
29. Recovery With Concurrent Transactions (Cont.)
* Checkpoints are performed as before, except that the checkpoint log record is now of the form <checkpoint L>, where L is the list of transactions active at the time of the checkpoint.
  * We assume no updates are in progress while the checkpoint is carried out (will relax this later).
* When the system recovers from a crash, it first does the following:
  * Initialize undo-list and redo-list to empty.
  * Scan the log backwards from the end, stopping when the first <checkpoint L> record is found. For each record found during the backward scan:
    * if the record is <Ti commit>, add Ti to redo-list
    * if the record is <Ti start>, then if Ti is not in redo-list, add Ti to undo-list
  * For every Ti in L, if Ti is not in redo-list, add Ti to undo-list.

30. Recovery With Concurrent Transactions (Cont.)
* At this point undo-list consists of incomplete transactions which must be undone, and redo-list consists of finished transactions that must be redone.
* Recovery now continues as follows:
  * Scan the log backwards from the most recent record, stopping when <Ti start> records have been encountered for every Ti in undo-list.
    * During the scan, perform undo for each log record that belongs to a transaction in undo-list.
  * Locate the most recent <checkpoint L> record.
  * Scan the log forwards from the <checkpoint L> record till the end of the log.
    * During the scan, perform redo for each log record that belongs to a transaction on redo-list.

31. Example of Recovery
* Go over the steps of the recovery algorithm on the following log:
  <T0 start>
  <T0, A, 0, 10>
  <T0 commit>
  <T1 start>
  <T1, B, 0, 10>
  <T2 start>          /* Scan in Step 4 stops here */
  <T2, C, 0, 10>
  <T2, C, 10, 20>
  <checkpoint {T1, T2}>
  <T3 start>
  <T3, A, 10, 20>
  <T3, D, 0, 10>
  <T3 commit>
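The first phase of this algorithm, building the undo-list and redo-list, can be sketched as follows; the driver below reproduces the slide's example log, with records again modeled as illustrative tuples:

```python
def build_lists(log):
    """Backward scan to the most recent <checkpoint L>; classify txns."""
    cp = max(i for i, rec in enumerate(log) if rec[0] == "checkpoint")
    undo_list, redo_list = [], []
    for rec in reversed(log[cp + 1:]):        # backward scan to checkpoint
        if rec[0] == "commit":
            redo_list.append(rec[1])
        elif rec[0] == "start" and rec[1] not in redo_list:
            undo_list.append(rec[1])
    for t in log[cp][1]:                      # L: active at the checkpoint
        if t not in redo_list and t not in undo_list:
            undo_list.append(t)
    return undo_list, redo_list

log = [
    ("start", "T0"), ("update", "T0", "A", 0, 10), ("commit", "T0"),
    ("start", "T1"), ("update", "T1", "B", 0, 10),
    ("start", "T2"), ("update", "T2", "C", 0, 10),
    ("update", "T2", "C", 10, 20),
    ("checkpoint", ["T1", "T2"]),
    ("start", "T3"), ("update", "T3", "A", 10, 20),
    ("update", "T3", "D", 0, 10), ("commit", "T3"),
]
undo_list, redo_list = build_lists(log)
assert redo_list == ["T3"]
assert sorted(undo_list) == ["T1", "T2"]   # active at checkpoint, never committed
```

T0 is correctly ignored: it committed before the checkpoint, so its updates are already safely on disk and appear in neither list.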
32. Log Record Buffering
* Log record buffering: log records are buffered in main memory, instead of being output directly to stable storage.
  * Log records are output to stable storage when a block of log records in the buffer is full, or a log force operation is executed.
* Log force is performed to commit a transaction by forcing all its log records (including the commit record) to stable storage.
* Several log records can thus be output using a single output operation, reducing the I/O cost.

33. Log Record Buffering (Cont.)
* The rules below must be followed if log records are buffered:
  * Log records are output to stable storage in the order in which they are created.
  * Transaction Ti enters the commit state only when the log record <Ti commit> has been output to stable storage.
  * Before a block of data in main memory is output to the database, all log records pertaining to data in that block must have been output to stable storage.
    * This rule is called the write-ahead logging or WAL rule.
    * Strictly speaking, WAL only requires undo information to be output.

34. Database Buffering
* The database maintains an in-memory buffer of data blocks.
  * When a new block is needed, if the buffer is full an existing block needs to be removed from the buffer.
  * If the block chosen for removal has been updated, it must be output to disk.
* As a result of the write-ahead logging rule, if a block with uncommitted updates is output to disk, log records with undo information for the updates are output to the log on stable storage first.
* No updates should be in progress on a block when it is output to disk. This can be ensured as follows:
  * Before writing a data item, the transaction acquires an exclusive lock on the block containing the data item.
  * The lock can be released once the write is completed.
  * Such locks held for a short duration are called latches.
  * Before a block is output to disk, the system acquires an exclusive latch on the block, ensuring no update can be in progress on the block.
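The WAL rule reduces to a one-line check once log records are ordered. Here LSNs (log sequence numbers, a common bookkeeping device assumed for illustration, not introduced by the slides) give that ordering:

```python
def can_output_block(block_last_update_lsn: int, flushed_lsn: int) -> bool:
    """WAL rule: a buffer block may go to disk only if every log record
    describing its updates (up to the block's last-update LSN) has
    already been flushed to stable storage."""
    return block_last_update_lsn <= flushed_lsn

# Suppose the log has been flushed through LSN 40:
assert can_output_block(block_last_update_lsn=37, flushed_lsn=40)      # OK to write
assert not can_output_block(block_last_update_lsn=45, flushed_lsn=40)  # flush log first
```

When the check fails, the buffer manager forces the log (advancing `flushed_lsn`) before evicting the block, which is exactly the behavior slide 34 describes for blocks holding uncommitted updates.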
Buffer Management (Cont.)
* Database buffer can be implemented either
* in an area of real main memory reserved for the database, or
* in virtual memory
* Implementing the buffer in reserved main memory has drawbacks:
* Memory is partitioned beforehand between database buffer and applications, limiting flexibility.
* Needs may change, and although the operating system knows best how memory should be divided up at any time, it cannot change the partitioning of memory.
36. Buffer Management (Cont.)
* Database buffers are generally implemented in virtual memory in spite of some drawbacks:
* When the operating system needs to evict a page that has been modified, to make space for another page, the page is written to swap space on disk.
* When the database decides to write a buffer page to disk, the buffer page may be in swap space, and may have to be read from swap space on disk and output to the database on disk, resulting in extra I/O!
* Known as the dual paging problem.
* Ideally, when swapping out a database buffer page, the operating system should pass control to the database, which in turn outputs the page to the database instead of to swap space (making sure to output log records first)
* Dual paging can thus be avoided, but common operating systems do not support such functionality.
37. Failure with Loss of Nonvolatile Storage
* So far we assumed no loss of non-volatile storage
* A technique similar to checkpointing is used to deal with loss of non-volatile storage:
* Periodically dump the entire content of the database to stable storage
* No transaction may be active during the dump procedure; a procedure similar to checkpointing must take place:
* Output all log records currently residing in main memory onto stable storage.
* Output all buffer blocks onto the disk.
* Copy the contents of the database to stable storage.
* Output a record <dump> to the log on stable storage.
* To recover from disk failure:
* restore the database from the most recent dump.
* Consult the log and redo all transactions that committed after the dump
* Can be extended to allow transactions to be active during the dump; known as fuzzy dump or online dump
* Will study fuzzy checkpointing later
38. Advanced Recovery Algorithm
39. Advanced Recovery Techniques
* Support high-concurrency locking techniques, such as those used for B+-tree concurrency control
* Operations like B+-tree insertions and deletions release locks early.
* They cannot be undone by restoring old values (physical undo), since once a lock is released, other transactions may have updated the B+-tree.
* Instead, insertions (resp. deletions) are undone by executing a deletion (resp. insertion) operation (known as logical undo).
* For such operations, undo log records should contain the undo operation to be executed
* called logical undo logging, in contrast to physical undo logging.
* Redo information is logged physically (that is, new value for each write) even for such operations
* Logical redo is very complicated since the database state on disk may not be "operation consistent"
40. Advanced Recovery Techniques (Cont.)
* Operation logging is done as follows:
* When the operation starts, log <Ti, Oj, operation-begin>. Here Oj is a unique identifier of the operation instance.
* While the operation is executing, normal log records with physical redo and physical undo information are logged.
* When the operation completes, <Ti, Oj, operation-end, U> is logged, where U contains the information needed to perform a logical undo.
* If a crash/rollback occurs before the operation completes:
* the operation-end log record is not found, and
* the physical undo information is used to undo the operation.
* If a crash/rollback occurs after the operation completes:
* the operation-end log record is found, and in this case
* logical undo is performed using U; the physical undo information for the operation is ignored.
* Redo of the operation (after a crash) still uses the physical redo information.
41. Advanced Recovery Techniques (Cont.)
* Rollback of transaction Ti is done as follows:
* Scan the log backwards
* If a log record <Ti, X, V1, V2> is found, perform the undo and log a special redo-only log record <Ti, X, V1>.
* If a <Ti, Oj, operation-end, U> record is found:
* Roll back the operation logically using the undo information U.
* Updates performed during rollback are logged just like during normal operation execution.
* At the end of the operation rollback, instead of logging an operation-end record, generate a record <Ti, Oj, operation-abort>.
* Skip all preceding log records for Ti until the record <Ti, Oj, operation-begin> is found
42. Advanced Recovery Techniques (Cont.)
* Scan the log backwards (cont.):
* If a redo-only record is found, ignore it
* If a <Ti, Oj, operation-abort> record is found:
* skip all preceding log records for Ti until the record <Ti, Oj, operation-begin> is found.
* Stop the scan when the record <Ti, start> is found
* Add a <Ti, abort> record to the log
* Some points to note:
* Cases 3 and 4 above can occur only if the database crashes while a transaction is being rolled back.
* Skipping of log records as in case 4 is important to prevent multiple rollback of the same operation.
43. Advanced Recovery Techniques (Cont.)
* The following actions are taken when recovering from a system crash:
* Scan the log forward from the last <checkpoint L> record
* Repeat history by physically redoing all updates of all transactions
* Create an undo-list during the scan as follows:
* undo-list is set to L initially
* Whenever <Ti start> is found, Ti is added to undo-list
* Whenever <Ti commit> or <Ti abort> is found, Ti is deleted from undo-list
* This brings the database to the state as of the crash, with committed as well as uncommitted transactions having been redone.
* Now undo-list contains transactions that are incomplete, that is, have neither committed nor been fully rolled back.
44. Advanced Recovery Techniques (Cont.)
* Recovery from system crash (cont.):
* Scan the log backwards, performing undo on log records of transactions found in undo-list.
* Transactions are rolled back as described earlier.
* When <Ti start> is found for a transaction Ti in undo-list, write a <Ti abort> log record.
* Stop the scan when <Ti start> records have been found for all Ti in undo-list
* This undoes the effects of incomplete transactions (those with neither commit nor abort log records). Recovery is now complete.
45. Advanced Recovery Techniques (Cont.)
* Checkpointing is done as follows:
* Output all log records in memory to stable storage
* Output to disk all modified buffer blocks
* Output to the log on stable storage a <checkpoint L> record.
* Transactions are not allowed to perform any actions while checkpointing is in progress.
* Fuzzy checkpointing allows transactions to progress while the most time-consuming parts of checkpointing are in progress
* Performed as described on the next slide
46. Advanced Recovery Techniques (Cont.)
* Fuzzy checkpointing is done as follows:
* Temporarily stop all updates by transactions
* Write a <checkpoint L> log record and force the log to stable storage
* Note the list M of modified buffer blocks
* Now permit transactions to proceed with their actions
* Output to disk all modified buffer blocks in list M
* blocks should not be updated while being output
* Follow WAL: all log records pertaining to a block must be output before the block is output
* Store a pointer to the checkpoint record in a fixed position last_checkpoint on disk
* When recovering using a fuzzy checkpoint, start the scan from the checkpoint record pointed to by last_checkpoint
* Log records before last_checkpoint have their updates reflected in the database on disk, and need not be redone.
* Incomplete checkpoints, where the system had crashed while performing a checkpoint, are handled safely
47. ARIES Recovery Algorithm
48. ARIES
* ARIES is a state-of-the-art recovery method
* Incorporates numerous optimizations to reduce overheads during normal processing and to speed up recovery
* The "advanced recovery algorithm" we studied earlier is modeled after ARIES, but greatly simplified by removing optimizations
* Unlike the advanced recovery algorithm, ARIES:
* Uses log sequence numbers (LSNs) to identify log records
* Stores LSNs in pages to identify what updates have already been applied to a database page
* Physiological redo
* Dirty page table to avoid unnecessary redos during recovery
* Fuzzy checkpointing that only records information about dirty pages, and does not require dirty pages to be written out at checkpoint time
* More coming up on each of the above ...
49. ARIES Optimizations
* Physiological redo
* Affected page is physically identified, action within the page can be logical
* Used to reduce logging overheads
* e.g. when a record is deleted and all other records have to be moved to fill the hole
* Physiological redo can log just the record deletion
* Physical redo would require logging of old and new values for much of the page
* Requires the page to be output to disk atomically
* Easy to achieve with hardware RAID, also supported by some disk systems
* Incomplete page output can be detected by checksum techniques,
* But extra actions are required for recovery
* Treated as a media failure
50.
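The contrast between physical and physiological logging of a record deletion can be sketched as follows. The page layout (a list of records) and the function names are purely illustrative assumptions, not part of ARIES.

```python
# Sketch: logging a record deletion within a page two ways.
# Physical logging captures old and new page contents; physiological
# logging identifies the page physically but logs only the logical action.

def physical_delete_log(page, idx):
    # Physical: old and new values for much of the page must be logged
    old = list(page)
    new = old[:idx] + old[idx + 1:]
    return {"kind": "physical", "old": old, "new": new}

def physiological_delete_log(page_id, idx):
    # Physiological: page identified physically, action within it is logical
    return {"kind": "physiological", "page": page_id, "action": ("delete", idx)}

def apply_physiological(page, rec):
    # Redo replays the logical action against the identified page
    op, idx = rec["action"]
    if op == "delete":
        del page[idx]
    return page

page = ["r0", "r1", "r2", "r3"]
rec = physiological_delete_log("P1", 1)
apply_physiological(page, rec)
phys_rec = physical_delete_log(["r0", "r1", "r2", "r3"], 1)
```

The physiological record stores a single index, while the physical record duplicates most of the page, which is exactly the logging overhead the slide says physiological redo avoids; the cost is that the page must be written to disk atomically.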
ARIES Data Structures
* Log sequence number (LSN) identifies each log record
* Must be sequentially increasing
* Typically an offset from the beginning of the log file to allow fast access
* Easily extended to handle multiple log files
* Each page contains a PageLSN, which is the LSN of the last log record whose effects are reflected on the page
* To update a page:
* X-latch the page, and write the log record
* Update the page
* Record the LSN of the log record in PageLSN
* Unlock the page
* Page flush to disk S-latches the page
* Thus the page state on disk is operation consistent
* Required to support physiological redo
* PageLSN is used during recovery to prevent repeated redo
* Thus ensuring idempotence
51. ARIES Data Structures (Cont.)
* Each log record contains the LSN of the previous log record of the same transaction
* LSN in the log record may be implicit
* A special redo-only log record called a compensation log record (CLR) is used to log actions taken during recovery that never need to be undone
* Also serves the role of the operation-abort log records used in the advanced recovery algorithm
* Has a field UndoNextLSN to note the next (earlier) record to be undone
* Records in between would have already been undone
* Required to avoid repeated undo of already undone actions
* (Record layouts: [LSN | TransId | PrevLSN | RedoInfo | UndoInfo]; for a CLR: [LSN | TransId | UndoNextLSN | RedoInfo])
52. ARIES Data Structures (Cont.)
* DirtyPageTable:
* List of pages in the buffer that have been updated
* Contains, for each such page:
* PageLSN of the page
* RecLSN, an LSN such that log records before this LSN have already been applied to the page version on disk
* Set to the current end of log when a page is inserted into the dirty page table (just before being updated)
* Recorded in checkpoints, helps to minimize redo work
* Checkpoint log record
* Contains:
* DirtyPageTable and list of active transactions
* For each active transaction, LastLSN, the LSN of the last log record written by the transaction
* A fixed position on disk notes the LSN of the last completed checkpoint log record
53. ARIES Recovery Algorithm
* ARIES recovery involves three passes:
* Analysis pass: Determines
* Which transactions to undo
* Which pages were dirty (disk version not up to date) at the time of crash
* RedoLSN: LSN from which redo should start
* Redo pass:
* Repeats history, redoing all actions from RedoLSN
* RecLSN and PageLSNs are used to avoid redoing actions already reflected on a page
* Undo pass:
* Rolls back all incomplete transactions
* Transactions whose abort was completed earlier are not undone
* Key idea: no need to undo these transactions; earlier undo actions were logged, and are redone as required
54. ARIES Recovery: Analysis
* Analysis pass:
* Starts from the last complete checkpoint log record
* Reads in DirtyPageTable from the log record
* Sets RedoLSN = min of RecLSNs of all pages in DirtyPageTable
* In case no pages are dirty, RedoLSN = checkpoint record's LSN
* Sets undo-list = list of transactions in the checkpoint log record
* Reads the LSN of the last log record for each transaction in undo-list from the checkpoint log record
* Scans forward from the checkpoint
* ... On next page ...
55. ARIES Recovery: Analysis (Cont.)
* Analysis pass (cont.):
* Scans forward from the checkpoint:
* If any log record is found for a transaction not in undo-list, adds the transaction to undo-list
* Whenever an update log record is found:
* If the page is not in DirtyPageTable, it is added with RecLSN set to the LSN of the update log record
* If a transaction-end log record is found, delete the transaction from undo-list
* Keeps track of the last log record for each transaction in undo-list
* May be needed for later undo
* At the end of the analysis pass:
* RedoLSN determines where to start the redo pass
* RecLSN for each page in DirtyPageTable is used to minimize redo work
* All transactions in undo-list need to be rolled back
56. ARIES Redo Pass
* Redo Pass: Repeats history by replaying every action not already reflected in the page on disk, as follows:
* Scans forward from RedoLSN. Whenever an update log record is found:
* If the page is not in DirtyPageTable, or the LSN of the log record is less than the RecLSN of the page in DirtyPageTable, then skip the log record
* Otherwise fetch the page from disk. If the PageLSN of the page fetched from disk is less than the LSN of the log record, redo the log record
* NOTE: if either test is negative, the effects of the log record have already appeared on the page. The first test avoids even fetching the page from disk!
57. ARIES Undo Actions
* When an undo is performed for an update log record:
* Generate a CLR containing the undo action performed (actions performed during undo are logged physically or physiologically).
* The CLR for record n is noted as n′ in the figure below
* Set UndoNextLSN of the CLR to the PrevLSN value of the update log record
* Arrows indicate the UndoNextLSN value
* ARIES supports partial rollback
* Used e.g. to handle deadlocks by rolling back just enough to release the required locks
* The figure indicates forward actions after partial rollbacks
* records 3 and 4 initially, later 5 and 6, then full rollback
* (Figure: log sequence 1 2 3 4 4′ 3′ 5 6 5′ 2′ 1′ 6′)
58.
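Two of the mechanisms just described, the redo-pass skip tests and the CLR-based undo chain, can be sketched together. This is a minimal illustration under assumed record layouts (dicts with `lsn`, `page`, `prev_lsn`, `undo_next_lsn` fields); it is not ARIES itself.

```python
# Sketch of (1) the two redo-pass tests: DirtyPageTable/RecLSN first,
# PageLSN only after fetching; and (2) undo-chain traversal that follows
# PrevLSN for ordinary records and UndoNextLSN for CLRs.

def should_redo(rec, dirty_page_table, fetch_page):
    """True only if the update's effects may be missing from the page."""
    entry = dirty_page_table.get(rec["page"])
    if entry is None or rec["lsn"] < entry["rec_lsn"]:
        return False                   # first test: page is not even fetched
    page = fetch_page(rec["page"])     # second test needs the PageLSN
    return page["page_lsn"] < rec["lsn"]

def undo_chain(log_by_lsn, start_lsn):
    """Return the LSNs actually undone for one transaction."""
    undone, lsn = [], start_lsn
    while lsn is not None:
        rec = log_by_lsn[lsn]
        if rec["type"] == "CLR":
            lsn = rec["undo_next_lsn"]   # skip records the CLR already undid
        else:
            undone.append(lsn)
            lsn = rec["prev_lsn"]
    return undone

dpt = {"P1": {"rec_lsn": 5}}
pages = {"P1": {"page_lsn": 7}}
fetched = []
def fetch(pid):
    fetched.append(pid)
    return pages[pid]

skip_clean = should_redo({"page": "P2", "lsn": 9}, dpt, fetch)  # not dirty
skip_old   = should_redo({"page": "P1", "lsn": 3}, dpt, fetch)  # lsn < RecLSN
redo_new   = should_redo({"page": "P1", "lsn": 8}, dpt, fetch)  # PageLSN 7 < 8

# T1 wrote LSNs 1, 2, 3; a partial rollback already undid 3 (CLR at LSN 4)
log_by_lsn = {
    1: {"type": "update", "prev_lsn": None},
    2: {"type": "update", "prev_lsn": 1},
    3: {"type": "update", "prev_lsn": 2},
    4: {"type": "CLR", "undo_next_lsn": 2},
}
undone = undo_chain(log_by_lsn, 4)
```

In the example, only the third redo check fetches the page, illustrating how the RecLSN test avoids disk I/O; and the undo chain starting at the CLR jumps straight to LSN 2, so the already-compensated record 3 is never undone twice.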
ARIES: Undo Pass
* Undo pass:
* Performs a backward scan on the log, undoing all transactions in undo-list
* The backward scan is optimized by skipping unneeded log records as follows:
* The next LSN to be undone for each transaction is set to the LSN of the last log record for the transaction found by the analysis pass.
* At each step pick the largest of these LSNs to undo, skip back to it and undo it
* After undoing a log record:
* For ordinary log records, set the next LSN to be undone for the transaction to the PrevLSN noted in the log record
* For compensation log records (CLRs), set the next LSN to be undone to the UndoNextLSN noted in the log record
* All intervening records are skipped since they would have been undone already
* Undos are performed as described earlier
59. Other ARIES Features
* Recovery Independence:
* Pages can be recovered independently of others
* E.g. if some disk pages fail, they can be recovered from a backup while other pages are being used
* Savepoints:
* Transactions can record savepoints and roll back to a savepoint
* Useful for complex transactions
* Also used to roll back just enough to release locks on deadlock
60. Other ARIES Features (Cont.)
* Fine-grained locking:
* Index concurrency algorithms that permit tuple-level locking on indices can be used
* These require logical undo, rather than physical undo, as in the advanced recovery algorithm
* Recovery optimizations, for example:
* Dirty page table can be used to prefetch pages during redo
* Out-of-order redo is possible:
* redo can be postponed on a page being fetched from disk, and performed when the page is fetched
* Meanwhile other log records can continue to be processed
61. Remote Backup Systems
62. Remote Backup Systems
* Remote backup systems provide high availability by allowing transaction processing to continue even if the primary site is destroyed.
63. Remote Backup Systems (Cont.)
* Detection of failure: the backup site must detect when the primary site has failed
* to distinguish primary site failure from link failure, maintain several communication links between the primary and the remote backup.
* Transfer of control:
* To take over control, the backup site first performs recovery using its copy of the database and all the log records it has received from the primary.
* Thus, completed transactions are redone and incomplete transactions are rolled back.
* When the backup site takes over processing, it becomes the new primary
* To transfer control back to the old primary when it recovers, the old primary must receive redo logs from the old backup and apply all updates locally.
64. Remote Backup Systems (Cont.)
* Time to recover: To reduce delay in takeover, the backup site periodically processes the redo log records (in effect, performing recovery from the previous database state), performs a checkpoint, and can then delete earlier parts of the log.
* Hot-Spare configuration permits very fast takeover:
* Backup continually processes redo log records as they arrive, applying the updates locally.
* When failure of the primary is detected, the backup rolls back incomplete transactions, and is ready to process new transactions.
* Alternative to remote backup: distributed database with replicated data
* Remote backup is faster and cheaper, but less tolerant to failure
* more on this in Chapter 19
65. Remote Backup Systems (Cont.)
* Ensure durability of updates by delaying transaction commit until the update is logged at the backup; avoid this delay by permitting lower degrees of durability.
* One-safe: commit as soon as the transaction's commit log record is written at the primary
* Problem: updates may not arrive at the backup before it takes over.
* Two-very-safe: commit when the transaction's commit log record is written at primary and backup
* Reduces availability since transactions cannot commit if either site fails.
* Two-safe: proceed as in two-very-safe if both primary and backup are active.
* If only the primary is active, the transaction commits as soon as its commit log record is written at the primary.
* Better availability than two-very-safe; avoids the problem of lost transactions in one-safe.
66. End of Chapter
67. Block Storage Operations
68. Portion of the Database Log Corresponding to T0 and T1
69. State of the Log and Database Corresponding to T0 and T1
70. Portion of the System Log Corresponding to T0 and T1
71. State of System Log and Database Corresponding to T0 and T1

Thursday, November 7, 2019

Logical Fallacies Essay

The argument chosen for this analysis is "Guns in America: freedom from the fear of firearms" by Chad R. MacDonald. The issue considered in this post is the recent shooting in Fort Hood and the debate on whether guns should be legally allowed or prohibited in America. The premises of the argument listed by MacDonald are the following: the gun violence problem in the United States is worsening; the rate of mass shootings is alarming; the majority of Americans are willing to have gun control laws back (MacDonald, 2014). The author considers different points of view and facts, and comes to the conclusion that only responsible people who are mature enough should be allowed to own a gun.

There are numerous logical fallacies in MacDonald's argument. He mostly uses inductive reasoning and makes generalizations based on individual cases. However, his reasoning is weak, and in most cases where MacDonald uses generalizations, they are logically incorrect. For example, MacDonald states that the gun violence problem is worsening and seems to support his conclusion with a link to the news article "Mass shootings in America: a historical review". However, this article only contains statistics showing that the number of mass shootings in the 1990s was 42 and that in the 2000s it declined to 28. During 2010-2013, there were 14 mass shootings, but these statistics do not provide enough information to state that gun violence really became worse; the article rather presents facts and situations in a way which is alarming to the reader.

Furthermore, MacDonald makes numerous hasty generalizations in the post. For example, he attacks the argument of Wayne LaPierre ("The only way to stop a bad guy with a gun is a good guy with a gun!") by considering a case when two officers took down a gunman after the gunman killed his victim.
In the considered case, nine people were wounded before the gunman was taken down. MacDonald uses a single example to claim that LaPierre's statement was fallacious: "This is the absolute best-case scenario of LaPierre's fallacious statement" (MacDonald, 2014). Furthermore, MacDonald uses this case to generalize that other gun owners are going to do even worse than the police, and uses an argument to the person right after that: "If you imagine that you are a good guy with a gun then you are neither responsible, nor mature enough to be handling one" (MacDonald, 2014).

Although the author cites many sources and facts, so that his arguments seem logical and consistent, his arguments and conclusions are mostly fallacious. MacDonald uses many ambiguous and incorrect premises, such as the worsening of the gun violence issue, which is not supported by reliable statistical data but is accompanied by an infographic with questionable sources. Furthermore, MacDonald's conclusions are not related to his arguments: he considers the arguments of gun proponents, the questions of freedom and responsibility, law enforcement, gun lobbies and John Lott's research, but it is not clear to the reader how MacDonald's conclusions are related to the subsections of the post. Based on the above-mentioned facts, it is possible to label MacDonald's argument as weak.

Monday, November 4, 2019

Affirmative Action Essays

The lawsuit sent shockwaves across the nation. Though the case centered on college admission practices, affirmative action plays a role in many everyday matters, especially in procedures regarding employment. Before delving into discourse and opinions, the background and history of affirmative action should be discussed.

According to writer Stephen Cahn, affirmative action's origins stem from an executive order that John F. Kennedy wrote in regards to the hiring practices of employers. Cahn writes that the President's Committee on Equal Opportunity Employment stated federal contractors "...will not discriminate against any employee or applicant for employment because of race, creed, color, or national origin. The Contractor will take affirmative action, to ensure that applicants are employed, and that employees are treated during employment, without regard to their race, creed, color, or national origin." The principle of this order from President Kennedy was further developed with the Civil Rights Act of 1964, which in part stated that "No person in the United States shall, on the grounds of race, color or national origin, be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any program or activity receiving federal financial assistance." About one year later, President Lyndon B. Johnson defined the concept of affirmative action, emphasizing that civil rights laws alone were not enough to resolve discrimination. Just months later, President Johnson issued an order to enforce affirmative action toward prospective minority employees in all aspects of hiring and employment. Employers must take specific measures to ensure equality in hiring and must document these efforts.

Saturday, November 2, 2019

An Analysis of the Common Theme of Physical Violence in A Short Account of the Destruction of the Indies and Titus Andronicus - Essay

An Analysis of the Common Theme of Physical Violence in A Short Account of the Destruction of the Indies and Titus Andronicus - Essay Example

"A Short Account of the Destruction of the Indies" depicts the colonization and Europeanization of the American Indians by the Spaniards. As Euroamerican settlement of the West accelerated, the government abandoned gradualism in favor of comprehensive programs for assimilation. The savage, noble or ignoble, was judged capable of civilization. Those who embraced it would be welcomed into mainstream society. Those who balked would nevertheless be compelled to behave. In a short time, no more than a generation, the old ways would die out. The savage would disappear with the passing of the frontier. Instead of a geographical expression, the West became, in the imaginations of Americans stranded in the cities and towns, a wild region inhabited by even wilder humans, some white and brown, but most red. Casas depicts: "Guacanagari himself died up in the mountains, broken and destitute, after he had fled to escape the massacres and the cruelty inflicted by the Spaniards, and all the other local leaders who owed allegiance to Guacanagari perished" (20). The kind of historical criticism which has laid itself most open to attack has based its conclusions on limited data and unwarranted assumptions; thus it has been essentially unhistorical.

Shakespeare shapes the character of Aaron as an independent force of evil, rather than as a mere agent of the queen. He introduces the parallel with Ovid's tale of Philomela, and he adds the final triumph of justice and order with the return of Lucius to Rome, in spite of the inconsistency which this involves, for there is no reason for a Goth army to serve Lucius against their own queen. Shakespeare also makes of Marcus a virtual chorus to comment upon the action as the play unfolds. His most important innovation is in his conception of the principal characters and their relations to one another.
Titus Andronicus is a commanding figure. He is a great and initially virtuous man, the first of Shakespeare's heroic figures whose very virtues are the source of their sins. In many ways he is a forerunner of Coriolanus. Titus embodies all the ancient Roman virtues: 'A nobler man, a braver warrior, / Lives not this day within the city walls' (I.i.25-26). He has given his life and his sons unselfishly in the cause of his country. He might now be emperor, but he respects hereditary right and chooses Saturninus instead. He is stern and he is proud, the master of his family, the last of the ancient Romans. In contrast to the heroic themes presented by Shakespeare, A Short Account of the Destruction of the Indies creates a negative image of the Spanish colonizers and the cruelties they committed against the peaceful population. In their thoughts about the West and its original populations, the Spanish colonizers variously imagined an Indian to be a noble savage, a rapacious killer, a reservation idler, the vanishing American, or a war-bonneted equestrian raider of the plains. The last image proved to be the most persuasive and, given Indian portrayals in motion pictures and television