
How did people cut their nails in the past?


How did people cut their toenails and fingernails before we got nail clippers after the Industrial Revolution?

Would they have had a fair amount of ingrown toenails, especially in the colder regions where feet had to be kept wrapped up?


With a paring knife. That's why nail parings are called, well, nail parings.

Also, there were nippers similar to modern yarn cutters, which had been in common use since Roman times.


Scraping the tip of your nails across a pocket knife blade. My grandfather does it all the time. It's similar to how some nail files work.


The Tale of Old Nails

This is a piece of 1/8-inch square rolled iron nail stock that was used by a “nailer” to hammer out a handmade nail.

One of the key ingredients in determining the age of a piece of older or antique furniture is how the wood is assembled to produce this functional work of art. Drawers are typically put together using various methods of wood joinery, i.e. dovetails, scallop joints or rabbets. Older case goods generally employ mortise and tenon joints, as do old chairs and doors. But the most straightforward construction technique of all is the use of a fastener, an external device that holds two pieces of wood together without additional shaping of the wood, and the simplest fastener is a nail: in essence, a tapered metal dowel inserted by the brute force of a hammer blow.

Nails, of course, have been around for thousands of years, but their general application to furniture making is fairly recent. Until modern times all nails were handmade, one at a time, by a blacksmith or a specialist called a “nailer.” But since nails are such useful items, not just for furniture but for general building applications, it is not surprising that some of the first modern machinery was devoted to the manufacture of nails.

These are handmade iron nails from the 18th century. Note the “rosehead” hammered head and the sharp point.

In the American Colonies, one of the early industries to be well established, after glassmaking and spirits distilling, was the nail stock business. Up and down the East coast as early as the late 17th century, rolling mills turned out long, thin, square pieces of iron called nail stock, to be sent to the local nailer.

The nailer then heated a section of the stock and pounded out a point on all four sides. After cutting it to length, the section was inserted into a hole on the anvil, called a “swage” block, and the head of the nail was formed by repeated blows to the top of the nail, giving it the “rosehead” look we identify with handmade nails. A lot of work for just one nail.

But this method had its rewards. The pounding of the nail to shape it made the iron denser and thus more water resistant and durable, as well as malleable (bendable). This malleability was one of the key factors in the success of the handmade nail: it was so flexible that as it was driven into a piece of wood it followed the internal grain pattern, often in an arc, and thus provided a clinching effect that helped hold the nailed joint very tightly. Hand-wrought iron rosehead nails leave a very identifiable clue when they are removed from wood: a square hole. No other type of nail leaves a square hole.

By the early 1800s, nail-cutting machines were in general use in America. These early machines cut angular strips from a thin sheet of metal, resulting in a nail with two parallel sides, representing the thickness of the sheet, and two cut angular sides forming the point. The heads still had to be hammered by hand, and these nails are easily confused with hand-wrought nails because both have hand-hammered, rose-like heads. The difference is in the shape of the hole. Machine-made nails leave rectangular holes, which are easily distinguished from the square marks of the earliest nails. This type of nail is the kind frequently found in early 19th-century Federal and American Empire furniture, and just as frequently misidentified as hand wrought.

These nails were all cut from a sheet of iron. The top nail with the “notch” head is from the early 19th century. The middle nail with the rectangular flat head is from around 1830-1840.

Another type of early nail merely had a notch as the head. This wasn't very effective, but it was quick and cheap, and machine-cut nails became a staple of both the construction industry and the furniture-building trade. An even better nail came around 1830. The machines by now were producing nails that actually had flattened, protruding surfaces to function as the head. These were made by a single, forceful impact on the top of the nail by the machinery itself, and no human work was required. As erratic and small as these new heads were, they were still the best yet.

By the 1840s, nail-making technology had settled down to producing the best cut nail yet. This mid-century nail had a large, uniform, machine-made head, and it became the standard nail for more than 50 years, continuing to leave the characteristic rectangular hole. These are the nails found in late Classicism (C-scroll Empire) and Victorian furniture throughout the rest of the 19th century. As good as these nails were, however, they did have a drawback. They did not benefit from the hand pounding reserved for the making of hand-wrought nails and thus were more brittle than earlier nails. This stiffness meant that they did not have the same internal clinching power as their predecessors and tended to snap off under duress rather than bend.

This is the standard wire nail first introduced around 1880.

Around 1880 came the next major leap in nail development. A machine was invented that produced a round nail drawn from a piece of steel wire and formed with a perfectly circular, stamped head and a sharp, cut point. This does not mean that all cabinet shops instantly stopped using cut nails when the new style showed up. Cut nails continued to be used into the early 20th century, until existing stocks were used up. And hand-wrought nails continued to be made throughout the 19th century for certain specialty applications, such as gate building and other instances where the benefits of the clinching nail outweighed the cost of hand production.

But in the end the round wire nail became the universal standard, and it still is today. It represents a technology that has been in use, virtually unchanged, for more than 100 years, quite a rarity at the beginning of the 21st century.

Even if the nail itself is missing in a piece of furniture, you can sometimes determine its origin by the hole it leaves. Handmade nails leave square holes, cut nails leave rectangular holes and wire nails leave round holes.
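As a quick reference, the hole-shape rule above can be boiled down to a simple lookup. The short Python sketch below is only an illustration of that rule of thumb; the function name and the date ranges are my own paraphrase of the approximate figures given in this article, not an authoritative dating tool.

# Illustrative sketch of the hole-shape dating rule described above.
# The date ranges are the approximate ones given in the article.
NAIL_HOLE_GUIDE = {
    "square": ("hand-wrought rosehead nail", "17th century to about 1800"),
    "rectangular": ("machine-cut nail", "about 1800 to the 1890s"),
    "round": ("wire nail", "about 1880 to the present"),
}

def identify_nail_by_hole(shape: str) -> str:
    """Return the likely nail type and rough era for a given hole shape."""
    key = shape.strip().lower()
    if key not in NAIL_HOLE_GUIDE:
        return f"unrecognized hole shape: {shape!r}"
    nail_type, era = NAIL_HOLE_GUIDE[key]
    return f"{nail_type} ({era})"

if __name__ == "__main__":
    for s in ("square", "rectangular", "round"):
        print(f"{s:12s} -> {identify_nail_by_hole(s)}")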

Each type of nail leaves its signature hole.

Fred Taylor is a Worthologist who specializes in American furniture from the Late Classicism period (1830-1850).


Mary E. Cobb first learned the art of the manicure in France. She then redeveloped the process and brought it to the United States. In 1878, Cobb opened the first-ever nail salon titled "Mrs. Pray's Manicure." Little did she know, it would be one of the most popular and most requested beauty services in history.

She then went on to open the first manicure parlor in America, along with developing her own line of products and creating the very first at-home manicure guide. And she didn't stop there; her most significant contribution to the industry was inventing the emery board.


Ancient warriors painted their nails before combat to instil fear among the enemy

They say that you can tell a lot about a person by the look of their nails. History has proven this through centuries, as nails have borne social, political or mythic significance across cultures and civilizations.

It all started with the Ancient Babylonians. Intriguingly enough, it was men, not women, who started polishing their nails. It is assumed that Babylonian soldiers painted their nails green and black before combat.

They believed that the look of their war-painted nails would instil fear in their adversaries. Archaeological evidence of a solid gold manicure set, dating to 3,200 BC, was unearthed in Southern Babylonia as part of a set of combat equipment. Similar to the Ancient Babylonians, the Inca people received nail treatments before going into battle.

Assyrian Soldier with Standing Shield, Soldier with Small Shield, Archer

The first nail treatments as a part of a beauty routine utilized by women commenced in Ancient China. Around 3000 BC, women would soak their nails overnight in a mixture of beeswax, gelatin, and egg whites. Natural dyes made of orchids and roses were also applied on the nails.

The purpose of manicures went a step further in Ancient China around 600 BC, when the color of someone’s nails signified their social ranking.

The Chinese would paint their nails in the colors of the ruling dynasty. During the Chou dynasty, gold and silver nails represented the highest social ranking. People of the highest class painted their nails black and red, symbolizing strength and boldness. Ordinary folk were forbidden to wear the colors worn by representatives of the higher social ranks and were allowed only pale-colored nails to affirm their inferiority.

Ancient Egyptians made use of the abundant henna plant, which originated in Egypt, for medicine and for dyeing cloth, leather, and even animal fur. According to various Ancient Egyptian accounts, women used henna to polish their nails.

A pair of fingernail protectors (front, side, and back view); hammered brass sheet inlaid with semi-precious stones. China, Qing Dynasty, 1900-1910.

It is a common notion that Cleopatra and Nefertiti were trendsetters in their time, and it is believed that it was Queen Nefertiti who first dyed her nails red as a symbol of her royal status.

The stronger the red shade was, the more power the person possessed. Queen Nefertiti usually wore ruby-red nails, dyed with henna, but some sources suggest that she also used blood as a coloring agent.

Left: The Nefertiti bust in the Neues Museum, Berlin. Right: Red nail polish.

Cleopatra also wore red nails during her rule of Egypt, and nobody else was allowed to dye their nails red. Even today, red nails represent elegance and sophistication.

While we may think that nail art is a modern concept, it was the Inca, in the 1500s, who first started to decorate their fingernails with images of eagles.

Ex-servicewomen learning manicure techniques during a retraining course on beauty parlor operation at the Robertson Hairdressing School, April 1945.

The first commercial nail salons opened in Paris during the 19th century, and they were similar to modern nail salons. Salon keepers offered various services to men and women interested in fashion: nails were treated with different creams, oils, and powders, which cleaned and polished them to a shine. From France, the art of nail care was carried to the United States.

Mary E. Cobb was the first well-known American manicurist, who learned the technique from the French and introduced the service to the States. Her first manicure salon opened in Manhattan in 1878 and was known as “Mrs. Pray's Manicure”.

Cobb slightly changed the traditional French way of doing nails, and her methods embraced a multi-step process of soaking the fingers, carefully trimming the nails, and then shaping and coloring them. Her business later contributed to the invention of the emery board.

This might come as a bit of a shock, but the invention of high-gloss car paint influenced the creation of the modern-day nail polish known and highly appreciated today.

Cutex advertisement of 1924

Michelle Manard, a French makeup artist, came up with the ingenious idea of adapting these car paints for use on nails. She made some alterations to the formula and created a glossy nail lacquer very similar to nail polish we use today.

Her idea was soon recognized as a goldmine by the Charles Revson Company. The owners began work on perfecting the formula and, building on Manard's original idea, developed a non-streaking, opaque nail polish. The company was soon renamed Revlon and started selling the first modern nail polish.

Rita Hayworth from the trailer for the film Blood and Sand

The introduction of Technicolor in 1922 stirred up the trends. Moviegoers were dazzled by the color and glamor that Hollywood oozed at the time. Then Rita Hayworth appeared with stunning red lips and matching nails, and women were enthralled. It seemed like every woman at the time was wearing red nails to look like “Gilda.” Revlon, of course, capitalized on this trend and created an extensive line of nail polishes for any taste.

Red nails were all the rage throughout the 50s. However, with the beginning of the 60s counterculture, “statement color” nails were replaced with natural, pastel shades. In the 70s, actresses like Mia Farrow, Farrah Fawcett, and Goldie Hawn helped popularize the more natural shades.

In 1976, the American makeup artist Jeff Pink was working with Hollywood stars and was challenged to come up with a nail color that could match each of their outfits. He eventually invented the “French manicure,” the treatment that gives the nails a natural look.

In the 1980s, with the rise of soap operas like Dynasty and Dallas, statement colors like fuchsia and bright red made a big comeback.

Manicure shop in the Albergo Diurno Venezia in Milan, 1996.

The 1990s were all about Chanel’s Rouge Noir/Vamp. When Uma Thurman appeared in the cult classic Pulp Fiction as Mia Wallace, the vamp nail polish flew off the shelves.


How did people cut their nails before the nail clipper was invented? Did they bite them? If so why is it now considered a bad habit?

To give you an idea of what kind of instruments were used, this object from the St. Albans Museum is listed as a replica of an ancient Roman nail-care kit, and it includes something that looks very close in form and function to a modern nail clipper. (EDIT: But which actually is not, see my below discussion with kermityfrog.)

I can find scattered references to Egyptian nail care tools, but nothing as solid as that. Getting earlier than Egypt and Greece, we'd be looking at Mesopotamia. I can find no specific references to nail-care tools of that region's civilizations, though some of their myths have characters removing dirt from beneath their fingernails with no mention of a specific tool. And it looks like it was customary to impress an unfired clay tablet with your fingernail as a kind of signature, so for at least some of the ancient Mesopotamian civilizations, for at least some time, it seems to have been customary to leave your nails long.

Oh, and I found a lot of modern sources offhandedly mentioning that manual laborers' nails wear down and break on their own, and 'manual laborer' describes an increasingly overwhelming majority of the population as you get further and further back in history. I can't give you anything so academically rigorous as a primary ancient-world source that directly says that, but have a photograph of the hands of a modern Bangladeshi rice farmer's wife.

So it looks like quasi-modern nail clippers are about as old as documented traditions of nail-cutting. Unless there's a cave painting somewhere out there that tells the story of The First Manicure. (EDIT: Again, I seem to be mistaken here. The Romans used a specialized sort of small knife: see this discussion for further info.)

EDIT: The thingum on the right is supposedly a nail trimmer from the Hallstatt culture, which is about as old as Rome. Don't really know how it works. I've also found references to ancient Egyptian nail scissors, though nothing solid enough to cite.

The caption for the St. Albans Museum manicure set says that the nail clipper-like things are tweezers for pulling out hair.

I'm dying to ask a question, which is relevant - but not worth opening a new thread for.

How did barbarians and less advanced cultures keep groomed? For example, beards, hair, pubic regions? Did they have a crude type of razor, or did they let it grow?

This is interesting from the perspective of ancient Egypt. As context, we know that the Egyptians were extremely fussy about cosmetics, clothing, health, hygiene, etc. So bear this in mind, and perhaps don't generalise to the rest of the ancient world.

In early Pharaonic Egypt, we have the title ir.w an.t, translated roughly as 'manicurist', although they did feet too. You would be the manicurist alongside other things - some of which were also health-related, others were administrative and political. People with the title of manicurist were all male as far as we know. We know that being the manicurist of the King was an extremely prestigious title - people who held the title had other high offices, assumedly because they had physical (and therefore political) access to the king. Want to assassinate the king? Get to the guy who can touch him. Want to get your idea passed? Get to the guy who speaks to him while he's relaxing.

The earliest example of the title comes from the early Old Kingdom (around 2686 BCE), although a famous example is the tomb of Niankhkhnum and Khnumhotep.

For sources on the title, see Jones, Titles of the Old Kingdom, no. 1121 onwards.

Obviously not everybody had a manicurist. I don't know if this is ever discussed in the scholarship, but surely the super-elite (royalty and high officials) would have had manicurists on hand, the elite and upper class would visit them as needed, and the middle class might go if there was a problem needing attending to.

So how was this done? Well, there are scenes of manicures and pedicures from the Old Kingdom. As you can see, you'd get both done by the same person. Source for this and other scenes is here. There are scenes of people sitting on the floor (looks more like a chore?) while getting it done as well as scenes of people sitting on chairs (more relaxing perhaps?) while getting it done.

Archaeology

The manicurist - and assumedly people doing it to themselves - would use a variety of tools. There's an issue here with archaeological remains because a) poorer people would have used simple flint tools (and how do you know a flint tool was a nail-cutter and not a something-else-cutter?), and b) metal remains aren't hugely prominent in the archaeological record from Egypt because of corrosion and value (to thieves and family). We have some cosmetic sets from Egypt that have simple spike things, and we assume that they would have been used as necessary to cut nails, do things with hair, clean bits of the body, and so on.

If you've ever needed to quickly cut your nails, you'll know that they're actually not particularly hard and can be cut with lots of things. I assume it would have been the same in Egypt. There is a large body of cosmetic spoons from Egypt, and although many could only have been used to scoop precious oil etc. from little pots, as they are relatively thin, others seem extremely sturdy and might have been used as a multi-tool for cosmetics.

The second thing to remember - and I'm not sure if this has been brought up, I apologise if it has - is that if you're doing lots of manual labor, you don't really need to cut your nails as much. We often forget this in the modern world, but if you're doing anything tough with your hands, your nails get worn down from general use as well as using your nails themselves as a tool. If you're a potter in ancient Egypt, for example, you use your finger tips and nails to create decorations.

Biting nails

Interesting question about the modern taboo against biting nails. I don't know if this is limited to the West, but I agree that it's certainly frowned upon in my area of the world. There isn't any evidence about this from Egypt, though, as far as I'm aware - and there is evidence about social etiquette. It's impossible to know either way, but given there was a profession dedicated to the activity, it was assumedly better to do it properly (by somebody, or yourself in private) than improperly (publicly by biting them, picking them unless it was required, etc.?). I'm speculating now, though; remember that the product of cutting nails is essentially garbage/waste, so it seems logical that having it in your mouth goes against the general disgust-taboo found across all cultures.


How You Can Protect Your Teeth From Infections and Cavities

Brushing and Flossing — when you brush your teeth, you remove the layer of dental plaque that adheres to your teeth and accumulates from eating all day. Brushing away the plaque protects your teeth from harmful bacteria inside the plaque. Similarly, flossing between your teeth will ensure that each and every corner is stripped of harmful plaque.

Eat a Balanced Diet — a diet that is rich in tooth-friendly nutrients, minerals and vitamins will make your teeth stronger and protect them from cavities. It’s also a good idea to lessen the intake of sugary foods and sodas. Remember to rinse your mouth after eating or drinking anything sweet or acidic.

Visit Your Dentist Regularly — visiting the dentist regularly will ensure that any developing problems are diagnosed and treated as early as possible. This will lessen the likelihood of a small problem causing permanent damage to the teeth or the oral cavity.

Remember, your dental health has a direct influence on your physical wellbeing. If you want to remain healthy, look after your teeth!



Self-harm (SH), also referred to as self-injury (SI), self-inflicted violence (SIV), nonsuicidal self-injury (NSSI) or self-injurious behaviour (SIB), are different terms for behaviours in which demonstrable injury is self-inflicted. [25] The behaviour involves deliberate tissue damage that is usually performed without suicidal intent. The most common form of self-harm involves cutting of the skin using a sharp object, e.g. a knife or razor blade. The term self-mutilation is also sometimes used, although this phrase evokes connotations that some find worrisome, inaccurate, or offensive. [25] Self-inflicted wounds is a specific term associated with soldiers to describe non-lethal injuries inflicted in order to obtain early dismissal from combat. [26] [27] This differs from the common definition of self-harm, as damage is inflicted for a specific secondary purpose. A broader definition of self-harm might also include those who inflict harm on their bodies by means of disordered eating.

The older literature has used several different terms. For this reason research in the past decades has inconsistently focused on self-harming behavior without and with suicidal intent (including suicide attempts) with varying definitions leading to inconsistent and unclear results. [2]

Nonsuicidal self-injury (NSSI) has been listed as a proposed disorder in the DSM-5 under the category "Conditions for Further Study". [28] It is noted that this proposed set of diagnostic criteria for a future diagnosis is not an officially approved diagnosis, may not be used clinically, and is intended for research purposes only. [28] The disorder is defined as intentional self-inflicted injury without the intent of committing suicide. Criteria for NSSI include five or more days of self-inflicted harm over the course of one year without suicidal intent, and the individual must have been motivated by seeking relief from a negative state, resolving an interpersonal difficulty, or achieving a positive state. [29]

A common belief regarding self-harm is that it is an attention-seeking behaviour; however, in many cases, this is inaccurate. Many self-harmers are very self-conscious of their wounds and scars and feel guilty about their behaviour, leading them to go to great lengths to conceal their behaviour from others. [8] They may offer alternative explanations for their injuries, or conceal their scars with clothing. [30] [31] Self-harm in such individuals may not be associated with suicidal or para-suicidal behaviour. People who self-harm are not usually seeking to end their own life; it has been suggested instead that they are using self-harm as a coping mechanism to relieve emotional pain or discomfort or as an attempt to communicate distress. [12] [13] Alternatively, interpretations based on the supposed lethality of an act of self-harm may not give clear indications as to its intent: seemingly superficial cuts may have been a suicide attempt, whereas life-threatening damage may have been done without the intent to die. [ citation needed ]

Studies of individuals with developmental disabilities (such as intellectual disability) have shown self-harm to be dependent on environmental factors such as obtaining attention or escape from demands. [32] Some individuals may experience dissociation, harboring a desire to feel real or to fit into society's rules. [33]

Eighty percent of self-harm involves stabbing or cutting the skin with a sharp object, sometimes breaking through the skin entirely. [8] [34] [35] However, the number of self-harm methods is limited only by an individual's inventiveness and their determination to harm themselves; these include burning, self-poisoning, alcohol abuse, self-embedding of objects, hair pulling, bruising/hitting one's self, scratching to hurt one's self, knowingly abusing over-the-counter or prescription drugs, and forms of self-harm related to anorexia and bulimia. [8] [35] The locations of self-harm are often areas of the body that are easily hidden and concealed from the detection of others. [36] As well as defining self-harm in terms of the act of damaging the body, it may be more accurate to define self-harm in terms of the intent, and the emotional distress that the person is attempting to deal with. [35] Neither the DSM-IV-TR nor the ICD-10 provide diagnostic criteria for self-harm. It is often seen as only a symptom of an underlying disorder, [12] though many people who self-harm would like this to be addressed. [31] Common signs that a person may be engaging in self-harm include the following: they ensure that there are always harmful objects close by, they are experiencing difficulties in their personal relationships, their behaviour becomes unpredictable, they question their worth and identity, and they make statements that display helplessness and hopelessness. [37]

Mental disorder

Although some people who self-harm do not have any form of recognised mental disorder, [30] many people experiencing various forms of mental illnesses do have a higher risk of self-harm. The key areas of disorder which exhibit an increased risk include autism spectrum disorders, [38] [39] borderline personality disorder, dissociative disorders, bipolar disorder, [40] depression, [16] [41] phobias, [16] and conduct disorders. [42] Schizophrenia may also be a contributing factor for self-harm. Those diagnosed with schizophrenia have a high risk of suicide, which is particularly great in younger patients, as they may not have insight into the serious effects that the disorder can have on their lives. [43] Substance abuse is also considered a risk factor, [12] as are some personal characteristics such as poor problem-solving skills and impulsivity. [12] There are parallels between self-harm and Münchausen syndrome, a psychiatric disorder in which individuals feign illness or trauma. [44] There may be a common ground of inner distress culminating in self-directed harm in a Münchausen patient. However, a desire to deceive medical personnel in order to gain treatment and attention is more important in Münchausen's than in self-harm. [44]

Psychological factors

Abuse during childhood is accepted as a primary social factor increasing the incidence of self-harm, [45] as are bereavement [46] and troubled parental or partner relationships. [12] [17] Factors such as war, poverty, and unemployment may also contribute. [16] [47] [48] Other predictors of self-harm and suicidal behavior include feelings of entrapment, defeat, lack of belonging, and perceiving oneself as a burden, along with less effective social problem-solving skills. [21] Self-harm is frequently described as an experience of depersonalisation or a dissociative state. [49] As many as 70% of individuals with borderline personality disorder engage in self-harm. [50] An estimated 30% of individuals with autism spectrum disorders engage in self-harm at some point, including eye-poking, skin-picking, hand-biting, and head-banging. [38] [39] The onset of puberty has also been shown to coincide with the onset of self-harm, including the onset of sexual activity; this is because the pubertal period is a period of neurodevelopmental vulnerability and comes with an increased risk of emotional disorders and risk-taking behaviors. [21]

Genetics

The most distinctive characteristic of the rare genetic condition Lesch–Nyhan syndrome is self-harm, which may include biting and head-banging. [51] Genetics may contribute to the risk of developing other psychological conditions, such as anxiety or depression, which could in turn lead to self-harming behaviour. However, the link between genetics and self-harm in otherwise healthy patients is largely inconclusive. [7]

Drugs and alcohol

Substance misuse, dependence and withdrawal are associated with self-harm. Benzodiazepine dependence as well as benzodiazepine withdrawal is associated with self-harming behaviour in young people. [52] Alcohol is a major risk factor for self-harm. [34] A study which analysed self-harm presentations to emergency rooms in Northern Ireland found that alcohol was a major contributing factor, involved in 63.8% of self-harm presentations. [53] A recent study on the relation between cannabis use and deliberate self-harm (DSH) in Norway and England found that, in general, cannabis use may not be a specific risk factor for DSH in young adolescents. [54] Smoking has also been associated with self-harm in adolescents; one study found that suicide attempts were four times higher for adolescents that smoke than for those that do not. [21] A more recent meta-analysis of the literature on the association between cannabis use and self-injurious behaviours has quantified the extent of this association, which is significant at both the cross-sectional (odds ratio = 1.569, 95% confidence interval [1.167-2.108]) and longitudinal (odds ratio = 2.569, 95% confidence interval [2.207-3.256]) levels, and has highlighted chronic use of the substance and the presence of depressive symptoms or of mental disorders as factors that might increase the risk of self-harm among cannabis users. [55]
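For readers less familiar with the statistics, here is a brief note on how those figures are read; the formula below is the standard definition of an odds ratio and is not taken from the cited study itself. If $a$ and $b$ are the numbers of cannabis users who did and did not self-harm, and $c$ and $d$ are the corresponding numbers for non-users, then

\[ \mathrm{OR} = \frac{a/b}{c/d} = \frac{ad}{bc}. \]

An odds ratio of 1 would mean no association between cannabis use and self-harm; both reported 95% confidence intervals lie entirely above 1, which is why the association is described as significant at both the cross-sectional and longitudinal levels.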

Self-harm is not typically suicidal behaviour, although there is the possibility that a self-inflicted injury may result in life-threatening damage. [56] Although the person may not recognise the connection, self-harm often becomes a response to profound and overwhelming emotional pain that cannot be resolved in a more functional way. [8]

The motivations for self-harm vary, as it may be used to fulfill a number of different functions. [14] These functions include self-harm being used as a coping mechanism which provides temporary relief of intense feelings such as anxiety, depression, stress, emotional numbness and a sense of failure or self-loathing. There is also a positive statistical correlation between self-harm and emotional abuse. [16] [17] Self-harm may become a means of managing and controlling pain, in contrast to the pain experienced earlier in the person's life, over which they had no control (e.g., through abuse). [56]

Other motives for self-harm do not fit into medicalised models of behaviour and may seem incomprehensible to others, as demonstrated by this quotation: "My motivations for self-harming were diverse, but included examining the interior of my arms for hydraulic lines. This may sound strange." [31]

Assessment of motives in a medical setting is usually based on precursors to the incident, circumstances, and information from the patient. [12] However, limited studies show that professional assessments tend to suggest more manipulative or punitive motives than personal assessments. [57]

The UK ONS study reported only two motives: "to draw attention" and "because of anger". [16] For some people, harming themselves can be a means of drawing attention to the need for help and to ask for assistance in an indirect way. It may also be an attempt to affect others and to manipulate them in some way emotionally. [14] [56] However, those with chronic, repetitive self-harm often do not want attention and hide their scars carefully. [58]

Many people who self-harm state that it allows them to "go away" or dissociate, separating the mind from feelings that are causing anguish. This may be achieved by tricking the mind into believing that the present suffering being felt is caused by the self-harm instead of the issues they were facing previously: the physical pain therefore acts as a distraction from the original emotional pain. [30] To complement this theory, one can consider the need to "stop" feeling emotional pain and mental agitation. "A person may be hyper-sensitive and overwhelmed; a great many thoughts may be revolving within their mind, and they may either become triggered or could make a decision to stop the overwhelming feelings." [59]

Alternatively, self-harm may be a means of feeling something, even if the sensation is unpleasant and painful. Those who self-harm sometimes describe feelings of emptiness or numbness (anhedonia), and physical pain may be a relief from these feelings. "A person may be detached from themselves, detached from life, numb and unfeeling. They may then recognise the need to function more, or have a desire to feel real again, and a decision is made to create sensation and 'wake up'." [59]

Those who engage in self-harm face the contradictory reality of harming themselves while at the same time obtaining relief from this act. It may even be hard for some to actually initiate cutting, but they often do because they know the relief that will follow. For some self-harmers this relief is primarily psychological, while for others this feeling of relief comes from the beta-endorphins released in the brain. [14] Endorphins are endogenous opioids that are released in response to physical injury, acting as natural painkillers and inducing pleasant feelings, and in response to self-harm would act to reduce tension and emotional distress. [2] Many self-harmers report feeling very little to no pain while self-harming [45] and, for some, deliberate self-harm may become a means of seeking pleasure.

As a coping mechanism, self-harm can become psychologically addictive because, to the self-harmer, it works: it enables them to deal with intense stress in the current moment. The patterns it sometimes creates, such as specific time intervals between acts of self-harm, can also result in a wanting or craving to fulfill thoughts of self-harm. [60]

Autonomic nervous system

Emotional pain activates the same regions of the brain as physical pain, [61] so emotional stress can be a significantly intolerable state for some people. Some of this is environmental and some of this is due to physiological differences in responding. [62] The autonomic nervous system is composed of two components: the sympathetic nervous system controls arousal and physical activation (e.g., the fight-or-flight response) and the parasympathetic nervous system controls physical processes that are automatic (e.g., saliva production). The sympathetic nervous system innervates (e.g., is physically connected to and regulates) many parts of the body involved in stress responses. Studies of adolescents have shown that adolescents who self-injure have greater physiological reactivity (e.g., skin conductance) to stress than adolescents who do not self-injure. [63] [64] This stress response persists over time, staying constant or even increasing in self-injuring adolescents, but gradually decreases in adolescents who do not self-injure.

Several forms of psychosocial treatment can be used in self-harm, including dialectical behavior therapy. [65] Psychiatric and personality disorders are common in individuals who self-harm, and as a result self-harm may be an indicator of depression and/or other psychological problems. [ citation needed ] Many people who self-harm have moderate or severe depression, and therefore treatment with antidepressant medications may often be used. [66] There is tentative evidence for the medication flupentixol; however, greater study is required before it can be recommended. [67]

Therapy

Dialectical behavior therapy for adolescents (DBT-A) is a well-established treatment for self-injurious behaviour in youth and is probably useful for decreasing the risk of nonsuicidal self-injury. [65] Several other treatments, including integrated CBT (I-CBT), attachment-based family therapy (ABFT), the resourceful adolescent parent program (RAP-P), intensive interpersonal psychotherapy for adolescents (IPT-A-IN), mentalization-based treatment for adolescents (MBT-A), and integrated family therapy, are probably efficacious. [65] [68] Cognitive behavioural therapy may also be used to assist those with Axis I diagnoses, such as depression, schizophrenia, and bipolar disorder. Dialectical behaviour therapy (DBT) can be successful for those individuals exhibiting a personality disorder, and could potentially be used for those with other mental disorders who exhibit self-harming behaviour. [68] Diagnosis and treatment of the causes of self-harm is thought by many to be the best approach to treating self-harm. [13] But in some cases, particularly in people with a personality disorder, this is not very effective, so more clinicians are starting to take a DBT approach in order to reduce the behaviour itself. People who rely on habitual self-harm are sometimes hospitalised, based on their stability, their ability and especially their willingness to get help. [69] In adolescents, multisystem therapy shows promise. [70] Pharmacotherapy has not been tested as a treatment for adolescents who self-harm. [21]

A meta-analysis found that psychological therapy is effective in reducing self-harm. The proportion of the adolescents who self-harmed over the follow-up period was lower in the intervention groups (28%) than in controls (33%). Psychological therapies with the largest effect sizes were dialectical behaviour therapy (DBT), cognitive-behavioural therapy (CBT), and mentalization-based therapy (MBT). [71]

In individuals with developmental disabilities, occurrence of self-harm is often demonstrated to be related to its effects on the environment, such as obtaining attention or desired materials or escaping demands. As developmentally disabled individuals often have communication or social deficits, self-harm may be their way of obtaining these things which they are otherwise unable to obtain in a socially appropriate way (such as by asking). One approach for treating self-harm thus is to teach an alternative, appropriate response which obtains the same result as the self-harm. [72] [73] [74]

Avoidance techniques

Generating alternative behaviours that the person can engage in instead of self-harm is one successful behavioural method employed to avoid self-harm. [75] Techniques aimed at keeping busy may include journaling, taking a walk, participating in sports or exercise, or being around friends when the person has the urge to harm themselves. [18] The removal of objects used for self-harm from easy reach is also helpful for resisting self-harming urges. [18] The provision of a card that allows the person to make emergency contact with counselling services should the urge to self-harm arise may also help prevent the act of self-harm. [76] Alternative and safer methods of self-harm that do not lead to permanent damage, for example the snapping of a rubber band on the wrist, may also help calm the urge to self-harm. [18] [ failed verification ] Using biofeedback may help raise self-awareness of certain pre-occupations or a particular mental state or mood that precedes bouts of self-harming behaviour, [77] and help identify techniques to avoid those pre-occupations before they lead to self-harm. Any avoidance or coping strategy must be appropriate to the individual's motivation and reason for harming. [78]

It is difficult to gain an accurate picture of the incidence and prevalence of self-harm. [8] [79] This is due in part to a lack of sufficient numbers of dedicated research centres to provide a continuous monitoring system. [79] However, even with sufficient resources, statistical estimates are crude since most incidences of self-harm are undisclosed to the medical profession, as acts of self-harm are frequently carried out in secret, and wounds may be superficial and easily treated by the individual. [8] [79] Recorded figures can be based on three sources: psychiatric samples, hospital admissions and general population surveys. [80]

The World Health Organization estimates that, as of 2010, 880,000 deaths occur as a result of self-harm. [81] About 10% of admissions to medical wards in the UK are as a result of self-harm, the majority of which are drug overdoses. [46] However, studies based only on hospital admissions may hide the larger group of self-harmers who do not need or seek hospital treatment for their injuries, [12] instead treating themselves. Many adolescents who present to general hospitals with deliberate self-harm report previous episodes for which they did not receive medical attention. [80] In the United States up to 4% of adults self-harm with approximately 1% of the population engaging in chronic or severe self-harm. [82]

Current research suggests that the rates of self-harm are much higher among young people, [8] with the average age of onset between 14 and 24. [1] [8] [9] [19] [20] The earliest reported incidents of self-harm are in children between 5 and 7 years old. [8] In the UK in 2008, rates of self-harm in young people could be as high as 33%. [83] In addition, there appears to be a higher risk of self-harm among college students than among the general population. [34] [82] In a study of undergraduate students in the US, 9.8% of the students surveyed indicated that they had purposefully cut or burned themselves on at least one occasion in the past. When the definition of self-harm was expanded to include head-banging, scratching oneself, and hitting oneself along with cutting and burning, 32% of the sample said they had done this. [84] In Ireland, a study found that instances of hospital-treated self-harm were much higher in city and urban districts than in rural settings. [85] The CASE (Child & Adolescent Self-harm in Europe) study suggests that the life-time risk of self-injury is

Sex differences

In general, the latest aggregated research has found no difference in the prevalence of self-harm between men and women. [82] This is in contrast to past research which indicated that up to four times as many females as males have direct experience of self-harm. [12] However, caution is needed in seeing self-harm as a greater problem for females, since males may engage in different forms of self-harm (e.g., hitting themselves) which could be easier to hide or explained as the result of different circumstances. [8] [82] Hence, there remain widely opposing views as to whether the gender paradox is a real phenomenon, or merely the artifact of bias in data collection. [79]

The WHO/EURO Multicentre Study of Suicide, established in 1989, demonstrated that, for each age group, the female rate of self-harm exceeded that of the males, with the highest rate among females in the 13–24 age group and the highest rate among males in the 12–34 age group. However, this discrepancy has been known to vary significantly depending upon population and methodological criteria, consistent with wide-ranging uncertainties in gathering and interpreting data regarding rates of self-harm in general. [87] Such problems have sometimes been the focus of criticism in the context of broader psychosocial interpretation. For example, feminist author Barbara Brickman has speculated that reported gender differences in rates of self-harm are due to deliberate socially biased methodological and sampling errors, directly blaming medical discourse for pathologising the female. [88]

This gender discrepancy is often distorted in specific populations where rates of self-harm are inordinately high, which may have implications on the significance and interpretation of psychosocial factors other than gender. A study in 2003 found an extremely high prevalence of self-harm among 428 homeless and runaway youths (aged 16–19) with 72% of males and 66% of females reporting a history of self-harm. [89] However, in 2008, a study of young people and self-harm saw the gender gap widen in the opposite direction, with 32% of young females, and 22% of young males admitting to self-harm. [83] Studies also indicate that males who self-harm may also be at a greater risk of completing suicide. [11]

There does not appear to be a difference in motivation for self-harm in adolescent males and females. Triggering factors such as low self-esteem and having friends and family members who self-harm are also common between both males and females. [80] One limited study found that, among those young individuals who do self-harm, both genders are equally likely to use the method of skin-cutting. [90] However, females who self-cut are more likely than males to explain their self-harm episode by saying that they had wanted to punish themselves. In New Zealand, more females are hospitalised for intentional self-harm than males. Females more commonly choose methods such as self-poisoning that generally are not fatal, but still serious enough to require hospitalisation. [91]

Elderly

In a study of a district general hospital in the UK, 5.4% of all the hospital's self-harm cases were aged over 65. The male to female ratio was 2:3 although the self-harm rates for males and females over 65 in the local population were identical. Over 90% had depressive conditions, and 63% had significant physical illness. Under 10% of the patients gave a history of earlier self-harm, while both the repetition and suicide rates were very low, which could be explained by the absence of factors known to be associated with repetition, such as personality disorder and alcohol abuse. [23] However, NICE Guidance on Self-harm in the UK suggests that older people who self-harm are at a greater risk of completing suicide, with 1 in 5 older people who self-harm going on to end their life. [20] A study completed in Ireland showed that older Irish adults have high rates of deliberate self-harm, but comparatively low rates of suicide. [85]

Developing world

Only recently have attempts to improve health in the developing world concentrated not only on physical illness but also on mental health. [92] Deliberate self-harm is common in the developing world. Research into self-harm in the developing world is, however, still very limited, although an important case study is that of Sri Lanka, a country exhibiting a high incidence of suicide [93] and of self-poisoning with agricultural pesticides or natural poisons. [92] Many people admitted for deliberate self-poisoning during a study by Eddleston et al. [92] were young, and few expressed a desire to die, but death was relatively common among the young in these cases. Medical management of acute poisoning in the developing world is poor, and improvements are required in order to reduce mortality.

Some of the causes of deliberate self-poisoning in Sri Lankan adolescents included bereavement and harsh discipline by parents. These coping mechanisms spread through local communities as people are surrounded by others who have previously deliberately harmed themselves or attempted suicide. [92] One way of reducing self-harm would be to limit access to poisons; [92] however, many cases involve pesticides or yellow oleander seeds, and reducing access to these agents would be difficult. Great potential for the reduction of self-harm lies in education and prevention, but limited resources in the developing world make these methods challenging.

Prison inmates

Deliberate self-harm is especially prevalent in prison populations. A proposed explanation for this is that prisons are often violent places, and prisoners who wish to avoid physical confrontations may resort to self-harm as a ruse, either to convince other prisoners that they are dangerously insane and resilient to pain or to obtain protection from the prison authorities. [94] Self-harm also occurs frequently in inmates who are placed in solitary confinement. [95]


The Surprising History of the Lobotomy

Today, the word “lobotomy” is rarely mentioned. If it is, it's usually the butt of a joke.

But in the 20th century, the lobotomy became a legitimate alternative treatment for serious mental illness, such as schizophrenia and severe depression. Physicians even used it to treat chronic or severe pain and backaches. (As you'll learn below, in some cases there was no compelling reason for the surgery at all.) There's a surprising history behind the lobotomy's use in mental health.

A lobotomy wasn't some primitive procedure of the early 1900s. In fact, an article in Wired magazine states that lobotomies were performed “well into the 1980s” in the “United States, Britain, Scandinavia and several western European countries.”

The Beginning

In 1935, Portuguese neurologist Antonio Egas Moniz performed a brain operation he called “leucotomy” in a Lisbon hospital. This was the first-ever modern leucotomy to treat mental illness, which involved drilling holes in his patient's skull to access the brain. For this work, Moniz received the Nobel Prize in medicine in 1949.

The idea that mental health could be improved by psychosurgery originated from Swiss neurologist Gottlieb Burckhardt. He operated on six patients with schizophrenia and reported a 50 percent success rate, meaning the patients appeared to calm down. Interestingly, Burckhardt's colleagues harshly criticized his work at the time.

The Lobotomy in America

In 1936, psychiatrist Walter Freeman and a neurosurgeon performed the first U.S. prefrontal lobotomy on a Kansas housewife. (Freeman renamed the procedure “lobotomy.”)

Freeman believed that an overload of emotions led to mental illness and “that cutting certain nerves in the brain could eliminate excess emotion and stabilize a personality,” according to a National Public Radio article.

He wanted to find a more efficient way to perform the procedure without drilling into a person's head like Moniz did. So he created the 10-minute transorbital lobotomy (known as the “ice-pick” lobotomy), which was first performed at his Washington, D.C. office on January 17, 1946.

(Freeman would go on to perform about 2,500 lobotomies. Known as a showman, he once performed 25 lobotomies in one day. To shock his audiences, he also liked to insert picks in both eyes simultaneously.)

According to the NPR article, the procedure went as follows:

“As those who watched the procedure described it, a patient would be rendered unconscious by electroshock. Freeman would then take a sharp ice pick-like instrument, insert it above the patient's eyeball through the orbit of the eye, into the frontal lobes of the brain, moving the instrument back and forth. Then he would do the same thing on the other side of the face.”

Freeman's ice-pick lobotomy became wildly popular. The main reason is that people were desperate for treatments for serious mental illness. This was a time before antipsychotic medication, and mental asylums were overcrowded, Dr. Elliot Valenstein, author of Great and Desperate Cures, which recounts the history of lobotomies, told NPR.

“There were some very unpleasant results, very tragic results and some excellent results and a lot in between,” he said.

Lobotomies weren't just for adults either. One of the youngest patients was a 12-year-old boy! NPR interviewed Howard Dully in 2006 at the age of 56. At the time, he was working as a bus driver.

“If you saw me you'd never know I'd had a lobotomy,” Dully says. “The only thing you'd notice is that I'm very tall and weigh about 350 pounds. But I've always felt different – wondered if something's missing from my soul. I have no memory of the operation, and never had the courage to ask my family about it…”

The reason for Dully's lobotomy? His stepmother, Lou, said Dully was defiant, daydreamed and even objected to going to bed. If this sounds like a typical 12-year-old boy, that's because he was. According to Dully's father, Lou took her stepson to several doctors, who said there was nothing wrong with Dully and that he was just “a normal boy.”

But Freeman agreed to perform the lobotomy. You can check out the NPR article for Freeman's notes on Dully and more from his patients' families. (There's also lots more on lobotomies on their website.)

In 1967, Freeman performed his last lobotomy before being banned from operating. Why the ban? After he performed the third lobotomy on a longtime patient of his, she developed a brain hemorrhage and passed away.

The U.S. performed more lobotomies than any other country, according to the Wired article. Sources vary on the exact number, but it's between 40,000 and 50,000 (the majority taking place between the late 1940s and early 1950s).

Curiously, as early as the 1950s, some nations, including Germany and Japan, had outlawed lobotomies. The Soviet Union prohibited the procedure in 1950, stating that it was “contrary to the principles of humanity.”

This article lists the “top 10 fascinating and notable lobotomies,” including an American actor, a renowned pianist, the sister of an American president and the sister of a prominent playwright.

What have you heard about lobotomies? Are you surprised by the history of the procedure?

Photo by frostnova, available under a Creative Commons attribution license.


In Closing…

Throughout the history of mankind, great forests blanketed many parts of the world. They provided civilization with a valuable and plentiful resource: wood.

Wood was a material easy to work with and shape, so artisans used it in many diverse ways. They created weapons and siege devices from wood. They built houses, temples, boats, furniture, plows, and even coffins using local woods, or for special needs, imported fine, aromatic woods from distant lands. They also sculpted statues and other decorative pieces from wood. When stone structures were erected, woodworkers used wood scaffolding to aid in their construction.

As civilizations advanced, they invented new tools to cut and shape wood, or improved existing ones. Most of the hand tools woodworkers use today have changed little since ancient times.

The inability to monitor the moisture content of a piece of wood and allow it to acclimate to the surrounding environment before using it has led, regrettably, to the ruin of many finished objects. It is one reason why countless wooden objects from centuries past have vanished forever.

We’d like to do our part to make sure that doesn’t happen to your project.



1 Here are three websites that contend Noah’s ark was discovered in the mountains of Ararat in present-day Turkey:

It should be noted that there are a number of experts who hotly dispute the finding of the ark. However, archaeologist Robert Ballard, who found the Titanic, says he has compelling evidence that suggests a monstrous ancient flood did indeed occur. While Ballard cannot say for certain the ark existed, the Biblical flood story is similar in some respects to the Babylonian epic of Gilgamesh, reports National Geographic. Also, the ancient Greeks, Romans, and Native Americans all have their own take on a legendary flood.


Watch the video: What conditions do your nails reveal? Dr. Tasioula explains (January 2022).