Melon Farmers Unrated

Glorification of Censorship


Climate of fear caused by glorification of terrorism


 

Terrifying censorship...

Labour MP tables bill amendment requiring social media companies to take down posts within 24 hours of an official complaint


Link Here 29th June 2018
Google, Facebook, YouTube and other sites would be required by law to take down extremist material within 24 hours of receiving an official complaint under an amendment put forward for inclusion in new counter-terror legislation.

The Labour MP Stephen Doughty's amendment echoes censorship laws that came into effect in Germany last year. However, the effect of the German law has been to enable no-questions-asked censorship of anything the government doesn't like. Social media companies have no interest in challenging unfair censorship and find that the easiest and cheapest way to comply is to err on the side of the government and take down anything asked, regardless of the merits of the case.

The counter-terrorism strategy unveiled by the home secretary, Sajid Javid, this month, said the Home Office would place a renewed emphasis on engagement with internet providers and work with the tech industry to seek more investment in technologies that automatically identify and remove terrorist content before it is accessible to all.

But Doughty, a member of the home affairs select committee, said his amendment was needed because the voluntary approach was failing. He said a wide variety of extremist content remained online despite repeated warnings.

If these companies can remove copyrighted video or music content from companies like Disney within a matter of hours, there is no excuse for them to be failing to do so for extremist material.

Doughty's amendment would also require tech companies to proactively check content for extremist material and take it down within six hours of it being identified.

The proactive check of content alludes to the censorship machines being introduced by the EU to scan uploads for copyrighted material. Extending these to detect terrorist material, coupled with the err-on-the-side-of-caution approach, would inevitably lead to the automatic censorship of any content that even uses the vocabulary of terrorism, regardless of whether it is news reporting, satire or criticism.

 

 

Police ratchet...

Ratcheting up the requirements for social networks to report posters of terrorist content to the police


Link Here 8th March 2018
Social media giants have been ramping up internet censorship to prevent or take down terrorist posts. However the police are now complaining that the companies are not proactively reporting such posters to the police.

Metropolitan Police Assistant Commissioner Mark Rowley, the outgoing chief of UK counter-terror policing, said the companies are threatening public safety to maximise profit and customer satisfaction. Speaking at a counter-terror conference in London, Mr Rowley said social media firms should work with police in the same way banks had been made to co-operate on tracing dirty money. He said:

The online extremists seem able to act with impunity, occupying spaces owned and managed by legitimate and very wealthy corporations.

I am disappointed that in the UK as a police service we are yet to receive a direct referral from them when they have identified such behaviour.

They are effectively private tenants to their communication service provider landlords. In the real world if a landlord knew their property was being used to plan or inspire terrorist attacks you would expect them to show corporate responsibility by informing the authorities and evicting them forthwith. We want to see those same standards applied in the virtual world.

 

 

Commented: Maybe more about asking why Google can't do the same...

The UK reveals a tool to detect uploads of jihadi videos


Link Here 15th February 2018

The UK government has unveiled a tool it says can accurately detect jihadist content and block it from being viewed.

Home Secretary Amber Rudd told the BBC she would not rule out forcing technology companies to use it by law. Rudd is visiting the US to meet tech companies to discuss the idea, as well as other efforts to tackle extremism.

The government provided £600,000 of public funds towards the creation of the tool by an artificial intelligence company based in London.

Thousands of hours of content posted by the Islamic State group was run past the tool, in order to train it to automatically spot extremist material.

ASI Data Science said the software can be configured to detect 94% of IS video uploads. Anything the software identifies as potential IS material would be flagged up for a human decision to be taken.

The company said it typically flagged 0.005% of non-IS video uploads. But this figure is meaningless without an indication of how many of those flagged uploads had any connection with jihadis.
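As a rough illustration of why those headline percentages need context, here is a small back-of-the-envelope Python sketch. Only the 94% detection rate and the 0.005% false positive rate come from the article; the daily upload volume and the assumed share of IS material in it are invented figures, used purely to show how a tiny false positive rate still produces large absolute numbers of wrongly flagged videos once it is applied at platform scale.

# Hypothetical back-of-the-envelope check of the claimed rates.
# Only the two rates below are taken from the article; the upload volume
# and the share of IS material are assumptions chosen for illustration.

DETECTION_RATE = 0.94          # claimed share of genuine IS videos that get flagged
FALSE_POSITIVE_RATE = 0.00005  # claimed share of non-IS videos flagged (0.005%)

def flagging_outcomes(daily_uploads: int, is_share: float) -> dict:
    """Estimate daily flag counts and their accuracy for an assumed upload mix."""
    is_videos = daily_uploads * is_share
    other_videos = daily_uploads - is_videos

    true_positives = is_videos * DETECTION_RATE            # IS videos correctly flagged
    missed = is_videos - true_positives                    # IS videos slipping through
    false_positives = other_videos * FALSE_POSITIVE_RATE   # innocent videos flagged

    flagged = true_positives + false_positives
    share_genuine = true_positives / flagged if flagged else 0.0
    return {
        "flagged_per_day": round(flagged),
        "false_positives_per_day": round(false_positives),
        "missed_is_videos_per_day": round(missed),
        "share_of_flags_that_are_genuine": round(share_genuine, 3),
    }

# Assumption: 1,000,000 uploads a day, of which 0.01% are genuine IS material.
print(flagging_outcomes(daily_uploads=1_000_000, is_share=0.0001))

On those assumed numbers about 94 genuine IS videos and roughly 50 innocent ones would be flagged each day, so around a third of everything sent for human review would be false alarms. That is exactly the sort of context the bare 0.005% figure leaves out.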

In London, reporters were given an off-the-record briefing detailing how ASI's software worked, but were asked not to share its precise methodology. However, in simple terms, it is an algorithm that draws on characteristics typical of IS and its online activity.

It sounds like the tool is more about analysing data about the uploading account, geographical origin, time of day, name of poster etc rather than analysing the video itself.

Comment: Even extremist takedowns require accountability

15th February 2018. See  article from openrightsgroup.org

Can extremist material be identified at 99.99% certainty as Amber Rudd claims today? And how does she intend to ensure that there is legal accountability for content removal?

The Government is very keen to ensure that extremist material is removed from private platforms, like Facebook, Twitter and Youtube. It has urged use of machine learning and algorithmic identification by the companies, and threatened fines for failing to remove content swiftly.

Today Amber Rudd claims to have developed a tool to identify extremist content, based on a database of known material. Such tools can have a role to play in identifying unwanted material, but we need to understand that there are some important caveats to what these tools are doing, with implications about how they are used, particularly around accountability. We list these below.

Before we proceed, we should also recognise that this is often about computers (bots) posting vast volumes of material with a very small audience. Amber Rudd's new machine may then potentially clean some of it up. It is in many ways a propaganda battle between extremists claiming to be internet savvy and exaggerating their impact, while our own government claims that they are going to clean up the internet. Both sides benefit from the apparent conflict.

The real world impact of all this activity may not be as great as is being claimed. We should be given much more information about what exactly is being posted and removed. For instance, the UK police remove over 100,000 pieces of extremist content by notice to companies: we currently get only this headline figure. We know nothing more about these takedowns. They might never have been viewed, except by the police, or they might have been very influential.

The results of the government's campaign to remove extremist material may be to push extremists towards more private or censor-proof platforms. That may impact the ability of the authorities to surveil criminals and to remove material in the future. We may regret chasing extremists off major platforms, where their activities are in full view and easily used to identify activity and actors.

Whatever the wisdom of proceeding down this path, we need to be worried about the unwanted consequences of machine takedowns. Firstly, we are pushing companies to be the judges of legal and illegal. Secondly, all systems make mistakes and require accountability for them; mistakes need to be minimised, but also rectified.

Here is our list of questions that need to be resolved.

1 What really is the accuracy of this system?

Small error rates translate into very large numbers of errors at scale. We see this with more general internet filters in the UK, where our blocked.org.uk project regularly uncovers and reports errors.

How are the accuracy rates determined? Is there any external review of its decisions?

The government appears to recognise the technology has limitations. In order to claim a high accuracy rate, they say at least 6% of extremist video content has to be missed. On large platforms that would be a great deal of material needing human review. The government's own tool shows the limitations of their prior demands that technology "solve" this problem.

Islamic extremists are operating rather like spammers when they post their material. Just like spammers, their techniques change to avoid filtering. The system will need constant updating to keep a given level of accuracy.

2 Machines are not determining meaning

Machines can only attempt to pattern match, with the assumption that content and form imply purpose and meaning. This explains how errors can occur, particularly in missing new material.

3 Context is everything

The same content can, in different circumstances, be legal or illegal. The law defines extremist material as promoting or glorifying terrorism. This is a vague concept. The same underlying material, with small changes, can become news, satire or commentary. Machines cannot easily determine the difference.

4 The learning is only as good as the underlying material

The underlying database is used to train machines to pattern match. Therefore the quality of the initial database is very important. It is unclear how the material in the database has been deemed illegal, but it is likely that these are police determinations rather than legal ones, meaning that inaccuracies or biases in police assumptions will be repeated in any machine learning.

5 Machines are making no legal judgment

The machines are not making a legal determination. This means a company's decision to act on what the machine says is absent of clear knowledge. At the very least, if material is "machine determined" to be illegal, the poster, and users who attempt to see the material, need to be told that a machine determination has been made.

6 Humans and courts need to be able to review complaints

Anyone who posts material must be able to get human review, and recourse to courts if necessary.

7 Whose decision is this exactly?

The government wants small companies to use the database to identify and remove material. If material is incorrectly removed, perhaps appealed, who is responsible for reviewing any mistake?

It may be too complicated for the small company. Since it is the database product making the mistake, the designers need to act to correct it so that it is less likely to be repeated elsewhere.

If the government want people to use their tool, there is a strong case that the government should review mistakes and ensure that there is an independent appeals process.

8 How do we know about errors?

Any takedown system tends towards overzealous takedowns. We hope the identification system is built for accuracy and prefers to miss material rather than remove the wrong things, however errors will often go unreported. There are strong incentives for legitimate posters of news, commentary, or satire to simply accept the removal of their content. To complain about a takedown would take serious nerve, given that you risk being flagged as a terrorist sympathiser, or perhaps having to enter formal legal proceedings.

We need a much stronger conversation about the accountability of these systems. So far, in every context, this is a question the government has ignored. If this is a fight for the rule of law and against tyranny, then we must not create arbitrary, unaccountable, extra-legal censorship systems.

 

 

Artificial intelligence expectations...

Amber Rudd calls for the AI blocking of terrorist content before it is posted


Link Here 11th November 2017
Home Secretary Amber Rudd told an audience at New America, a Washington think tank, on Thursday night that there was an online arms race between militants and the forces of law and order.

She said that social media companies should press ahead with development and deployment of AI systems that could spot militant content before it is posted on the internet and block it from being disseminated.

Since the beginning of 2017, violent militant operatives have created 40,000 new internet destinations, Rudd said. As of 12 months ago, social media companies were taking down about half of the violent militant material from their sites within two hours of its discovery, and lately that proportion has increased to two thirds, she said.

YouTube is now taking down 83% of violent militant videos it discovers, Rudd said, adding that UK authorities have evidence that the Islamic State was now struggling to get some of its materials online.

She added that, in the wake of an increasing number of vehicle attacks by Islamic terrorists, British security authorities were reviewing rental car regulations and considering ways for authorities to collect more relevant data from car hire companies.

 

 

Offsite Article: Demos think tank doesn't buy the idea that terrorists are radicalised by internet content...


Link Here 8th November 2017
The truth is that a lot of the material that terrorists share is not actually illegal at all. Instead, it often consists of news reports about perceived injustices in Palestine, stuff that you could never censor in a free society.

See article from news.sky.com

 

 

Update: So what are the chances that this will have any measurable effect on terrorism?...

May calls on website services to remove terrorist content within 2 hours


Link Here 21st September 2017

Theresa May is to urge internet companies to take down extremist content being shared by terrorist groups within two hours, during a summit with the French president and the Italian prime minister.

Home Office analysis shows that Isis shared 27,000 links to extremist content in the first five months of 2017 and, once shared, the material remained available online for an average of 36 hours. The government would like that reduced to two hours, and ultimately it is urging companies to develop technology to spot material early and prevent it being shared in the first place.

The issue is of particular concern after last week's attack on a London Underground train at Parsons Green, and follows a British thinktank report, which found that online jihadist propaganda attracts more clicks in Britain than anywhere else in Europe.

Extremist material is shared very rapidly when it is first published in what experts call a starburst effect: more than two-thirds of shares take place within the first two hours, so reducing the amount of time the material is visible can drastically squeeze the number of users who see it.

A government source noted that once an internet user has shown interest in extremist content, the web giants' algorithms keep pushing similar material towards them online. We want them to break the echo chambers, he said.

 

 

More censors to solve Britain's terrorism problem...

Government pushes for the likes of Facebook to employ thousands of censors to vet people's posts before they are published


Link Here 8th September 2016

Government censors are struggling to stop the spread of extremist messages on the internet despite taking down 1,000 videos a week, the Home Secretary has admitted. Amber Rudd said she was in talks with social media websites about setting up a new industry standard board to agree the rules setting out when sites should be taken down.

The new home secretary was grilled by MPs on the House of Commons' Home Affairs committee about what more could be done to force US sites like Twitter, Facebook and YouTube to take action.

Rudd said that major internet companies could take more responsibility:

Because the speed these damaging videos get put up and then we manage to take down -- at the moment we are taking down 1,000 a week of these sites -- is too slow compared to the speed at which they are communicated.

I do think more can be done and we are in discussions with industry to see what more they are prepared to do.

We would like to see a form of industry standard board that they could put together in order to have an agreement of oversight and to take action much more quickly on sites which will do such damage to people in terms of making them communicating terrorist information.

Rudd said the new industry standards board could be similar to an existing board which protects children from sexual exploitation, presumably referring to the IWF.

The committee's report said:

It is alarming that these companies have teams of only a few hundred employees to monitor networks of billions of accounts and that Twitter does not even proactively report extremist content to law enforcement agencies.

These companies are hiding behind their supranational legal status to pass the parcel of responsibility and refusing to act responsibly in case they damage their brands. If they continue to fail to tackle this issue and allow their platforms to become the 'Wild West' of the internet, then it will erode their reputation as responsible operators.

Internet companies should be required to co-operate with Britain's counter-extremism police and shut down accounts immediately.

 

 

Offsite Comment: Terrorist slippery slope...


Link Here 2nd September 2015
The UK government should look to what is happening to free expression in Egypt and Turkey before broadening terrorist laws to include those who spread hate. By Jodie Ginsberg

See article from opendemocracy.net

 

 

Updated: Ban Everything!...

David Cameron calls for more internet censorship of 'extremist' material. And no doubt the authorities will define 'extremist' as meaning more or less everything


Link Here 16th November 2014
David Cameron has called for governments around the world to do more to censor 'extremist' material online. He made his comments during a visit to Australia's Parliament. He said:

The root cause of the challenge we face is the extremist narrative. A new and pressing challenge is getting extremist material taken down from the Internet. There is a role for government in that. We must not allow the Internet to be an ungoverned space. But there is a role for companies too.

Cameron then went on to detail measures already being taken in the UK to combat online extremism, including adding supposedly extremist material to ISP blocking lists, improving reporting mechanisms and being more proactive in taking down supposedly harmful material.

The British government also recently revealed plans to reduce the amount of hate material online. However, a report released in May revealed that the proposal is experiencing a number of hurdles, including opposition from ISPs and social networks, particularly those based outside the UK.

Open Rights Group has responded to the announcement that ISPs will add extremist websites to filters designed to protect children from seeing adult content. Jim Killock, Executive Director, Open Rights Group said:

We need transparency whenever political content is blocked even when we are talking about websites that espouse extremist views. The government must be clear about what sites they think should be blocked, why they are blocking them and whether there will be redress for site owners who believe that their website has been blocked incorrectly.

Given the low uptake of filters, it is difficult to see how effective the government's approach will be when it comes to preventing young people from seeing material they have deemed inappropriate. Anyone with an interest in extremist views can surely find ways of circumventing child-friendly filters.

Update: Censorship button

16th November 2014. See article from bbc.co.uk

The UK's major internet service providers (ISPs) are to introduce new measures to tackle online extremism, Downing Street has said. The ISPs had committed to strengthening their filters and adding a public reporting button to flag terrorism-related material. In a briefing note, No 10 said the ISPs had subsequently committed to filtering out extremist and terrorist material, and hosting a button that members of the public could use to report content. It would work in a similar fashion to the reporting button that allows the public to flag instances of child sexual exploitation on the internet.

However, the BBC understands that while the ISPs agreed in principle to do more to prevent extremism, they have not actually committed to the measures outlined by No 10.

We have had productive dialogue with government about addressing the issue of extremist content online and we are working through the technical details, a spokeswoman for BT said. A spokesman for Sky said: We're exploring ways in which we can help our customers report extremist content online, including hosting links on our website. The plan presents logistical problems as extremist groups such as Isis typically use channels like YouTube or Twitter that are popular for entirely legal purposes.

 

 

Update: Reading a bit too much into 'following' someone on Twitter...

Social media told to counter terrorist propaganda with government propaganda


Link Here 24th October 2014

Senior British executives from Twitter, Google and Facebook were summoned to Downing Street on Thursday and told to do more to take action to curb the online activities of extremists. The Home Office and Crown Prosecution Service are in talks about using court orders to ensure that ISPs immediately remove extremist propaganda.

The warning came as it transpired that Britain's most high-profile radical Islamist preacher, Anjem Choudary, had influenced the man involved in the Ottawa attack. Canadian terrorist Martin Ahmad Rouleau's Twitter account showed that he followed several radical preachers, including Choudary, who tweeted that he hoped that the Canadian attacker would be admitted to heaven.

However, Choudary said: The fact that someone follows you on Twitter does not mean you necessarily influenced him to do anything.

As part of the plans, the Government also wants to encourage social media sites to use so-called counter-speech tactics, which involves positive messages about Islam online to prevent extremists monopolising websites.

 

 

Update: Religious ASBOs...

Theresa May responds to China's call to censor religious extremism on the internet


Link Here 30th September 2014
China called on Saturday for a worldwide crackdown on the use of the Internet by religious extremists and terrorists to stamp out their ability to communicate their ideas and raise funds.

China's Foreign Minister Wang Yi made the remarks during the annual gathering of the 193-nation U.N. General Assembly in New York. He said:

As new developments emerge in the global fight against terrorism, the international community should take new measures to address them.

In particular, it should focus on combating religious extremism and cyber terrorism, resolutely eliminate the roots and block channels of spreading terrorism and extremism.

Theresa May responded on Tuesday for the British government.

She announced policies for new Extremist Disruption Orders. Extremists will have to get posts on Facebook and Twitter approved in advance by the police under sweeping rules planned by the Conservatives. They will also be barred from speaking at public events if they represent a threat to the functioning of democracy, under the new Extremist Disruption Orders.

Theresa May, the Home Secretary, will lay out plans to allow judges to ban people from broadcasting or protesting in certain places, as well as associating with specific people.

The Home Secretary will also introduce banning orders for extremist groups, which would make it a criminal offence to be a member of or raise funds for a group that spreads or promotes hatred. The maximum sentence could be up to 10 years in prison.

 

 

Offsite Article: The internet censorship programme you're not allowed to know about...


Link Here 28th March 2014
The secretive development of a process to censor websites associated with terrorism. By Jane Fae

See article from politics.co.uk

 

 

Update: Unsavoury Censorship...

British Government granted facility for Google to instantly consider takedown requests for YouTube videos. Also the Government wants to censor unsavoury but not illegal content


Link Here 14th March 2014
Google gives UK internet censors 'super flagger' status, giving their requests to take down YouTube videos high priority.

YouTube will instantly screen any content flagged by British security officials. The censors will be able to flag multiple videos at scale rather than needing to flag each offending video.

The UK's security and immigration minister, James Brokenshire, worryingly told the Financial Times the government has to do more to deal with material that may not be illegal but certainly is unsavoury and may not be the sort of material that people would want to see or receive.

Brokenshire also said issues being considered by the government included a code of conduct for internet service providers and companies. The government, he added, was also keen to explore options where search engines and social media sites change their algorithms so that unsavoury content is less likely to appear or is served up with more balanced material.

Google confirmed that the Home Office had been given powerful flagging permissions on YouTube but stressed that Google itself still retained the ultimate decision on whether to remove content for breaching its community guidelines.

 

 

Update: A Dangerous Fantasy...

Make-believe 'terrorist' jailed for 5 years for posting terrorist material


Link Here 19th January 2013

A total fantasist who posted gruesome videos on Facebook of al-Qaeda beheading captives has been jailed for five years. Craig Slee pleaded guilty to four offences under the 2006 Terrorism Act and also admitted possession of a can of CS gas.

On sentencing him at Preston Crown Court, Judge Anthony Russell QC said:

It beggars belief that anyone can have an interest in such material which reveals a shocking and barbaric depravity and complete absence of any degree of humanity.

Slee also put online links to a communique by al-Qaeda in the Islamic Maghreb (AQIM), claiming those from the west were Crusaders and encouraging terrorism.

The court heard that Slee created a false identity and set up a Facebook page using the alter-ego Hashim X Shakur. Slee claimed to be a Muslim and provided personal information about himself, the majority of which was false. He also engaged in Facebook chat with other people and kept up the pretence of his alter-ego, claiming he had been on trips to Jalalabad, had suffered shrapnel injuries and implying he was a member of the Taliban, said police. However, the court heard Slee had no connection to the Taliban, al-Qaeda or any other terrorist network or organisation.

Det Ch Supt Tony Mole, head of the North West Counter Terrorism Unit, said:

It is clear that Slee was a total fantasist. He had no links whatsoever to any terrorist organisations, was not a radical convert and there is no evidence whatsoever to suggest he engaged in any attack planning.

While Slee may not have been planning any sort of attack, he could easily have influenced someone else with the propaganda he was uploading.

 

6th February 2012

Update: Radical Findings...

Parliamentary Committee finds that ISPs should monitor the internet for websites radicalising religious extremists

Websites should be monitored and material that promotes violent extremism should be removed. A nine-month inquiry by the Commons home affairs select committee concluded the internet is a fertile breeding ground for terrorism and plays a part in most, if not all, cases of violent radicalisation.

ISPs should be more active in monitoring sites and the government should work with them to develop a code of practice for removing material that could lead to radicalisation, the report said.

The inquiry found that the internet played a greater role in violent radicalisation than prisons, universities or places of worship, and was now one of the few unregulated spaces where radicalisation is able to take place.

But it added that a sense of grievance was key, and direct personal contact with radicals was a significant factor. The government's counter-terrorism strategy should show the British state is not antithetical to Islam, the committee said. Keith Vaz, its chairman, said:

More resources need to be directed to these threats and to preventing radicalisation through the internet and in private spaces. These are the fertile breeding grounds for terrorism.

The July 7 bombings in London, carried out by four men from West Yorkshire, were a powerful demonstration of the devastating and far-reaching impact of home-grown radicalisation.

We remain concerned by the growing support for non-violent extremism and more extreme and violent forms of far-right ideology.

He added that a policy of engagement, not alienation, would prevent radicalisation and called for the government's counter-radicalisation strategy Prevent to be renamed Engage.

Nick Pickles, director of civil liberties and privacy group Big Brother Watch, said:

Whatever the reason for blocking online content, it should be decided in court and not by unaccountable officials.

There is a serious risk that this kind of censorship not only makes the internet less secure for law-abiding people, but drives underground the real threats and makes it harder to protect the public.

 

30th July 2011

Update: Inciting Violence...

First conviction under law against inciting religious hatred

Jailing Bilal Zaheer Ahmad for 12 years, Mr Justice Royce said he was sending out a loud and clear warning that Britain would not tolerate extremists preaching messages of hate and violence.

Ahmad who called on Muslims to murder MPs who supported the Iraq war, was the first person to be found guilty of inciting religious hatred under new laws banning the publication of inflammatory material.

The IT worker praised 21-year-old university student Roshonara Choudhry as a heroine for stabbing Stephen Timms in east London in May last year. Ahmad called on other Muslims to follow in her footsteps by attacking and killing politicians who had voted to support the war in Iraq. He posted a full list of MPs and provided an internet link to their personal contact details, suggesting constituency surgeries were a good place to encounter them in person.

The judge told Ahmad: You purport to be a British citizen, but what you stand for is totally alien to what we stand for in our country. You became a viper in our midst willing to go as far as possible to strike at the heart of our system.

 

8th April 2011

Update: 'Offensive' Terrorism?...

Government advertise website to report terrorism and extremism on the internet

Information leaflets and posters have been sent to every police force in the UK advising the public on how to identify and report offensive or illegal terrorism related content.

Security minister, Baroness Neville-Jones, said that it's vital that online extremism is taken seriously: I want to encourage those who come across extremist websites as part of their work to challenge it and report it through the DirectGov webpage.

By forging relationships with the internet industry and working with the public in this way, we can ensure that terrorist use of the internet does not go unchallenged.

Websites reported to Directgov via its online form are referred to the national Counter Terrorism Internet Referral Unit. The specialist team of police experts work with industry and partners in the UK and abroad to investigate and take down illegal or offensive material if necessary.

In the last year, reporting through Directgov has led the government to remove content which has included beheading videos, terrorist training manuals and calls for racial or religious violence.

The Reporting extremism and terrorism online website defines what content is of interest:

What makes offensive content illegal

Not all offensive content is illegal.

The Terrorism Acts 2000 and 2006 made it illegal to:

  • have or share information that could be useful to terrorists
  • share information that urges people to commit or help with acts of terrorism
  • glorify or praise terrorism

Examples of what makes terrorist or extremist content illegal are:

  • speeches or essays calling for racial or religious violence
  • videos of violence with messages of praise for the attackers
  • chat forums with postings calling for people to commit acts of terrorism
  • messages intended to stir up hatred against any religious or ethnic group
  • instructions on how to make weapons, poisons or bombs

 

11th November 2010

Update: Inciting Violence...

Man arrested for website encouraging attacks on MPs over Iraq war

Police have arrested a man on suspicion of encouraging Muslims to attack MPs.

The individual is thought to be involved with a website that praised the stabbing of the MP Stephen Timms and published a list of other MPs who voted for the war in Iraq, along with details of where to buy a knife.

West Midlands Counter Terrorism Unit arrested the man and conducted a search of his home in the Dunstall area of Wolverhampton. Officers seized computer and electronic equipment, police said.

The man was being questioned under section one of the Terrorism Act 2006 on suspicion of encouraging an act of terrorism.

Detective Chief Inspector John Denley said: We are treating the contents and implications of this blog very seriously, and have taken action this morning to progress our investigation.

The website, Revolution Muslim, was hosted in Bellevue, Washington, and was taken down by the Americans at the request of the Home Office.

The website praised Roshonara Choudhry, who tried to stab Timms to death during a constituency surgery in Beckton, East London.

The website said: We ask Allah for her action to inspire Muslims to raise the knife of jihad against those who voted for the countless rapes, murders, pillages, and torture of Muslim civilians as a direct consequence of their vote.

The statement added: If you want to track an MP, you can find out their personal website after typing their name in this website.

A link on the website took the reader to the site of Tesco Direct for a £15 kitchen knife, similar to that used by Choudhry.

The site also featured videos and statements by Awlaki and by former members of al-Muhajiroun, Anjem Choudary and Omar Bakri Mohammed.

Update: Charged

21st November 2010. See article from  guardian.co.uk

A man appeared in court charged with soliciting murder and offences under the Terrorism Act in relation to a blog listing MPs it claimed voted for the Iraq war. Bilal Zaheer Ahmad, from Wolverhampton, was arrested last week over the blog, which allegedly called for action against the MPs.

The details appeared on a website that was said to have radicalised a young woman who went on to stab the former minister Stephen Timms during an advice surgery in east London in revenge for the Iraq war. Ahmad appeared handcuffed as he stood between two security officers in the dock at London's City of Westminster Magistrates' Court.

He was remanded in custody to appear at the Old Bailey on 29 November.

 

12th December 2009

Update: Glorified Censorship...

Student given 6 months in jail for DVD containing scenes of terrorist atrocities

A Pakistani student was sentenced to six months in prison for sending a DVD containing scenes of terrorist atrocities to his neighbours.

Illegal immigrant Bilal Malik, a student at Dundee University, admitted a breach of the peace.

He faces deportation after serving his sentence.

 

17th November 2009

Update: Glorifying Censorship...

No records kept of action against websites promoting terrorism

The Terrorism Act 2006 granted powers for police to compel web hosts to shut down websites promoting terrorism. But the powers have never been used, and forces have instead persuaded providers to take down websites voluntarily, according to the security minister Lord West.

He told the Lords on Wednesday that he could not say how many websites have been censored because no records have been kept.

When we passed the Act in 2006, we laid down a requirement to make such records, but it has not really been done, he said.

When measures against extremist websites were announced, the government suggested ISPs might introduce filtering arrangements similar to the Internet Watch Foundation's blocklist of URLs leading to images of child abuse. No system has emerged, however, and industry sources say the idea is not being discussed.

 

15th February 2009

 Offsite: A Stalled Jihad...

Jacqui's jihad on web extremism flops

See article from theregister.co.uk

 

3rd November 2008

 Offsite: A victory for the terrorists...

Website censorship erodes the very freedoms that the home secretary purports to defend

See article from guardian.co.uk




 
