Programming & IT Tricks

Thursday, April 5, 2018

Is Your Phone's Battery Draining Too Fast?


On today's smartphones, heavy mobile data use can drain the battery in three to four hours. We then assume the phone's battery is wearing out, and all that time online burns through our data pack as well. But there is no cause for worry: a few changes to your smartphone's settings can make the trouble go away.
Change these settings on your phone
Several settings on the phone run constantly in the background, consuming data and draining the battery. Let's turn those settings off.


 
Uninstall apps you don't use from your phone, or adjust their settings.
If you want to save your phone's battery life and data, turn off the following settings.
Go to your phone's Settings and find the Google option. Tap it and a list of options will appear, covering everything from data management to games and installed apps.
Tap the PLAY GAMES option to see several settings. There, turn off 'Sign in to games automatically' and 'Use this account to sign in'.
Below the Play Games settings, also turn off the Request Notification option.

Tuesday, January 23, 2018

AI Innovation: Security and Privacy Challenges


To anyone working in technology (or, really, anyone on the Internet), the term “AI” is everywhere. Artificial intelligence — technically, machine learning — is finding application in virtually every industry on the planet, from medicine and finance to entertainment and law enforcement. As the Internet of Things (IoT) continues to expand, and the potential for blockchain becomes more widely realized, machine learning’s growth will extend into these areas as well.
While current technical constraints limit these models from reaching “general intelligence” capability, organizations continue to push the bounds of ML’s domain-specific applications, such as image recognition and natural language processing. Modern computing power (GPUs in particular) has contributed greatly to these recent developments — which is why it’s also worth noting that quantum computing may accelerate this progress dramatically over the next several years.
Alongside enormous growth in this space, however, has been increased criticism; from conflating AI with machine learning to relying on those very buzzwords to attract large investments, many “innovators” in this space have drawn criticism from technologists as to the legitimacy of their contributions. Thankfully, there’s plenty of room — and, by extension, overlooked profit — for innovation with ML’s security and privacy challenges.

Reverse-Engineering

Machine learning models, much like any piece of software, are prone to theft and subsequent reverse-engineering. In late 2016, researchers at Cornell Tech, the Swiss Institute EPFL, and the University of North Carolina reverse-engineered a sophisticated Amazon AI by analyzing its responses to only a few thousand queries; their clone replicated the original model’s output with nearly perfect accuracy. The process is not difficult to execute, and once completed, hackers will have effectively “copied” the entire machine learning algorithm — which its creators presumably spent generously to develop.
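To make the attack concrete, here is a minimal sketch of query-based model extraction in Python. It is an illustrative assumption, not the Cornell Tech setup: a local scikit-learn model stands in for the remote API, and victim_predict is a hypothetical stand-in for the prediction endpoint an attacker would call.

```python
# Sketch of query-based model extraction. The "victim" stands in for a
# remote prediction API; the attacker sees only inputs and output labels.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
victim = DecisionTreeClassifier(max_depth=4).fit(
    rng.random((200, 4)), rng.integers(0, 2, 200))

def victim_predict(x):
    # Black-box oracle: in a real attack this would be an HTTP call.
    return victim.predict(x)

# 1. Probe the oracle with a few thousand synthetic queries.
queries = rng.random((3000, 4))
labels = victim_predict(queries)

# 2. Fit a substitute model to the harvested (query, label) pairs.
substitute = DecisionTreeClassifier(max_depth=8).fit(queries, labels)

# 3. Check how often the substitute agrees with the victim on fresh inputs.
test = rng.random((500, 4))
agreement = (substitute.predict(test) == victim_predict(test)).mean()
print(f"substitute/victim agreement: {agreement:.1%}")
```

The attacker never sees the victim's parameters or training data; a few thousand labeled queries are enough to train a substitute that behaves like the original.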
The risk this poses will only continue to grow. In addition to the potentially massive financial costs of intellectual property theft, this vulnerability also poses threats to national security — especially as governments pour billions of dollars into autonomous weapon research.
While some researchers have suggested that increased model complexity is the best solution, there hasn’t been nearly enough open work done in this space; it’s a critical (albeit underpublicized) opportunity for innovation — all in defense of the multi-billion-dollar AI sector.

Adversarial “Injection”

Machine learning also faces the risk of adversarial “injection” — sending malicious data that disrupts a neural network’s functionality. Last year, for instance, researchers from four top universities confused image recognition systems by adding small stickers to road signs, through what they termed Robust Physical Perturbation (RP2) attacks; the networks in question then misclassified the altered signs. Another team at NYU showed a similar attack against a facial recognition system, which would allow a suspect individual to easily escape detection.
Not only is this attack a threat to the network itself (consider, for example, its consequences for a self-driving car), but it’s also a threat to companies that outsource their AI development and risk contractors putting their own “backdoors” into the system. Jaime Blasco, Chief Scientist at security company AlienVault, points out that this risk will only increase as the world depends more and more on machine learning. What would happen, for instance, if these flaws persisted in military systems? Law enforcement cameras? Surgical robots?
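The underlying mechanics are easy to demonstrate at a small scale. The sketch below is not RP2 itself, just the simplest gradient-style version of the same idea applied to a toy linear classifier; every value in it is an illustrative assumption.

```python
# Sketch of a gradient-style adversarial perturbation: a small, targeted
# nudge to the input flips a toy linear classifier's decision.
import numpy as np

rng = np.random.default_rng(0)
w, b = rng.normal(size=16), 0.1          # weights of a toy classifier

def predict(x):
    return int(x @ w + b > 0)

x = rng.normal(size=16)                  # the clean input
score = x @ w + b

# For a linear model, the gradient of the score wrt the input is just w.
# Step against the current class, scaled to barely cross the boundary.
eps = (abs(score) + 0.1) / np.abs(w).sum()
x_adv = x - np.sign(score) * eps * np.sign(w)

print("clean prediction:    ", predict(x))
print("perturbed prediction:", predict(x_adv))
print("max per-pixel change:", float(np.abs(x_adv - x).max()))
```

The largest per-component change is eps, typically a small fraction of the input's scale, which is exactly why such perturbations are so hard to spot.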

Training Data Privacy

Protecting the training data put into machine learning models is yet another area that needs innovation. Currently, hackers can reverse-engineer user data out of machine learning models with relative ease. Since the bulk of a model’s training data is often personally identifiable information — e.g. in medicine and finance — this means anyone from an organized crime group to a business competitor can reap economic reward from such attacks.
As machine learning models move to the cloud (e.g. self-driving cars), this becomes even more complicated; at the same time that users need to privately and securely send their data to the central network, the network needs to make sure it can trust the user’s data (so tokenizing the data via hashing, for instance, isn’t necessarily an option). This challenge extends, once again, to everything from mobile phones to weapons systems.
Further, as organizations seek personal data for ML research, their clients might want to contribute to the work (e.g. improving cancer detection) without compromising their privacy (e.g. providing an excess of PII that just sits in a database). These two interests currently seem at odds — but they also aren’t receiving much focus, so we shouldn’t see this opposition as inherent. Smart redesign could easily mitigate these problems.
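One family of techniques aimed squarely at this trade-off is differential privacy. Below is a minimal sketch of the Laplace mechanism over a single aggregate query; the dataset, clipping bounds, and epsilon are illustrative assumptions, and a real deployment would also need to track a privacy budget across queries.

```python
# Sketch of the Laplace mechanism: publish an aggregate statistic while
# calibrated noise masks any single record's contribution.
import numpy as np

rng = np.random.default_rng(1)
ages = rng.integers(20, 80, size=10_000)   # pretend this column is PII

def dp_mean(values, lo, hi, epsilon):
    # Clipping bounds each record's influence on the mean to (hi - lo) / n;
    # Laplace noise at scale sensitivity / epsilon then gives
    # epsilon-differential privacy for this single query.
    clipped = np.clip(values, lo, hi)
    sensitivity = (hi - lo) / len(clipped)
    return clipped.mean() + rng.laplace(scale=sensitivity / epsilon)

print("true mean:   ", ages.mean())
print("private mean:", dp_mean(ages, 20, 80, epsilon=0.5))
```

The organization still gets a usable statistic for its research, while no individual record can be confidently reverse-engineered out of the published number.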

Conclusion

In short: it’s time some innovators in the AI space focused on its security and privacy issues. With the world increasingly dependent on these algorithms, there’s simply too much at stake — including a lot of money for those who address these challenges.

Friday, January 19, 2018

A Brief History of Cloud Computing


Threat Intel’s ‘History of…’ series will look at the origins and evolution of notable developments in cyber security.

What exactly is cloud computing? This is something that, no doubt, most people have wondered about in recent times, as more and more of the services we use migrate to the semi-mythical “cloud”.
One dictionary defines cloud computing as: “Internet-based computing in which large groups of remote servers are networked so as to allow sharing of data-processing tasks, centralized data storage, and online access to computer services or resources.” Users no longer need vast local infrastructure to store data or carry out certain tasks; they can do it all “in the cloud”, which essentially means over the internet.
If we go back to the very beginning, we can trace cloud computing’s origins all the way back to the 1950s, and the concept of time-sharing. At that time, computers were both huge and hugely expensive, so not every company could afford to have one. To tackle this, users would “time-share” a computer. Basically, they would rent the right to use the computer’s computational power, and share the cost of running it. In a lot of ways, that remains the basic concept of cloud computing today.
In the 1970s, the creation of “virtual machines” took the time-share model to a new level. This development allowed multiple computing environments to be housed in one physical environment. This was a key development that made it possible for the cloud computing we know today to develop.
Professor Ramnath Chellappa is often credited with being the person who coined the term “cloud computing” in its modern context, at a lecture he delivered in 1997. He defined it then as a “computing paradigm where the boundaries of computing will be determined by economic rationale rather than technical limits alone.” However, some months before this, in 1996, a business plan created by a group of technologists at Compaq also used the term when discussing the “evolution” of computing. So, while the source of the expression might be in dispute, it is clear that the modern “cloud” was something that was being seriously thought about by those in the IT industry in the mid ’90s — 20 years ago.

Modern developments

In 2006, Amazon launched Amazon Web Services (AWS), which provided services such as computing and storage in the cloud. Back then, you could rent computer power or storage from Amazon by the hour. Nowadays, you can rent more than 70 services, including analytics, software and mobile services. Its S3 storage service holds reams of data and services millions of requests every second. Amazon Web Services is used by more than one million customers in 190 countries. Massive companies including Netflix and AOL don’t have their own data centers but exclusively use AWS. Its projected revenue for 2017 was $18 billion.
While the other major tech players, such as Microsoft (with Azure) and Google, did subsequently launch their own cloud offerings to compete with AWS, AWS still dominates the cloud infrastructure market; according to recent reports, at the end of 2017 it held a 62 percent share of the public cloud business, with Microsoft Azure holding 20 percent and Google 12 percent. While AWS is still way ahead of its rivals in this space, it is interesting to note that its market share dropped from the previous year, while both Microsoft’s and Google’s shares grew.
While AWS dominates in the enterprise space, consumers are probably most familiar with services like Dropbox, iCloud and Google Drive, which they use to store back-ups of photos, documents, and more. The rise of mobile devices with smaller storage capacities has increased the need for cloud-based storage among consumers. While they may not understand exactly what the cloud is, most consumers are likely using at least one cloud-based service. In many ways, the cloud has enabled the growth of the mobile economy, allowing for the development of apps that would not have been possible without a cloud infrastructure.
In organizations, cloud usage is even more widespread. The Symantec ISTR 2017 showed that the average enterprise has 928 cloud apps in use, though many businesses don’t realize their employees are using so many cloud services.
The growth of mobile devices led to an inevitable growth in cloud usage by consumers

Security concerns

However, while there are many advantages to cloud computing, and many reasons why companies and individuals use cloud services, it does present some security concerns. One of the appeals of information stored in the cloud is that it can be accessed remotely; if inadequate security protocols are in place, however, that same remote access becomes one of its weaknesses. There have been many stories in the news about Amazon S3 buckets being left unsecured on the web and exposing people’s personal information. Since cloud computing is unlikely to go away, the answer to these kinds of issues is more likely to be improving people’s cyber security practices: protecting data stored online with strong passwords and additional measures such as two-factor authentication and encryption.
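As one concrete example of such hardening, the sketch below uses boto3 (with today's API, which postdates this article) to turn on every “Block Public Access” setting for each S3 bucket in an account. It assumes AWS credentials are already configured, and it is a starting point, not a complete security policy.

```python
# Sketch: enumerate an account's S3 buckets and block all public access.
import boto3

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    # Enable all four public-access blocks for this bucket.
    s3.put_public_access_block(
        Bucket=name,
        PublicAccessBlockConfiguration={
            "BlockPublicAcls": True,
            "IgnorePublicAcls": True,
            "BlockPublicPolicy": True,
            "RestrictPublicBuckets": True,
        },
    )
    print(f"public access blocked for: {name}")
```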
The adoption of cloud was almost inevitable in our hyper-connected world. Meeting the need for computing power and storage in-house simply became too expensive and too demanding for many businesses and individuals, so they farmed those tasks out to cloud services. As the move to mobile continues to escalate, and as the Internet of Things (IoT) continues to grow as a sector, cloud computing is set to continue its growth.
It may have started out as a marketing term, but cloud computing is an important reality in today’s IT world.
Check out the Security Response blog and follow Threat Intel on Twitter to keep up-to-date with the latest happenings in the world of threat intelligence and cybersecurity.
