Latest YouTube Video

Saturday, April 14, 2018

Orioles: 2B Jonathan Schoop (oblique) placed on 10-day DL (ESPN)

from ESPN https://ift.tt/1eW1vUH
via IFTTT

Hackers Have Started Exploiting Drupal RCE Exploit Released Yesterday

Hackers have started exploiting a recently disclosed critical vulnerability in Drupal shortly after the public release of working exploit code. Two weeks ago, the Drupal security team discovered a highly critical remote code execution vulnerability, dubbed Drupalgeddon2, in its content management system software that could allow attackers to completely take over vulnerable websites.


from The Hacker News https://ift.tt/2vfkLR6
via IFTTT

Martian Chiaroscuro


Deep shadows create dramatic contrasts between light and dark in this high-resolution close-up of the martian surface. Recorded on January 24, 2014 by the HiRISE camera onboard the Mars Reconnaissance Orbiter, the scene spans about 1.5 kilometers. From 250 kilometers above the Red Planet the camera is looking down at a sand dune field in a southern highlands crater. Captured when the Sun was about 5 degrees above the local horizon, only the dune crests were caught in full sunlight. A long, cold winter is coming to the southern hemisphere and bright ridges of seasonal frost line the martian dunes. via NASA https://ift.tt/2JJcIPN

Friday, April 13, 2018

Re: [FD] new email; gw22067@hotmail.com | Double-free segfault bypass

Re: [FD] CVE-2018-7539 Directory Traversal on Appear TV Maintenance centre 8088

[FD] Call for Papers: USENIX Workshop on Offensive Technologies (WOOT '18)

Dear all,

We are pleased to announce the Call for Papers for the 12th USENIX Workshop on Offensive Technologies! WOOT '18 will be held on August 13–14, 2018, in conjunction with USENIX Security in Baltimore, MD, USA.

WOOT provides a forum for high-quality, peer-reviewed work discussing tools and techniques for attack. Submissions should reflect the state of the art in offensive computer security technology, exposing poorly understood mechanisms, presenting novel attacks, or surveying the state of offensive operations at scale. WOOT '18 welcomes papers in both an academic security context and more applied work that informs the field about the state of security practice in offensive techniques.

Topics of interest include, but are not limited to:
* Application security and vulnerability research
* Attacks against privacy
* Attacks on virtualization and the cloud
* Browser and general client-side security
* Hardware attacks
* Internet of Things
* Malware design, implementation, and analysis
* Network and distributed systems attacks
* Offensive applications of formal methods
* Offensive aspects of mobile security
* Offensive technologies using (or against) machine learning
* Operating systems security
* Practical attacks on deployed cryptographic systems

Paper submissions are due by Wednesday, May 30, 2018. Please read through the complete Call for Papers for additional details and instructions: https://ift.tt/2uZNmKd

We look forward to receiving your submissions!

Christian Rossow, CISPA
Yves Younan, Cisco Talos
WOOT '18 Program Co-Chairs

Source: Gmail -> IFTTT-> Blogger

[FD] Strong Password Generator - Biased Randomness

Hi list! I am posting my findings here in hopes some may find them interesting, and to provide yet another example of why not to trust browser extensions blindly. This is the top Chrome Web Store search result for "password generator". It has 35,000 users. This was hardly a difficult bug to find, taking just a few minutes, but I was unable to find it documented anywhere else. https://ift.tt/1cC1i8h This password generator contains a logic flaw which results in some characters appearing more frequently than others in generated passwords. On 2018-03-27, I reported this issue using the "support" section of the extension's page on the Chrome Web Store. On 2018-04-10, I checked back again to discover that the support section for this extension has been switched off. This extension was last updated over 5 years ago. I have concluded that the developer isn't interested in it anymore. In fairness, despite this flaw, this password generator is still probably better than no password generator at all.
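The flawed code itself isn't shown in the post, but the classic way a generator ends up favouring some characters is modulo bias. Here is a minimal Python illustration of that class of bug (not the extension's actual code):

```python
import string

ALPHABET = string.ascii_letters + string.digits   # 62 characters

def biased_pick(byte):
    # Mapping a 0-255 byte onto 62 characters with a plain modulo
    # over-represents the first 256 % 62 = 8 characters.
    return ALPHABET[byte % len(ALPHABET)]

# Tally every possible byte value once: even a perfectly uniform byte
# source yields a non-uniform character distribution.
counts = {c: 0 for c in ALPHABET}
for b in range(256):
    counts[biased_pick(b)] += 1

print(counts[ALPHABET[0]], counts[ALPHABET[-1]])   # -> 5 4
```

The first 8 characters of the alphabet each appear 5 times out of 256 draws, the remaining 54 only 4 times, which is exactly the kind of skew described above.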

Source: Gmail -> IFTTT-> Blogger

[FD] Defense in depth -- the Microsoft way (part 53): our MSRC doesn't know how Windows handles PATH

[FD] KETAMINE: Multiple vulnerabilities in SecureRandom(), numerous cryptocurrency products affected.

A significant number of past and current cryptocurrency products contain a JavaScript class named SecureRandom(), containing both entropy collection and a PRNG. The entropy collection and the RNG itself are both deficient to the degree that key material can be recovered by a third party with medium complexity. There are a substantial number of variations of this SecureRandom() class in various pieces of software, some with bugs fixed, some with additional bugs added. Products that aren't vulnerable today because they moved to other libraries may still be using old keys that were previously compromised through the use of SecureRandom(). The most common variations of the library attempt to collect entropy from window.crypto's CSPRNG, but due to a type error in a comparison this function is silently stepped over without failing. Entropy is subsequently gathered from Math.random() (a 48-bit linear congruential generator, seeded by the time in some browsers) and a single execution of a medium-resolution timer. In some known configurations this system has substantially less than 48 bits of entropy. The core of the RNG is an implementation of RC4 ("arcfour random"), and the output is often used directly for the creation of private key material as well as cryptographic nonces for ECDSA signatures. RC4 is publicly known to have biases of several bits, which are likely sufficient for a lattice solver to recover an ECDSA private key given a number of signatures. One popular Bitcoin web wallet re-initialized the RC4 state for every signature, which makes the biases bit-aligned; in other cases the Special K would manifest itself over multiple transactions.
Necessary action:
* identify and move all funds stored using SecureRandom()
* rotate all key material generated by, or that has come into contact with, any piece of software using SecureRandom()
* do not write cryptographic tools in non-type-safe languages
* do not take the output of a CSPRNG and pass it through RC4
3CJ99vSipFi9z11UdbdZWfNKjywJnY8sT8
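The core failure mode, a feature check that silently falls through to a weak generator, is easy to illustrate. The sketch below is a Python analogue, not the actual JavaScript; the string-versus-number version comparison here stands in for the type error in the real library.

```python
import os
import random

def gather_entropy(app_version):
    # Analogue of the SecureRandom() flaw: the version check is done as
    # a *string* comparison, so "10.0" >= "5" is False (lexicographic
    # order) and the strong CSPRNG branch is silently skipped on
    # "newer" version strings.
    if app_version >= "5":                 # lexicographic, not numeric!
        return os.urandom(32)              # strong path (rarely taken)
    # Weak fallback: a deterministic, guessable generator stands in for
    # the time-seeded Math.random() path described in the advisory.
    rng = random.Random(1234)
    return bytes(rng.randrange(256) for _ in range(32))

strong = gather_entropy("5.0")    # takes the strong path
weak = gather_entropy("10.0")     # "10.0" < "5" as strings: weak path
```

No exception is raised on the weak path, which is the point: the caller has no indication that key material is coming from a predictable source.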

Source: Gmail -> IFTTT-> Blogger

Hackers Found Using A New Code Injection Technique to Evade Detection

While performing in-depth analysis of various malware samples, security researchers at Cyberbit found a new code injection technique, dubbed Early Bird, being used by at least three different sophisticated malware families, which helped attackers evade detection. As its name suggests, Early Bird is a "simple yet powerful" technique that allows attackers to inject malicious code into a legitimate process


from The Hacker News https://ift.tt/2EJ7H5S
via IFTTT

[FD] DSA-2018-071: Dell EMC ViPR Controller Information Exposure Vulnerability

-----BEGIN PGP SIGNED MESSAGE-----

Source: Gmail -> IFTTT-> Blogger

[FD] Microsoft account site using old cert

Hi, On 21-February-2018 I sent the following email to secure@microsoft.com. On the same day I received back a (probably automated) response email that a case was opened, with a case number. Today, 13-April-2018, it happened again, so I am sharing it. See also the SSL Labs test report at https://ift.tt/2HyHryt Cheers! Email subject: old cert Email body: "Hi guys, While logging out of my Microsoft Account, on the way out I was redirected via https://ift.tt/2GYR2NU; its cert expired on the 10th of May 2017… so I got a browser SSL error. Shame on you… for your cert management… Attached: the cert in base64 .cer format and a sample screen shot. The cert in base64:

Source: Gmail -> IFTTT-> Blogger

ISS Daily Summary Report – 4/12/2018

Plant Gravity Perception (PGP): The final Plant Gravity Perception experiment completed today with three of the Experiment Containers (ECs) maintaining power in the European Modular Cultivation System (EMCS).  The crew removed seed cassettes from the ECs and placed them in a Minus Eighty Degree Celsius Laboratory Freezer for ISS (MELFI).  For this investigation, normal and … Continue reading "ISS Daily Summary Report – 4/12/2018"

from ISS On-Orbit Status Report https://ift.tt/2H02I2H
via IFTTT

Popular Android Phone Manufacturers Caught Lying About Security Updates

The Android ecosystem is highly broken when it comes to security, and device manufacturers (better known as OEMs) make it even worse by not providing critical patches on time. According to a new study, most Android vendors have been lying to users about security updates, telling customers that their smartphones are running the latest updates. In other words, most smartphone manufacturers


from The Hacker News https://ift.tt/2GWrepI
via IFTTT

13 Reasons Why Cast Reads Personal Letters

To celebrate the one-year anniversary of hit drama 13 Reasons Why, and the show's incredible fans, Netflix released videos of the cast reading letters from real-life survivors of bullying and sexual assault relating their experiences and hope. In anticipation of season two, Netflix has also announced that it ...

from Google Alert - anonymous https://ift.tt/2GTuIoH
via IFTTT

anonymous

anonymous. Manuscripts: A. 2 (University Library, Cambridge University) · A. 3 (University Library, Cambridge University) · Add. 1080 (University Library, Cambridge University) · Add. 1084 (University Library, Cambridge University) · Add. 1086 (University Library, Cambridge University) · Add. 1095 ...

from Google Alert - anonymous https://ift.tt/2qtRnRL
via IFTTT

Thursday, April 12, 2018

Promise Campaign Launches Publicly With $100 Million Anonymous Gift

Promise Campaign Launches Publicly With $100 Million Anonymous Gift. Effort seeks to raise $625 million over five years to support student aid, grow the faculty and promote the liberal arts.

from Google Alert - anonymous https://ift.tt/2HvZK7d
via IFTTT

About Us



from Google Alert - anonymous https://ift.tt/2IPZvTQ
via IFTTT

Hacker Can Steal Data from Air-Gapped Computers through Power Lines

Do you think it is possible to extract data from a computer using its power cables? If not, then you should definitely read about this technique. Researchers from Israel's Ben Gurion University of the Negev—who focus mainly on finding clever ways to exfiltrate data from an isolated or air-gapped computer—have now shown how fluctuations in the current flow "propagated through the power lines"


from The Hacker News https://ift.tt/2qrMNmT
via IFTTT

ISS Daily Summary Report – 4/11/2018

Metabolic Tracking (MT): The crew set up MT hardware and materials for thawing and inoculation. They then injected the thawed inoculum into multiwell BioCells, which were inserted into the NanoRacks Plate Reader-2.  NanoRacks Plate Reader-2 is a laboratory instrument designed to detect biological, chemical or physical events of samples in microtiter plates. The Metabolic Tracking … Continue reading "ISS Daily Summary Report – 4/11/2018"

from ISS On-Orbit Status Report https://ift.tt/2JFc7OV
via IFTTT

I have a new follower on Twitter


Zoie Hackett

Minneapolis, MN

Following: 2595 - Followers: 1804

April 12, 2018 at 09:32AM via Twitter http://twitter.com/zoie_hackett

Emotional Wellness Talk: The Anonymous People (Film Screening)

Join us for a screening and conversation about 'The Anonymous People,' a feature documentary film about the over 23 million Americans living in long-term recovery from addiction to alcohol and other drugs. For more information visit, umbsextalk.com. For Disability-Related Accommodations, including ...

from Google Alert - anonymous https://ift.tt/2Hklrtl
via IFTTT

Flaw in Microsoft Outlook Lets Hackers Easily Steal Your Windows Password

A security researcher has disclosed details of an important vulnerability in Microsoft Outlook for which the company released an incomplete patch this month—almost 18 months after receiving the responsible disclosure report. The Microsoft Outlook vulnerability (CVE-2018-0950) could allow attackers to steal sensitive information, including users' Windows login credentials, just by convincing


from The Hacker News https://ift.tt/2HuaLWL
via IFTTT

M22 and the Wanderers


Wandering through the constellation Sagittarius, bright planets Mars and Saturn appeared together in early morning skies over the last weeks. They are captured in this 3 degree wide field-of-view from March 31 in a close celestial triangle with large globular star cluster Messier 22. Of course M22 (bottom left) is about 10,000 light-years distant, a massive ball of over 100,000 stars much older than our Sun. Pale yellow and shining by reflected sunlight, Saturn (on top) is about 82 light-minutes away. Look carefully and you can spot large moon Titan as a pinpoint of light at about the 5 o'clock position in the glare of Saturn's overexposed disk. Slightly brighter and redder Mars is 9 light-minutes distant. While both planets are moving on toward upcoming oppositions, by July Mars will become much brighter still, with good telescopic views near its 2018 opposition a mere 3.2 light-minutes from planet Earth. via NASA https://ift.tt/2qrEM27

Wednesday, April 11, 2018

2018 NFL Preseason Schedule Release: Ravens open against the Bears in Hall of Fame Game (ESPN)

from ESPN https://ift.tt/17lH5T2
via IFTTT

Ravens: New QB Robert Griffin III a "smarter player" after studying game during year off (ESPN)

from ESPN https://ift.tt/17lH5T2
via IFTTT

🔊 Orioles Interview: C Caleb Joseph walks you through the thought process of his recent web gems (ESPN)

from ESPN https://ift.tt/1eW1vUH
via IFTTT

Final draft: Ozzie Newsome puts last touches on his historic Ravens legacy - Jamison Hensley (ESPN)

from ESPN https://ift.tt/17lH5T2
via IFTTT

Ravens selecting a duo of first-ballot Hall of Famers in 1996 among NFL's most memorable draft moments - Jamison Hensley (ESPN)

from ESPN https://ift.tt/17lH5T2
via IFTTT

ISS Daily Summary Report – 4/10/2018

Human Research Program (HRP) Collections (Biochemical Profile and Repository):  A 54S crewmember collected urine samples for his FD15 sessions of the Biochem Profile and Repository investigations. The Biochemical Profile experiment tests blood and urine samples obtained from astronauts before, during, and after spaceflight. Specific proteins and chemicals in the samples are used as biomarkers, or … Continue reading "ISS Daily Summary Report – 4/10/2018"

from ISS On-Orbit Status Report https://ift.tt/2qo0WRY
via IFTTT

I have a new follower on Twitter


Max Foundry
We build https://t.co/ImWHxDBZAS, https://t.co/peOxO6YNd6 and https://t.co/xhj0WhDWYz
Columbus & San Francisco
http://t.co/dEwhk3iuKQ
Following: 13189 - Followers: 16152

April 11, 2018 at 05:47AM via Twitter http://twitter.com/MaxFoundry

Tuesday, April 10, 2018

Anonymous Tip



from Google Alert - anonymous https://ift.tt/2H8Ed71
via IFTTT

Warning: Your Windows PC Can Get Hacked by Just Visiting a Site

Can you get hacked just by clicking on a malicious link or opening a website? — YES. Microsoft has just released its April Patch Tuesday security updates, which address multiple critical vulnerabilities in its Windows operating systems and other products, five of which could allow an attacker to hack your computer just by tricking you into visiting a website. Microsoft has patched five


from The Hacker News https://ift.tt/2v4ULrx
via IFTTT

Facebook Offering $40,000 Bounty If You Find Evidence Of Data Leaks

Facebook pays millions of dollars every year to researchers and bug hunters to stamp out security holes in its products and infrastructure, but following the Cambridge Analytica scandal, the company today launched a bounty program to reward users for reporting "data abuse" on its platform. The move comes as Facebook CEO Mark Zuckerberg prepares to testify before Congress this week amid scrutiny


from The Hacker News https://ift.tt/2HopLW4
via IFTTT

[FD] secuvera-SA-2017-04: SQL-Injection Vulnerability in OCS Inventory NG ocsreports Web application

[FD] secuvera-SA-2017-03: Reflected Cross-Site-Scripting Vulnerabilities in OCS Inventory NG ocsreports Web application

Re: [FD] Shenzhen TVT Digital Technology Co. Ltd & OEM {DVR/NVR/IPC} API RCE

Missing in timeline: April 3, 2018: Vendor released advisory https://ift.tt/2EALTcS

Source: Gmail -> IFTTT-> Blogger

Re: [FD] new email; gw22067@hotmail.com | Double-free segfault bypass

Flaw in Emergency Alert Systems Could Allow Hackers to Trigger False Alarms

A serious vulnerability has been exposed in "emergency alert systems" that could be exploited remotely via radio frequencies to activate all the sirens, allowing hackers to trigger false alarms. The emergency alert sirens are used worldwide to alert citizens about natural disasters, man-made disasters, and emergency situations, such as dangerous weather conditions, severe storms, tornadoes


from The Hacker News https://ift.tt/2GQWcv8
via IFTTT

[FD] WP Image Zoom allows anybody to cause denial of service (WordPress plugin)

Details
================
Software: WP Image Zoom
Version: 1.23
Homepage: https://ift.tt/2qkx7m3
Advisory report: https://ift.tt/2GNuO19
CVE: Awaiting assignment
CVSS: 7.5 (High; AV:N/AC:L/Au:S/C:N/I:P/A:C)

Description
================
WP Image Zoom allows anybody to cause denial of service

Vulnerability
================
WP Image Zoom includes an AJAX action which allows any logged-in user to set any option to “1”. This means that any logged-in user can cause a denial of service for all WP URLs by setting the “template” option to “1”. Additionally, this vulnerability can be triggered via CSRF, meaning that anybody who can convince a logged-in user to follow a link can also cause a denial of service.

Proof of concept
================
Press the submit button in the following HTML snippet:

This will set the template option to 1, causing fatal errors for any WordPress URL. In a real attack the form could be set to autosubmit, so no user interaction is required except for following a link.

Mitigations
================
Upgrade to version 1.24 or later.

Disclosure policy
================
dxw believes in responsible disclosure. Your attention is drawn to our disclosure policy: https://ift.tt/1B6NWzd Please contact us on security@dxw.com to acknowledge this report if you received it via a third party (for example, plugins@wordpress.org) as they generally cannot communicate with us on your behalf. This vulnerability will be published if we do not receive a response to this report within 14 days.

Timeline
================
2018-03-20: Discovered
2018-03-27: Reported to author via https://ift.tt/2qjSjZu
2018-03-27: Vendor responded
2018-03-29: Vendor reported issue fixed in version 1.24

Discovered by dxw:
================
Tom Adams

Please visit security.dxw.com for more information.

Source: Gmail -> IFTTT-> Blogger

[FD] Rating-Widget: Star Review System allows anybody to turn on debug mode and view errors and warnings (WordPress plugin)

Details
================
Software: Rating-Widget: Star Review System
Version: 2.8.9
Homepage: https://ift.tt/1njiSoz
Advisory report: https://ift.tt/2Ex2Q81
CVE: Awaiting assignment
CVSS: 5 (Medium; AV:N/AC:L/Au:N/C:P/I:N/A:N)

Description
================
Rating-Widget: Star Review System allows anybody to turn on debug mode and view errors and warnings

Vulnerability
================
The plugin allows anybody to turn on debug mode and view errors and warnings. Errors and warnings should be turned off on production sites as they reveal information useful to attackers, such as paths, and may give hints as to how themes and plugins are written.

Proof of concept
================
  1. Add 1/0; to functions.php in the theme
  2. Enable this plugin
  3. Visit http://localhost/?rwdbg=true (you may need to view source, depending on the theme)
  4. You will see a PHP warning, including the path to your functions.php file

Mitigations
================
Upgrade to version 2.9.0 or later.

Disclosure policy
================
dxw believes in responsible disclosure. Your attention is drawn to our disclosure policy: https://ift.tt/1B6NWzd Please contact us on security@dxw.com to acknowledge this report if you received it via a third party (for example, plugins@wordpress.org) as they generally cannot communicate with us on your behalf. This vulnerability will be published if we do not receive a response to this report within 14 days.

Timeline
================
2017-10-30: Discovered
2017-11-02: Reported to vendor via email
2017-11-03: Vendor reports it will be fixed in the next release
2017-12-12: Vendor reports issue fixed

Discovered by dxw:
================
Tom Adams

Please visit security.dxw.com for more information.

Source: Gmail -> IFTTT-> Blogger

ISS Daily Summary Report – 4/09/2018

Plant Gravity Perception (PGP): The crew began the final Plant Gravity Perception experiment run on Saturday with three of the Experiment Containers maintaining power. For this investigation, normal and mutated forms of thale cress, a model research plant, are germinated to support the study of the plants’ gravity and light perception. Results provide insight into … Continue reading "ISS Daily Summary Report – 4/09/2018"

from ISS On-Orbit Status Report https://ift.tt/2H7xtpZ
via IFTTT

[FD] Like Button Rating ♥ LikeBtn allows anybody to set any option (WordPress plugin)

Details
================
Software: Like Button Rating ♥ LikeBtn
Version: 2.5.3
Homepage: https://ift.tt/1sqIK9v
Advisory report: https://ift.tt/2uYdrcu
CVE: Awaiting assignment
CVSS: 6.4 (Medium; AV:N/AC:L/Au:N/C:P/I:P/A:N)

Description
================
Like Button Rating ♥ LikeBtn allows anybody to set any option

Vulnerability
================
In the init action, this plugin checks to see if $_POST['likebtn_import_config'] is empty. If it's not empty, it base64-decodes the string, parses it as JSON, and starts changing options.

Proof of concept
================
The below form will set the “Site Title” option to “Temmie”:

This works whether you're logged in or not. The base64-encoded JSON above is this:

{ "likebtn_settings_options": { "blogname": "Temmie" } }

Mitigations
================
Upgrade to version 2.5.4 or later.

Disclosure policy
================
dxw believes in responsible disclosure. Your attention is drawn to our disclosure policy: https://ift.tt/1B6NWzd Please contact us on security@dxw.com to acknowledge this report if you received it via a third party (for example, plugins@wordpress.org) as they generally cannot communicate with us on your behalf. This vulnerability will be published if we do not receive a response to this report within 14 days.

Timeline
================
2017-10-27: Discovered
2017-11-02: Reported to vendor via email
2017-11-02: Vendor reported fixed

Discovered by dxw:
================
Tom Adams

Please visit security.dxw.com for more information.
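As a sanity check, the payload described above can be reconstructed in a couple of lines of Python (the option name and JSON structure are taken directly from the advisory):

```python
import base64
import json

# The plugin base64-decodes likebtn_import_config, parses it as JSON,
# and applies every option in it, so this payload overwrites the
# "blogname" (Site Title) option.
payload = {"likebtn_settings_options": {"blogname": "Temmie"}}
encoded = base64.b64encode(json.dumps(payload).encode()).decode()
print(encoded)   # the value POSTed as likebtn_import_config
```

Decoding the result gives back exactly the JSON shown in the proof of concept.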

Source: Gmail -> IFTTT-> Blogger

[FD] SQLi in Relevanssi might allow an admin to read contents of database (WordPress plugin)

Details
================
Software: Relevanssi
Version: 3.5.12, 3.6.0
Homepage: https://ift.tt/1amOKG7
Advisory report: https://ift.tt/2HnE8tB
CVE: Awaiting assignment
CVSS: 8.5 (High; AV:N/AC:L/Au:S/C:C/I:C/A:N)

Description
================
SQLi in Relevanssi might allow an admin to read contents of database

Vulnerability
================
If logged in as an administrator on any site, you can go to Settings > Relevanssi Premium and potentially extract all values in the database, including password hashes and user activation tokens. This is achieved by a SQL injection based on the fact that some of the configuration options are appended into a SQL query in an unsafe way. The configuration page contains the following code, which allows the user to set the option relevanssi_post_type_weights to any value:

function update_relevanssi_options() {
    ....
    foreach ($_REQUEST as $key => $value) {
        if (substr($key, 0, strlen('relevanssi_weight_')) == 'relevanssi_weight_') {
            $type = substr($key, strlen('relevanssi_weight_'));
            $post_type_weights[$type] = $value;
        }
        ....
    }
    if (count($post_type_weights) > 0) {
        update_option('relevanssi_post_type_weights', $post_type_weights);
    }
    ....
}

Now when a search is made, the function relevanssi_search is called; this appends the user-controlled value into the SQL query:

$post_type_weights = get_option('relevanssi_post_type_weights');
...
!empty($post_type_weights['post_tag']) ? $tag = $post_type_weights['post_tag'] : $tag = $relevanssi_variables['post_type_weight_defaults']['post_tag'];
!empty($post_type_weights['category']) ? $cat = $post_type_weights['category'] : $cat = $relevanssi_variables['post_type_weight_defaults']['category'];
$query = "SELECT relevanssi.*, relevanssi.title * $title_boost + relevanssi.content + relevanssi.comment * $comment_boost + relevanssi.tag * $tag + relevanssi.link * $link_boost + relevanssi.author + relevanssi.category * $cat + relevanssi.excerpt + relevanssi.taxonomy + relevanssi.customfield + relevanssi.mysqlcolumn AS tf FROM $relevanssi_table AS relevanssi $query_join WHERE $term_cond $query_restrictions";

Proof of concept
================
  1. Visit /wp-admin/options-general.php?page=relevanssi%2Frelevanssi.php
  2. Assign 0.75' (the default value plus a single quote at the end) to “Tag weight”
  3. Press “Save the options”
  4. Visit /?s=test
  5. The logs should contain the following error because of the syntax error we introduced: “WordPress database error You have an error in your SQL syntax”

Note: while it's possible to inject syntax errors, it's currently unknown whether this bug can be used to inject anything that would be useful to an attacker.

Mitigations
================
Upgrade to version 3.6.1 or later.

Disclosure policy
================
dxw believes in responsible disclosure. Your attention is drawn to our disclosure policy: https://ift.tt/1B6NWzd Please contact us on security@dxw.com to acknowledge this report if you received it via a third party (for example, plugins@wordpress.org) as they generally cannot communicate with us on your behalf. This vulnerability will be published if we do not receive a response to this report within 14 days.

Timeline
================
2017-10-02: Discovered
2017-10-02: Reported to vendor via email
2017-10-03: First response from vendor
2017-10-03: Version 3.6.1 released which contains a fix for this bug

Discovered by dxw:
================
Glyn Wintle

Please visit security.dxw.com for more information.
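The shape of the bug is easy to reproduce outside WordPress. This sketch uses Python's sqlite3 (not MySQL, and not the plugin's code) to show the same injected quote producing the same kind of syntax error, alongside the parameterized alternative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE relevanssi (title REAL, tag REAL)")
conn.execute("INSERT INTO relevanssi VALUES (1.0, 1.0)")

tag_weight = "0.75'"   # the PoC value: the default weight plus a stray quote

# Unsafe string interpolation, as in the plugin: the stored option is
# pasted straight into the SQL text.
query = f"SELECT title * 1.0 + tag * {tag_weight} AS tf FROM relevanssi"
try:
    conn.execute(query)
except sqlite3.OperationalError as e:
    print("syntax error, as in the advisory:", e)

# The fix: treat the weight as data, not SQL.
safe = conn.execute(
    "SELECT title * 1.0 + tag * ? AS tf FROM relevanssi", (0.75,)
).fetchone()
```

The interpolated query fails with a syntax error exactly as the PoC predicts, while the parameterized version returns the weighted score.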

Source: Gmail -> IFTTT-> Blogger

8th St.'s surf is at least 5.8ft high

Maryland-Delaware, April 16, 2018 at 04:00AM

8th St. Summary
At 4:00 AM, surf min of 5.8ft. At 10:00 AM, surf min of 4.86ft. At 4:00 PM, surf min of 3.95ft. At 10:00 PM, surf min of 3.08ft.

Surf maximum: 6.51ft (1.98m)
Surf minimum: 5.8ft (1.77m)
Tide height: 0.4ft (0.12m)
Wind direction: S
Wind speed: 19.48 KTS


from Surfline https://ift.tt/1kVmigH
via IFTTT

How to Find Out Everything Facebook Knows About You

Facebook CEO Mark Zuckerberg will testify before Congress this week to answer questions from lawmakers in two separate congressional committees, to explain how his company collects and handles users' personal information. The past few weeks have been difficult for Facebook over concerns that the data of millions of users has been breached. Facebook stores details of almost every action you


from The Hacker News https://ift.tt/2JwxQsB
via IFTTT

Monday, April 9, 2018

Critical Code Execution Flaw Found in CyberArk Enterprise Password Vault

A critical remote code execution vulnerability has been discovered in the CyberArk Enterprise Password Vault application that could allow an attacker to gain unauthorized access to the system with the privileges of the web application. Enterprise password vault (EPV) solutions help organizations securely manage their sensitive passwords, controlling privileged account passwords across a wide


from The Hacker News https://ift.tt/2GKDdCv
via IFTTT

How to (quickly) build a deep learning image dataset

An example of a Pokedex (thank you to Game Trader USA for the Pokedex template!)

When I was a kid, I was a huge Pokemon nerd. I collected the trading cards, played the Game Boy games, and watched the TV show. If it involved Pokemon, I was probably interested in it.

Pokemon made a lasting impression on me — and looking back, Pokemon may have even inspired me to study computer vision.

You see, in the very first episode of the show (and in the first few minutes of the game), the protagonist, Ash Ketchum, was given a special electronic device called a Pokedex.

A Pokedex is used to catalogue and provide information regarding species of Pokemon that Ash encounters along his travels. You can think of the Pokedex as a “Pokemon Encyclopedia” of sorts.

When stumbling upon a new species of Pokemon he had not seen before, Ash would hold the Pokedex up to the Pokemon, and the Pokedex would automatically identify it for him, presumably via some sort of camera sensor (similar to the image at the top of this post).

In essence, the Pokedex was acting like a smartphone app that utilized computer vision!

We can imagine a similar app on our iPhone or Android today, where:

  1. We open the “Pokedex” app on our phone
  2. The app accesses our camera
  3. We snap a photo of the Pokemon
  4. And then the app automatically identifies the Pokemon

As a kid, I always thought the Pokedex was so cool…

…and now I’m going to build one.

In this three-part blog post series we’re going to build our very own Pokedex:

  1. We’ll start today by using the Bing Image Search API to (easily) build our image dataset of Pokemon.
  2. Next week, I’ll demonstrate how to implement and train a CNN using Keras to recognize each Pokemon.
  3. And finally, we’ll use our trained Keras model and deploy it to an iPhone app (or at the very least a Raspberry Pi — I’m still working out the kinks in the iPhone deployment).

By the end of the series we’ll have a fully functioning Pokedex!

To get started using the Bing Image Search API to build an image dataset for deep learning, just keep reading.

Looking for the source code to this post?
Jump right to the downloads section.

How to (quickly) build a deep learning image dataset

In order to build our deep learning image dataset, we are going to utilize Microsoft’s Bing Image Search API, which is part of Microsoft’s Cognitive Services, used to bring AI-powered vision, speech, text capabilities, and more to apps and software.

In a previous blog post, you’ll remember that I demonstrated how you can scrape Google Images to build your own dataset — the problem here is that it’s a tedious, manual process.

Instead, I was looking for a solution that would enable me to programmatically download images via a query.

I did not want to have to open my browser or utilize browser extensions to download the image files from my search.

Many years ago Google deprecated its own image search API (which is the reason we need to scrape Google Images in the first place).

A few months ago I decided to give Microsoft’s Bing Image Search API a try. I was incredibly pleased.

The results were relevant and the API was easy to consume.

It also includes a free 30-day trial, after which the API seems reasonably priced (I haven’t converted to a paying customer yet, but I probably will given the pleasant experience).

In the remainder of today’s blog post, I’ll be demonstrating how we can leverage the Bing Image Search API to quickly build an image dataset suitable for deep learning.

Creating your Cognitive Services account

In this section, I’ll provide a short walkthrough of how to get your (free) Bing Image Search API account.

The actual registration process is straightforward; however, finding the actual page that kicks off the registration process is a bit confusing — it’s my primary critique of the service.

To get started, head to the Bing Image Search API page:

Figure 1: We can use the Microsoft Bing Search API to download images for a deep learning dataset.

As we can see from the screenshot, the trial includes all of Bing’s search APIs with a total of 3,000 transactions per month — this will be more than sufficient to play around and build our first image-based deep learning dataset.

To register for the Bing Image Search API, click the “Get API Key” button.

From there you’ll be able to register by logging in with your Microsoft, Facebook, LinkedIn, or GitHub account (I went with GitHub for simplicity).

After you finish the registration process you’ll end up on the Your APIs page which should look similar to my browser below:

Figure 2: The Microsoft Bing API endpoints along with my API keys which I need in order to use the API.

Here you can see my list of Bing search endpoints, including my two API keys (blurred out for obvious reasons).

Make note of your API key as you’ll need it in the next section.

Building a deep learning dataset with Python

Now that we have registered for the Bing Image Search API, we are ready to build our deep learning dataset.

Read the docs

Before continuing, I would recommend opening up the following two Bing Image Search API documentation pages in your browser:

You should reference these two pages if you have any questions on either (1) how the API works or (2) how we are consuming the API after making a search request.

Install the requests package

If you do not already have requests installed on your system, you can install it via:
$ pip install requests

The requests package makes it super easy for us to make HTTP requests without getting bogged down fighting with Python to handle requests gracefully.

Additionally, if you are using Python virtual environments, make sure you use the workon command to access the environment before installing requests:
$ workon your_env_name
$ pip install requests

Create your Python script to download images

Let’s go ahead and get started coding.

Open up a new file, name it search_bing_api.py, and insert the following code:
# import the necessary packages
from requests import exceptions
import argparse
import requests
import cv2
import os

# construct the argument parser and parse the arguments
ap = argparse.ArgumentParser()
ap.add_argument("-q", "--query", required=True,
        help="search query to search Bing Image API for")
ap.add_argument("-o", "--output", required=True,
        help="path to output directory of images")
args = vars(ap.parse_args())

Lines 2-6 handle importing the packages necessary for this script. You’ll need OpenCV and requests installed in your virtual environment. To set up OpenCV, just follow the relevant installation guide for your system here.

Next, we parse two command line arguments:

  • --query: The image search query you’re using, which could be anything such as “pikachu”, “santa” or “jurassic park”.
  • --output: The output directory for your images. My personal preference (for the sake of organization and sanity) is to separate your images into class-specific subdirectories, so be sure to specify the correct folder that you’d like your images to go into (shown below in the “Downloading images for training a deep neural network” section).

You do not need to modify the command line arguments section of this script (Lines 9-14). These are inputs you give the script at runtime. To learn how to properly use command line arguments, see my recent blog post.

From there, let’s configure some global variables:

# set your Microsoft Cognitive Services API key along with (1) the
# maximum number of results for a given search and (2) the group size
# for results (maximum of 50 per request)
API_KEY = "YOUR_API_KEY_GOES_HERE"
MAX_RESULTS = 250
GROUP_SIZE = 50

# set the endpoint API URL
URL = "https://api.cognitive.microsoft.com/bing/v7.0/images/search"

The one part of this script that you must modify is API_KEY. You can grab an API key by logging into Microsoft Cognitive Services and selecting the service you’d like to use (as shown above where you need to click the “Get API Key” button). From there, simply paste the API key within the quotes for this variable.

You can also modify MAX_RESULTS and GROUP_SIZE for your search. Here, I’m limiting my results to the first 250 images and returning the maximum number of images per request allowed by the Bing API (50 total images).

You can think of the GROUP_SIZE parameter as the number of search results to return “per page”. Therefore, if we would like a total of 250 images, we would need to go through 5 “pages” with 50 images “per page”.

When training a Convolutional Neural Network, I would ideally like to have ~1,000 images per class, but this is just an example. Feel free to download as many images as you would like; just be mindful:

  1. That all images you download should still be relevant to the query.
  2. That you don’t bump up against the limits of Bing’s free API tier (otherwise you’ll need to start paying for the service).

From there, let’s make sure that we are prepared to handle all (edit: most) of the possible exceptions that can arise when trying to fetch an image by first making a list of the exceptions we may encounter:

# when attempting to download images from the web both the Python
# programming language and the requests library have a number of
# exceptions that can be thrown so let's build a list of them now
# so we can filter on them
EXCEPTIONS = set([IOError, FileNotFoundError,
        exceptions.RequestException, exceptions.HTTPError,
        exceptions.ConnectionError, exceptions.Timeout])

When working with network requests there are a number of exceptions that can be thrown, so we list them on Lines 30-32. We’ll try to catch them and handle them gracefully later.
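One subtlety worth knowing: the membership check used later in the script (type(e) in EXCEPTIONS) matches exact exception types only, so subclasses of the listed exceptions slip past it. A minimal illustration (assuming the requests package is installed):

```python
from requests import exceptions

# the same exception list the script builds
EXCEPTIONS = set([IOError, FileNotFoundError,
    exceptions.RequestException, exceptions.HTTPError,
    exceptions.ConnectionError, exceptions.Timeout])

# ConnectTimeout subclasses requests' ConnectionError and Timeout,
# but its exact type is not an element of the set
e = exceptions.ConnectTimeout()
print(type(e) in EXCEPTIONS)             # False: exact-type check misses it
print(isinstance(e, tuple(EXCEPTIONS)))  # True: isinstance honors subclasses
```

In practice this is rarely a problem here, since the generic except clause still catches everything; it only affects which images get the "skipping" message.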

From there, let’s initialize our search parameters and make the search:

# store the search term in a convenience variable then set the
# headers and search parameters
term = args["query"]
headers = {"Ocp-Apim-Subscription-Key" : API_KEY}
params = {"q": term, "offset": 0, "count": GROUP_SIZE}

# make the search
print("[INFO] searching Bing API for '{}'".format(term))
search = requests.get(URL, headers=headers, params=params)
search.raise_for_status()

# grab the results from the search, including the total number of
# estimated results returned by the Bing API
results = search.json()
estNumResults = min(results["totalEstimatedMatches"], MAX_RESULTS)
print("[INFO] {} total results for '{}'".format(estNumResults,
        term))

# initialize the total number of images downloaded thus far
total = 0

On Lines 36-38, we initialize the search parameters. Be sure to review the API documentation as needed.

From there, we perform the search (Lines 42-43) and grab the results in JSON format (Line 47).

We calculate and print the estimated number of results to the terminal next (Lines 48-50).

We’ll be keeping a counter of the images downloaded as we go, so I initialize total on Line 53.

Now it’s time to loop over the results in GROUP_SIZE chunks:
# loop over the estimated number of results in `GROUP_SIZE` groups
for offset in range(0, estNumResults, GROUP_SIZE):
        # update the search parameters using the current offset, then
        # make the request to fetch the results
        print("[INFO] making request for group {}-{} of {}...".format(
                offset, offset + GROUP_SIZE, estNumResults))
        params["offset"] = offset
        search = requests.get(URL, headers=headers, params=params)
        search.raise_for_status()
        results = search.json()
        print("[INFO] saving images for group {}-{} of {}...".format(
                offset, offset + GROUP_SIZE, estNumResults))

Here we are looping over the estimated number of results in GROUP_SIZE batches, as that is what the API allows (Line 56).

The current offset is passed as a parameter when we call requests.get to grab the JSON blob (Line 62).

From there, let’s try to save the images in the current batch:

        # loop over the results
        for v in results["value"]:
                # try to download the image
                try:
                        # make a request to download the image
                        print("[INFO] fetching: {}".format(v["contentUrl"]))
                        r = requests.get(v["contentUrl"], timeout=30)

                        # build the path to the output image
                        ext = v["contentUrl"][v["contentUrl"].rfind("."):]
                        p = os.path.sep.join([args["output"], "{}{}".format(
                                str(total).zfill(8), ext)])

                        # write the image to disk
                        f = open(p, "wb")
                        f.write(r.content)
                        f.close()

                # catch any errors that would prevent us from downloading
                # the image
                except Exception as e:
                        # check to see if our exception is in our list of
                        # exceptions to check for
                        if type(e) in EXCEPTIONS:
                                print("[INFO] skipping: {}".format(v["contentUrl"]))
                                continue

Here we’re going to loop over the current batch of images and attempt to download each individual image to our output folder.

We establish a try/except block so that we can catch the possible EXCEPTIONS which we defined earlier in the script. If we encounter an exception we’ll skip that particular image and move forward (Line 71 and Lines 88-93).

Inside of the try block, we attempt to fetch the image by URL (Line 74), and build a path + filename for it (Lines 77-79).

We then try to open and write the file to disk (Lines 82-84). It’s worth noting here that we’re creating a binary file object, denoted by the b in "wb". We access the binary data via r.content.
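As an aside, the same write can be expressed with a context manager, which guarantees the file is closed even if an exception is raised mid-write (a small sketch; the data bytes here merely stand in for r.content):

```python
import os
import tempfile

# `data` stands in for r.content from the script
data = b"\x89PNG\r\n\x1a\n"
path = os.path.join(tempfile.gettempdir(), "00000000.png")

# the with-statement closes the file automatically, even on error
with open(path, "wb") as f:
    f.write(data)

# reading it back confirms the bytes landed on disk unchanged
with open(path, "rb") as f:
    assert f.read() == data
```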

Next, let’s see if the image can actually be loaded by OpenCV which would imply (1) that the image file was downloaded successfully and (2) the image path is valid:

                # try to load the image from disk
                image = cv2.imread(p)

                # if the image is `None` then we could not properly load the
                # image from disk (so it should be ignored)
                if image is None:
                        print("[INFO] deleting: {}".format(p))
                        os.remove(p)
                        continue

                # update the counter
                total += 1

In this block, we load the image file on Line 96.

As long as the image data is not None, we update our total counter and loop back to the top.

Otherwise, we call os.remove to delete the invalid image and we continue back to the top of the loop without updating our counter. The if-statement on Line 100 could trigger due to network errors when downloading the file, not having the proper image I/O libraries installed, etc. If you’re interested in learning more about NoneType errors in OpenCV and Python, refer to this blog post.

Downloading images for training a deep neural network

Figure 3: The Bing Image Search API is so easy to use that I love it as much as I love Pikachu!

Now that we have our script coded up, let’s download images for our deep learning dataset using Bing’s Image Search API.

Make sure you use the “Downloads” section of this guide to download the code and example directory structure.

In my case, I am creating a dataset directory:
$ mkdir dataset

All images downloaded will be stored in dataset. From there, execute the following commands to make a subdirectory and run the search for “charmander”:
$ mkdir dataset/charmander
$ python search_bing_api.py --query "charmander" --output dataset/charmander
[INFO] searching Bing API for 'charmander'
[INFO] 250 total results for 'charmander'
[INFO] making request for group 0-50 of 250...
[INFO] saving images for group 0-50 of 250...
[INFO] fetching: https://fc06.deviantart.net/fs70/i/2012/355/8/2/0004_c___charmander_by_gaghiel1987-d5oqbts.png
[INFO] fetching: https://th03.deviantart.net/fs71/PRE/f/2010/067/5/d/Charmander_by_Woodsman819.jpg
[INFO] fetching: https://fc05.deviantart.net/fs70/f/2011/120/8/6/pokemon___charmander_by_lilnutta10-d2vr4ov.jpg
...
[INFO] making request for group 50-100 of 250...
[INFO] saving images for group 50-100 of 250...
...
[INFO] fetching: https://38.media.tumblr.com/f0fdd67a86bc3eee31a5fd16a44c07af/tumblr_nbhf2vTtSH1qc9mvbo1_500.gif
[INFO] deleting: dataset/charmander/00000174.gif
...

As I mentioned in the introduction of this post, we are downloading images of Pokemon to be used when building a Pokedex (a special device to recognize Pokemon in real-time).

In the above command I am downloading images of Charmander, a popular Pokemon. Most of the 250 images will successfully download, but as shown in the output above, there will be a few that aren’t able to be opened by OpenCV and will be deleted.

I do the same for Pikachu:

$ mkdir dataset/pikachu
$ python search_bing_api.py --query "pikachu" --output dataset/pikachu
[INFO] searching Bing API for 'pikachu'
[INFO] 250 total results for 'pikachu'
[INFO] making request for group 0-50 of 250...
[INFO] saving images for group 0-50 of 250...
[INFO] fetching: http://www.mcmbuzz.com/wp-content/uploads/2014/07/025Pikachu_OS_anime_4.png
[INFO] fetching: http://images4.fanpop.com/image/photos/23300000/Pikachu-pikachu-23385603-814-982.jpg
[INFO] fetching: http://images6.fanpop.com/image/photos/33000000/pikachu-pikachu-33005706-895-1000.png
...

Along with Squirtle:

$ mkdir dataset/squirtle
$ python search_bing_api.py --query "squirtle" --output dataset/squirtle
[INFO] searching Bing API for 'squirtle'
[INFO] 250 total results for 'squirtle'
[INFO] making request for group 0-50 of 250...
[INFO] saving images for group 0-50 of 250...
[INFO] fetching: https://fc03.deviantart.net/fs71/i/2013/082/1/3/007_squirtle_by_pklucario-d5z1gj5.png
[INFO] fetching: https://fc03.deviantart.net/fs70/i/2012/035/b/2/squirtle_by_maii1234-d4oo1aq.jpg
[INFO] fetching: https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhfFYDbctaPLppgG-XHpJMV7dGUeOZhfiT52unaSaY9Y-s_gm82g_2S9dKxp1bCQYfel2qGnfi0dIDMl0rDKADd-ky5daBFtdUHRQyeuynzEAuOXIBAtFgOcG5DFpiMSqMtl8eBCLbkJWk0/s1600/Leo%2527s+Squirtle.jpg
...

Then Bulbasaur:

$ mkdir dataset/bulbasaur
$ python search_bing_api.py --query "bulbasaur" --output dataset/bulbasaur
[INFO] searching Bing API for 'bulbasaur'
[INFO] 250 total results for 'bulbasaur'
[INFO] making request for group 0-50 of 250...
[INFO] saving images for group 0-50 of 250...
[INFO] fetching: https://fc06.deviantart.net/fs51/f/2009/261/3/e/Bulbasaur_by_elfaceitoso.png
[INFO] skipping: https://fc06.deviantart.net/fs51/f/2009/261/3/e/Bulbasaur_by_elfaceitoso.png
[INFO] fetching: https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgQFI1nFaMsrkXu2iVO9OAFRm7yBw8ca7AsBK-rIbSXTArh7xtdp1UFuOGMRGpUwGHcwlohBNbZu1r7ONCxoaGvilv9lvVBh1J5MkLtDNu94HE4V7Jobwp0BKIv0vnzslbMVsgiUzA7RYvj/s1600/001Bulbasaur+pokemon+firered+leafgreen.png
[INFO] skipping: https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgQFI1nFaMsrkXu2iVO9OAFRm7yBw8ca7AsBK-rIbSXTArh7xtdp1UFuOGMRGpUwGHcwlohBNbZu1r7ONCxoaGvilv9lvVBh1J5MkLtDNu94HE4V7Jobwp0BKIv0vnzslbMVsgiUzA7RYvj/s1600/001Bulbasaur+pokemon+firered+leafgreen.png
[INFO] fetching: https://fc09.deviantart.net/fs71/i/2012/088/9/6/bulbasaur_by_songokukai-d4gecpp.png
...

And finally Mewtwo:

$ mkdir dataset/mewtwo
$ python search_bing_api.py --query "mewtwo" --output dataset/mewtwo
[INFO] searching Bing API for 'mewtwo'
[INFO] 250 total results for 'mewtwo'
[INFO] making request for group 0-50 of 250...
[INFO] saving images for group 0-50 of 250...
[INFO] fetching: https://sickr.files.wordpress.com/2011/09/mewtwo.jpg
[INFO] fetching: https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjHL_NRziG8t4f-jhA1f8tkSmnELreH2TS4ooUxx9L0-JNzs2oqPhpbkyb5p4Zyunh2naFTd3xuAz0ngSnw20lDLSMJL1ZTvcbxzl-c9hNA3wVZw05C6WaswCplrfof_Zb579RmExusk1I/s1600/Mewtwo+Pokemon+Wallpapers+3.jpg
[INFO] fetching: https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh3hM9ruOXIqHAYwYaGK6MrQAquQ_Aa6_h7mk-x-iW3HgJQqUlqicVmau5CjoAicc39u4evMDN2GrbKnxVmsznr_L3Uql_yKarJHis2q0EtPAFmdxQQsd3Qu73vaY9Wt1hiiDyK2z1xix4/s1600/Mewtwo+Pokemon+Wallpapers.jpg
...

We can count the total number of images downloaded per query by using a bit of find magic (thank you to Glenn Jackman on StackOverflow for this great command hack):
$ find . -type d -print0 | while read -d '' -r dir; do
> files=("$dir"/*)
> printf "%5d files in directory %s\n" "${#files[@]}" "$dir"
> done
    2 files in directory .
    5 files in directory ./dataset
  235 files in directory ./dataset/bulbasaur
  245 files in directory ./dataset/charmander
  245 files in directory ./dataset/mewtwo
  238 files in directory ./dataset/pikachu
  230 files in directory ./dataset/squirtle

Here we can see we have approximately 230-245 images per class. Ideally, I would like to have ~1,000 images per class, but for the sake of simplicity in this example and network overhead (for users without a fast/stable internet connection), I only downloaded 250.

Note: If you use that ugly find command often, it would be worth making an alias in your ~/.bashrc!

Pruning our deep learning image dataset

However, not every single image we downloaded will be relevant to the query — most will be, but not all of them.

Unfortunately this is the manual intervention step where you need to go through your directories and prune irrelevant images.

On macOS this is actually a pretty quick process.

My workflow involves opening up Finder and then browsing all images in the “Cover Flow” view:

Figure 4: I’m using the macOS “Cover Flow” view in order to quickly flip through images and filter out those that I don’t want in my deep learning dataset.

If an image is not relevant I can move it to the Trash via cmd + delete on my keyboard. Similar shortcuts and tools exist on other operating systems as well.

After pruning the irrelevant images, let’s do another image count:

$ find . -type d -print0 | while read -d '' -r dir; do
> files=("$dir"/*);
> printf "%5d files in directory %s\n" "${#files[@]}" "$dir";
> done
     3 files in directory .
     5 files in directory ./dataset
   234 files in directory ./dataset/bulbasaur
   238 files in directory ./dataset/charmander
   239 files in directory ./dataset/mewtwo
   234 files in directory ./dataset/pikachu
   223 files in directory ./dataset/squirtle

As you can see, I only had to delete a handful of images per class — the Bing Image Search API worked quite well!

Note: You should also consider removing duplicate images. I didn’t take this step as there weren’t too many duplicates (except for the “squirtle” class; I have no idea why there were so many duplicates there), but if you’re interested in learning more about how to find duplicates, see this blog post on image hashing.
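For exact byte-for-byte duplicates, a plain file hash is enough (near-duplicates such as resized or re-encoded copies need the perceptual image hashing approach from the linked post). A minimal sketch, with find_exact_duplicates as a hypothetical helper name:

```python
import hashlib
import os

def find_exact_duplicates(directory):
    # group files by MD5 digest; any group with more than one path
    # holds byte-identical copies
    groups = {}
    for name in sorted(os.listdir(directory)):
        path = os.path.join(directory, name)
        if not os.path.isfile(path):
            continue
        with open(path, "rb") as f:
            digest = hashlib.md5(f.read()).hexdigest()
        groups.setdefault(digest, []).append(path)
    return [paths for paths in groups.values() if len(paths) > 1]
```

From each returned group you would keep one file and delete the rest.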

Summary

In today’s blog post you learned how to quickly build a deep learning image dataset using Microsoft’s Bing Image Search API.

Using the API we were able to programmatically download images for training a deep neural network, a huge step up from having to manually scrape images using Google Images.

The Bing Image Search API is free to use for 30 days which is perfect if you want to follow along with this series of posts.

I’m still in my trial period, but given the positive experience thus far I would likely pay for the API in the future (especially since it will help me quickly create datasets for fun, hands-on deep learning PyImageSearch tutorials).

In next week’s blog post I’ll be demonstrating how to train a Convolutional Neural Network with Keras on top of the deep learning images we downloaded today. And in the final post in the series (coming in two weeks), I’ll show you how to deploy your Keras model to your smartphone (if possible — I’m still working out the kinks in the Keras + iOS integration).

This is a can’t miss series of posts, so don’t miss out! To be notified when the next post in the series goes live, just enter your email address in the form below.

Downloads:

If you would like to download the code and images used in this post, please enter your email address in the form below. Not only will you get a .zip of the code, I’ll also send you a FREE 11-page Resource Guide on Computer Vision and Image Search Engines, including exclusive techniques that I don’t post on this blog! Sound good? If so, enter your email address and I’ll send you the code immediately!

The post How to (quickly) build a deep learning image dataset appeared first on PyImageSearch.



from PyImageSearch https://ift.tt/2GKjYgB
via IFTTT

ISS Daily Summary Report – 4/06/2018

Mobile Servicing System (MSS) SpX-14 Operations: Overnight, the Robotics ground controllers released the Space Station Remote Manipulator System (SSRMS) from the SpaceX-14 Dragon Flight Releasable Grapple Fixture (FRGF).  They performed a survey of the Dragon trunk and then unstowed the Special Purpose Dexterous Manipulator (SPDM) from Mobile Base System (MBS) Power Data Grapple Fixture 2 … Continue reading "ISS Daily Summary Report – 4/06/2018"

from ISS On-Orbit Status Report https://ift.tt/2qjLpCC
via IFTTT

Here's how hackers are targeting Cisco Network Switches in Russia and Iran

Since last week, a new hacking group, calling itself 'JHT,' hijacked a significant number of Cisco devices belonging to organizations in Russia and Iran, and left a message that reads—"Do not mess with our elections" with an American flag (in ASCII art). MJ Azari Jahromi, Iranian Communication and Information Technology Minister, said the campaign impacted approximately 3,500 network switches


from The Hacker News https://ift.tt/2GMAMiT
via IFTTT

[FD] [RT-SA-2017-015] CyberArk Password Vault Memory Disclosure

Advisory: CyberArk Password Vault Memory Disclosure Data in the CyberArk Password Vault may be accessed through a proprietary network protocol. While answering to a client's logon request, the vault discloses around 50 bytes of its memory to the client. Details ======= Product: CyberArk Password Vault Affected Versions: < 9.7, < 10 Fixed Versions: 9.7, 10 Vulnerability Type: Information Disclosure Security Risk: high Vendor URL: https://www.cyberark.com/ Vendor Status: fixed version released Advisory URL: https://www.redteam-pentesting.de/advisories/rt-sa-2017-015 Advisory Status: published CVE: CVE-2018-9842 CVE URL: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-9842 Introduction ============ "CyberArk Enterprise Password Vault is designed to secure, rotate and control access to privileged account credentials based on organizational policies. A flexible architecture allows organizations to start small and scale to the largest, most complex IT environments. The solution protects privileged account credentials used to access the vast majority of systems." (from the Enterprise Password Vault Data Sheet [1]) More Details ============ The CyberArk Password Vault serves as a database to securely store credentials. Furthermore, the vault enforces access controls and logs access to its records. Data stored in the vault may be accessed through a proprietary network protocol which is usually transmitted over TCP port 1858. Various clients, such as web applications or command line tools, are provided by CyberArk to interface with a vault. The first message a client sends to the vault is a "Logon" command. Using a network sniffer, such a message was captured: $ xxd logon.bin 00000000: ffff ffff f700 0000 ffff ffff 3d01 0000 ............=... 00000010: 5061 636c 6953 6372 6970 7455 7365 7200 PacliScriptUser. 00000020: 0000 0000 0000 0000 0000 0000 0000 0000 ................ 00000030: 0000 0000 0000 0000 0000 0000 0000 0000 ................ 
00000040: 0000 0000 0000 0000 0000 0000 0000 0000 ................ 00000050: 0000 0000 0000 0000 0000 0000 0000 0000 ................ 00000060: 0000 0000 0000 0000 0000 0000 0020 2020 ............. 00000070: 20ff ffff ff00 0000 0000 0000 0000 0073 ..............s 00000080: 0000 00ce cece ce00 0000 0000 0000 0000 ................ 00000090: 0000 0000 0000 0030 3d4c 6f67 6f6e fd31 .......0=Logon.1 000000a0: 3135 3d37 2e32 302e 3930 2e32 38fd 3639 15=7.20.90.28.69 000000b0: 3d50 fd31 3136 3d30 fd31 3030 3dfd 3231 =P.116=0.100=.21 000000c0: 373d 59fd 3231 383d 5041 434c 49fd 3231 7=Y.218=PACLI.21 000000d0: 393d fd33 3137 3d30 fd33 3537 3d30 fd32 9=.317=0.357=0.2 000000e0: 323d 5061 636c 6953 6372 6970 7455 7365 2=PacliScriptUse 000000f0: 72fd 3336 373d 3330 fd00 00 r.367=30... Starting at offset 0x97, a type of remote procedure call can be identified. In this case, "Logon" is invoked for the user "PacliScriptUser". This message does not contain any random, unpredictable data. Therefore, it may be replayed at will once captured. This can be accomplished using netcat:

Source: Gmail -> IFTTT-> Blogger

[FD] [RT-SA-2017-014] CyberArk Password Vault Web Access Remote Code Execution

Advisory: CyberArk Password Vault Web Access Remote Code Execution The CyberArk Password Vault Web Access application uses authentication tokens which consist of serialized .NET objects. By crafting manipulated tokens, attackers are able to gain unauthenticated remote code execution on the web server. Details ======= Product: CyberArk Password Vault Web Access Affected Versions: < 9.9.5, < 9.10, 10.1 Fixed Versions: 9.9.5, 9.10, 10.2 Vulnerability Type: Remote Code Execution Security Risk: high Vendor URL: https://www.cyberark.com/ Vendor Status: fixed version released Advisory URL: https://www.redteam-pentesting.de/advisories/rt-sa-2017-014 Advisory Status: published CVE: CVE-2018-9843 CVE URL: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-9843 Introduction ============ "CyberArk Enterprise Password Vault is designed to secure, rotate and control access to privileged account credentials based on organizational policies. A flexible architecture allows organizations to start small and scale to the largest, most complex IT environments. The solution protects privileged account credentials used to access the vast majority of systems." (from the Enterprise Password Vault Data Sheet [1]) More Details ============ The CyberArk Password Vault provides secure storage for credentials. It may be accessed through various clients which are also provided by CyberArk. One such client is the CyberArk Password Vault Web Access, a .NET web application. After logging into the web application with their credentials, users may access credentials kept in the vault. Additionally, CyberArk Password Vault Web Access provides a REST API for programmatic access to the vault. This API is available at an URL similar to the following: https://10.0.0.6/PasswordVault/WebServices/ The API provides multiple endpoints with different methods. Most methods provided by the API require prior authentication. 
Consequently, a user's API call must include an authentication token in an HTTP authorization header. Tokens may be generated by calling a dedicated "Logon" API method. Analysis of this token by RedTeam Pentesting revealed, that it consists of a base64 encoded, serialized .NET object of the type "CyberArk.Services.Web.SessionIdentifiers". This class consists of four string attributes which hold information about a user's session. The integrity of the serialized data is not protected. Therefore, attackers may send arbitrary .NET objects to the API in the authorization header. By leveraging certain gadgets, such as the ones provided by ysoserial.net [2], attackers may execute arbitrary code in the context of the web application. Proof of Concept ================ First, a malicious serialized .NET object is created. Here the "TypeConfuseDelegate" gadget of ysoserial.net is used to execute the "ping" command:

Source: Gmail -> IFTTT-> Blogger

The Sun Unleashed: Monster Filament in Ultraviolet


One of the most spectacular solar sights is an explosive flare. In 2011 June, the Sun unleashed a somewhat impressive, medium-sized solar flare as rotation carried active regions of sunspots toward the solar limb. That flare, though, was followed by an astounding gush of magnetized plasma -- a monster filament seen erupting at the Sun's edge in this extreme ultraviolet image from NASA's Solar Dynamics Observatory. Featured here is a time-lapse video of that hours-long event showing darker, cooler plasma raining down across a broad area of the Sun's surface, arcing along otherwise invisible magnetic field lines. An associated coronal mass ejection, a massive cloud of high energy particles, was blasted in the general direction of the Earth, and made a glancing blow to Earth's magnetosphere. via NASA https://ift.tt/2qiokjX

Sunday, April 8, 2018

NGC 6960: The Witch's Broom Nebula


Ten thousand years ago, before the dawn of recorded human history, a new light would suddenly have appeared in the night sky and faded after a few weeks. Today we know this light was from a supernova, or exploding star, and record the expanding debris cloud as the Veil Nebula, a supernova remnant. This sharp telescopic view is centered on a western segment of the Veil Nebula cataloged as NGC 6960 but less formally known as the Witch's Broom Nebula. Blasted out in the cataclysmic explosion, the interstellar shock wave plows through space sweeping up and exciting interstellar material. Imaged with narrow band filters, the glowing filaments are like long ripples in a sheet seen almost edge on, remarkably well separated into atomic hydrogen (red) and oxygen (blue-green) gas. The complete supernova remnant lies about 1400 light-years away towards the constellation Cygnus. This Witch's Broom actually spans about 35 light-years. The bright star in the frame is 52 Cygni, visible with the unaided eye from a dark location but unrelated to the ancient supernova remnant. via NASA https://ift.tt/2HgwHV1