By Marketwired
February 3, 2014 05:59 PM EST
CALGARY, ALBERTA -- (Marketwired) -- 02/03/14 -- Graphite One Resources Inc. (TSX VENTURE:GPH)(OTCQX:GPHOF) ("Graphite One" or the "Company") is pleased to announce the receipt of a report issued by Fundamental Research Corp. The Fundamental Research Report is dated January 31, 2014 and the full report is available by contacting Fundamental Research as follows:
Sid Rajeev, MBA, CFA
Fundamental Research Corp.
Email: [email protected]
Phone: +1 (604) 682-7065
About Graphite Creek
The Graphite Creek Property comprises 129 claims totaling 6,799 hectares on the Seward Peninsula of Alaska, 65 kilometers (40 miles) north of the deep-sea port at Nome.
Mineralization at the Graphite Creek Property is characterized by coarse crystalline (large-flake) graphite (greater than 80 mesh) hosted in graphite-bearing schists. Please refer to the January 20, 2014 press release, in which Graphite One reported an NI 43-101 inferred resource of 284.71 million tonnes at 4.5% graphite (including 37.68 million tonnes at 9.2% graphite and 8.63 million tonnes at 12.8% graphite).
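As a rough illustration of what the reported figures imply, contained graphite is simply tonnage multiplied by grade. The sketch below is a back-of-the-envelope check only, not part of the NI 43-101 estimate; the function name is ours, and the inputs are the inferred-resource figures quoted above.

```python
def contained_tonnes(resource_mt: float, grade_pct: float) -> float:
    """Resource tonnage (million tonnes) x grade (%) -> contained graphite (Mt)."""
    return resource_mt * grade_pct / 100.0

# Reported inferred resource and the two higher-grade subsets:
total = contained_tonnes(284.71, 4.5)       # ~12.81 Mt contained graphite
subset_92 = contained_tonnes(37.68, 9.2)    # ~3.47 Mt
subset_128 = contained_tonnes(8.63, 12.8)   # ~1.10 Mt
```

Note that inferred resources carry significant uncertainty (see the cautionary language below), so these contained-graphite figures are illustrative arithmetic, not statements of recoverable material.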
About Graphite One Resources Inc.
GRAPHITE ONE RESOURCES INC. (TSX VENTURE:GPH)(OTCQX:GPHOF) is exploring, with the intent to develop, the Graphite Creek Project, the United States' only advanced-stage, large-scale, large-flake graphite deposit.
About Fundamental Research
Since 2003, Fundamental Research Corp. has provided institutional-quality equity research coverage of small- and micro-cap companies through its extensive distribution network. In 2009, it also began issuing reports on exempt-market securities. In 2010, its Top Picks made it the #3-ranked analyst firm, with a return of 20.75%, and #1 in the Basic Materials sector (which includes mining), with a return of 29.07%, on third-party ranking systems that track analysts globally. Its goal from the beginning has been to provide high-quality research to a broad audience while adhering to high ethical standards and a strong foundation of integrity.
ON BEHALF OF THE BOARD OF DIRECTORS
Anthony Huston, CEO, President & Director
Neither the TSX Venture Exchange nor its Regulation Services Provider (as that term is defined in the policies of the TSX Venture Exchange) accepts responsibility for the adequacy or accuracy of this release.
This release includes certain statements that may be deemed to be forward-looking statements. All statements in this release, other than statements of historical fact, that address access to capital, regulatory approvals, exploration drilling, exploitation activities and events or developments that the Company expects are forward-looking statements. Although the Company believes the expectations expressed in such forward-looking statements are based on reasonable assumptions, such statements are not guarantees of future performance, and actual results or developments may differ materially from those in the forward-looking statements. Factors that could cause actual results to differ materially from those in forward-looking statements include market prices; exploitation and exploration successes; continuity of mineralization; uncertainties related to the ability to obtain necessary permits, licenses and title, and delays due to third-party opposition; changes in government policies regarding mining and natural resource exploration and exploitation; continued availability of capital and financing; and general economic, market or business conditions. Readers are cautioned not to place undue reliance on this forward-looking information, which is given as of the date it is expressed in this press release, and the Company undertakes no obligation to update publicly or revise any forward-looking information, except as required by applicable securities laws. For more information on the Company, investors should review the Company's continuous disclosure filings that are available at www.sedar.com.
The mineral resource estimates reported in this press release were prepared in accordance with Canadian National Instrument 43-101 Standards of Disclosure for Mineral Projects ("NI 43-101"), as required by Canadian securities regulatory authorities. For United States reporting purposes, the United States Securities and Exchange Commission ("SEC") applies different standards in the classification of mineralization. In particular, while the terms "measured," "indicated" and "inferred" mineral resources are required pursuant to NI 43-101, the SEC does not recognize such terms. Canadian standards differ significantly from the requirements of the SEC. Investors are cautioned not to assume that any part or all of the mineral deposits in these categories constitute or will ever be converted into reserves. In addition, "inferred" mineral resources have a great amount of uncertainty as to their existence and great uncertainty as to their economic and legal feasibility. It cannot be assumed that all or any part of an inferred mineral resource will ever be upgraded to a higher category. Under Canadian securities laws, issuers must not make any disclosure of results of an economic analysis that includes inferred mineral resources, except in rare cases.